Thursday, December 27, 2012

String.Replace and FIM EmailTemplates

Had this awesome idea the other day and totally got foiled by my FIM box.  Thought I’d be crafty and generate some fancy HTML for an email notification, then I’d use my Action Workflow to store that HTML snippet in the FIM workflow dictionary.  Next I’d consume it in my Notification Activity by simply referring to the workflow dictionary like this:


Would have totally worked too, but FIM pulled a fast one on me, and did some encoding on the data I stuffed into the dictionary, so by the time it was passed to the EmailTemplate it no longer resembled HTML, and of course didn’t render.

EmailTemplates and the WorkflowDictionary: 0

PowerShell WF Activity and Send-MailMessage: 1

Since I already was using a PowerShell WF activity to construct the HTML snippet, instead of passing the HTML snippet to the workflow dictionary, I just sent the message from PowerShell using Send-MailMessage. 
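A minimal sketch of that approach, assuming placeholder addresses and SMTP server (and a hypothetical $displayName variable pulled from the request; none of these values are from the original workflow):

### Build the HTML snippet inside the PowerShell WF activity
$htmlBody = @"
<html><body>
<p>The request for <b>$displayName</b> has completed.</p>
</body></html>
"@

### Send it directly; -BodyAsHtml means the markup never gets encoded along the way
Send-MailMessage -To 'someUser@contoso.com' -From 'fimService@contoso.com' -Subject 'FIM Notification' -Body $htmlBody -BodyAsHtml -SmtpServer 'smtp.contoso.com'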

The best issues to run into are the ones where YOU can craft a solution.  In this case I can’t complain too much because I still arrived at a solution and it was actually easier to decorate with logging and exception handling.

Learning About FIM Authentication Activities

The FIM Service’s policy engine has three types of workflows:

  • Authentication
  • Authorization
  • Action

Most of the work I’ve done with FIM has used Authorization and Action workflows, which are relatively straightforward (especially when you use PowerShell!).  Until recently I’d assumed that AuthN workflows were mostly there to enable FIM’s Self-Service Password Reset functionality, and marveled at Jeremy and Ikrima when they demonstrated custom AuthN WF solutions at TEC (RIP TEC BTW, so sad). 

Recently I’ve been lucky enough to have an opportunity to do some prototyping work for a design I’m working on, and the thing I have to prove is that AuthN workflows in FIM can handle the manner in which I plan to abuse them.  This prototyping has been a ton of fun because AuthN workflows are SO different from the other workflow types but are still hosted and managed by FIM.  The fallback would be to use a custom service with a backing store, which I really prefer to avoid because it introduces more moving parts that then have to be automated, tested and managed.  So the added complexity of AuthN WF can be justified.

Over the coming weeks I expect to post more about this, including the PowerShell scripts I’ve been using to automate the setup and testing of the prototype.  

Thursday, December 20, 2012

Downloading Files from TFS

There are many ways to deploy FIM, but I always try to start deployments from files that are version controlled by TFS.  Once you’ve stored your deployment scripts in TFS, you need to get them onto the FIM computer.  I typically do this by copying files, but have been thinking of trying a different approach – taking the files directly from TFS.

Turns out you don’t need Visual Studio installed to get files from TFS version control, just a single DLL.  With this approach you can have a small script on the server download the file from TFS directly onto the FIM computer.

Here’s the script snippet showing how to do it:


### I got this file from: C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\ReferenceAssemblies\v2.0
### NOTE: Visual Studio does not need to be installed where this runs...
Add-Type -Path C:\TFSTest\Microsoft.TeamFoundation.VersionControl.Client.dll

### Connect to TFS Version Control
$TFS = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer('http://myTfsServer:8080/tfs/FIM')
$VersionControlServer = $TFS.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

### Get an item from Source Control and download it
$TfsItem = $VersionControlServer.GetItem('$/MyFimProject/DeploymentScripts/DeployFimConfig.ps1')
$TfsItem.DownloadFile('C:\TFSTest\DeployFimConfig.ps1')




Wednesday, December 12, 2012

Compliance Doesn’t Have to be Costly or Complex

A few years ago when I joined Edgile I was enticed by a number of positive attributes about the company, but one of them was simply that there were multiple businesses.  FIM deployments represent a healthy business for Edgile (that’s where I play) but another budding gem is the iGRC business.  The folks running that program are heavily experienced in GRC and have produced a great product that is already well received by its customers.  As it turns out the businesses are sometimes complementary, but they do not depend on each other at all.  As an employee it’s a neat opportunity to get to work on a different product, but the demand for FIM isn’t such that I’ll be making a change anytime soon!

Tuesday, December 11, 2012

Dell's acquisitions not yet paying dividends

Still waiting for official news about the TEC conference, but the rumours are not promising.  In the interim it appears Dell's acquisitions are not yet paying dividends which I’m not surprised or upset by.  My secret wish is that Dell continues to pick up Systems Integration firms so the one that I own stock in gets a big bump!  My other secret wish was that Dell would continue the TEC conference…

Wednesday, December 05, 2012

ISmsServiceProvider.SendSms Method

OK, I’m an MSDN troll, I’m always looking in there for new toys.  Found this one a little while ago but haven’t had time to work with it:

ISmsServiceProvider.SendSms Method (Microsoft.IdentityManagement.SmsServiceProvider)

The pages appear generated from the assemblies, so they have no real human explanations of what this class does or how to use it.  I expect we’ll see more detail as the doc team catches up with R2.

It is only worth talking about right now because of the recent Phone Factor acquisition by Microsoft, and the fact that I need to implement some SMS functionality in an upcoming project.  Hopefully by then I’ll have more to share than just an MSDN page for a class in FIM 2010 R2 I haven’t used yet!

Wednesday, November 28, 2012

System Center is About to Get a Whole Lot Cooler

Listening to the PowerScripting Podcast recently (Episode 205 – Jeffrey Snover talks about PowerShell 3.0) I learned that Jeffrey Snover is now also a lead architect for System Center.  My first reaction was disappointment: if that guy’s bandwidth was up for grabs then I wish he’d moseyed on over to the FIM team for a visit.  PowerShell and FIM are both strong integration engines, and it would be really cool to see a mash-up of the two.  Anyhow, I look forward to learning more about System Center now, and the impact he’ll have over there. 

Saturday, November 24, 2012

Selling Test Automation for FIM

For a long time I’ve been convinced that Test Automation (and related automations) is very difficult to sell as scope in a FIM deployment engagement.  It seems like a tax that most people are happy to dodge, when in reality it is a bet that people unknowingly make: I bet that if I skip test automation, my deployment will still be OK.  The reality is that you are either going to spend the time and money proactively doing test automation, or you are going to spend the time reactively scrambling to repro and resolve bugs that your customers find for you.

Anyhow, I was looking at the sponsors of the OCG event in January and stumbled upon an interesting one:

The interesting thing about Software IDM is that they are actually selling test automation for FIM.  That’s notable because some great FIM ISV products have died on the vine (NetPro Mission Control, for example) and this is a product that should be difficult to sell.  Still, I’m really fascinated to see somebody try, and can’t wait to learn more about it.

For the record, I think that EVERY deployment of FIM should have a high percentage of test coverage in the form of test automation.  Test Automation for FIM is NOT easy, but it is the only way to measure quality before pushing changes into production.

Thursday, November 22, 2012

I’m Speaking!

OCG is organizing a nice event called the Redmond Identity, Access and Directory Knowledge Summit 2013.

There is a really good collection of speakers lined up, and luckily it is right in my backyard.  Even luckier, I get to speak at the event: FIM PowerShell Session

My session is the last of the day, which works great for me because if anybody wants to talk more about FIM or PowerShell then I will happily keep talking when plied with pints over the Evening Entertainment!

FIM 2012?

Sometimes I wonder if I spend too much time inside machines, because I looked up today and saw a slide deck indicating we’ll see Forefront Identity Manager 2012, in what has to be calendar year 2012.
Enterprise Software Roadmap for Microsoft Products 2011
Looking again after my second cup of coffee, the deck clearly says ‘2011’ so I think it is safe to say there may have been a release planned in 2012, which is now probably just FIM 2010 SP1.
The interesting thing is that Microsoft seems to have taken a page out of Apple’s consumer-oriented playbook whereby release details are shrouded in secrecy.  For Microsoft consumer products (Surface was a good example) that makes sense but I hope enterprise software products see more community involvement and less secrecy.

UPDATE: a friend pointed out to me that the deck likely came from Directions on Microsoft, and that the roadmaps posted may not map very accurately to the actual plans of the product group.  You can subscribe to Directions on Microsoft (not free) to view the December 2012 issue which has a fresh roadmap for FIM. SPOILER ALERT: it isn't drastically different than the slide deck linked above except that the dates have moved back a bit.

Wednesday, October 31, 2012

TEC Europe Review

Barcelona was a nice break from Seattle, except for the thunderstorms but those just gave me an excuse to camp in my hotel room and tweak my decks and demos.

The week before TEC I pulled the plug on my OData session because ‘experimental’ turned into ‘holy kittens this is way harder than I hoped’.  Gil was kind enough to let me switch this session for a session on ‘FIM Workflow Using PowerShell’.  The session went very well, and I even did a lot of demos which all worked quite well except for…

Beware the Spanish Wifi… the day before my first session I was sitting in the conference hall, connected to the conference wifi on my Win8 laptop.  I’ve been running Win8 for over a month and have been very happy with the stability.  Sitting there taking notes, suddenly I was graced with about a dozen Win8 blue screens.  It turned out to be a problem with the Hyper-V bridge adapter, something specific to that access point.  Disabling the bridge adapter was a good enough workaround.  Luckily the bug didn’t interrupt my sessions or fail any of my demos.  Phew!

My last session was the ‘FIM PowerShell Deep Dive’ and my intent was to re-deliver last year’s deck, then I realized I could do a LOT better.  Last year I pretty much told people about the scripts they could write.  This year I showcased a lot of the commands from the FIM PowerShell Module on CodePlex.  It turned into a huge demo-fest where I showed what the OOB commands do, then how to extend them with the FIM PowerShell Module.  Attendance wasn’t huge, but everybody stayed for the intended hour, and since there was no session scheduled after mine, they stuck around for another hour of demos.  It was actually a lot of fun, and 100% of what I showed in the demos is available on CodePlex.

There was an odd hole where the FIM product group used to be at this conference.  Maybe it was budgets or schedule mayhem, but I can’t believe it was intentional since this is such a great opportunity to rally the FIM field.

Dell held another event right after TEC in the same location, and you could tell it was more sales oriented because the food suddenly improved and there were a LOT more people wearing suits.  Looking forward to another TEC next year, and can’t wait to hear where it’s gonna be!

Wednesday, October 17, 2012

PhoneFactor Joins Microsoft, and FIM?

Yeah BHOLD was an interesting addition to FIM, but PhoneFactor is pretty exciting too!  I’ve watched Jeremy’s sessions at TEC, as well as demos by the guys at ActiveIDM with phone integration, and it is always very cool to see. 

PhoneFactor Joins Microsoft

You can see how this could be added to FIM for AuthN workflows (how cool would it be if this became part of the FIM Request Processor?), but you could also see how cool this would be as part of Win8 and Active Directory.

This would be a big win for IT Pros if FIM adopts PhoneFactor and makes this integration simple.

Thursday, October 11, 2012

Summits, Summits, Everywhere!

Pretty neat.  A new summit focused on Identity and Access has just been announced:

Redmond Identity, Access & Directory Knowledge Summit 2013, USA

The event is organized by Oxford Computer Group and will be in Redmond in January.  Oxford runs a similar event in the UK that has been quite successful, so it will be nice to have one on this side of the pond, and once again right in my backyard!

Endangered IT species No. 8: The Purple-Tufted Programmer (Codus cobolus)

Network World published this interesting slideshow.  Slide 9 introduces Endangered IT species No. 8: The Purple-Tufted Programmer (Codus cobolus).  The interesting part is this tidbit, which applies just as easily to Identity Management junkies:

IT pros who only hack code may quickly wind up on the wrong side of the evolutionary divide.

The advice for avoiding extinction is spot on:

Coders who want to survive need to expand their expertise and align their skills with the needs of the business, says StorageIO Group senior adviser Greg Schulz.

"Coders and script junkies need to also be integrators of business logic, cloud tools, and more, or they'll join the ranks of mainframers who are becoming extinct," he says.

If you’ve worked with FIM, you are already behaving like an integrator.  FIM is the glue that binds lots and lots of systems together, so Darwin may forgive your script junkie tendencies.  For now…

So Long, XMA

Per MSDN the XMA has been deprecated, after a long run of fun MA developments.  The new and improved framework, ECMA 2.0, is the way to develop connectors going forward.  Times have really changed since the XMA was first introduced, and while the new framework is way more complete, I sure had fun (and lost hair) banging on XMAs trying to extend Sync into other systems.

The new framework delivers features that make connector development more developer-friendly, and raises the bar for IT Pros, but IMHO this is a good thing since the average IT Pro probably should not be developing connectors anyhow.  My opinion on that changes drastically when PowerShell is involved, which makes the PowerShell MA by Søren Granfeldt pretty exciting.

Sunday, October 07, 2012

PowerShell Summit North America 2013

I was a bit bummed to learn that The Experts Conference no longer had the PowerShell Deep Dive track.  Turns out I got lucky and the PowerShell Deep Dive morphed into a new community-owned and -operated event called the PowerShell Summit (lucky because I live near where the event is, and could ride my bike to attend it instead of flying).

The event will follow the same format as the PowerShell Deep Dive, whereby each session is just 35 minutes.  You can create an account on the event site, then log in to view the session proposals.  Session proposals will be voted on to determine who will be the next contestant on So You Think You Can Script.

Why am I excited about attending?  Every new thing I learn in PowerShell makes me better at my job.  Browsing the session proposals I can easily spot a dozen things I want to learn more about, in order to do my job better.  Actually I really just want to do my job FASTER so that I can hang out with my family, ride my bike, and play hockey, etc.

This looks to be an awesome event and I can’t wait to attend!

Friday, October 05, 2012

TEC 2012–I’m Speaking!

I’ve been honoured with the opportunity to speak at The Experts Conference in Barcelona!

This time I’m giving two talks:

FIM / PowerShell Deep Dive

Bring your big ears, I’m going to blast through what you need to know about PowerShell in order to make your FIM deployments better.  If you work with FIM, you NEED to come to this session.

FIM as an OData Endpoint

FIM is a powerful policy and workflow engine hiding behind a set of web services that are no picnic. 

PowerShell is a powerful automation and integration platform turning DevOps into reality.

Version 3.0 of PowerShell has just shipped and comes with a very interesting solution to FIM’s API challenges: the Management OData IIS Extension.  This feature enables IIS to expose your management data (think FIM), accessed through PowerShell, as OData Web service entities.  So instead of querying the FIM web service for person objects, you could use URLs like this:

Pretty cool, and that is just the GET verbs.  You can also map to the other verbs to support Create, Update and  Delete.
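For illustration only (the host, port and entity name below are invented, not from a real deployment), consuming such an endpoint from PowerShell 3.0 might look like:

### Hypothetical Management OData endpoint exposing Person objects
$uri = 'http://fimProxy:7000/FimOData/Person'

### Invoke-RestMethod is new in PowerShell 3.0 and parses the response for us
$people = Invoke-RestMethod -Uri $uri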

I’m looking forward to getting my demos together for this one!

Sunday, September 30, 2012

Keep Calm and Learn PowerShell

This week I installed Windows 8 on my main work desktop.  I was mostly enticed by Hyper-V on the client.  VirtualBox has served me surprisingly well for the past few years, but the Oracle splash screen was getting too hard to swallow.

Holy Kittens!  Windows 8 was not an easy UI to love at first.  The learning curve shocked me, and the amount of disorientation was a bit embarrassing.  When I first started using Windows 8 I couldn’t even shut down the machine; I had to open PowerShell to do so.
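For the record, the shutdown-from-PowerShell trick is just the built-in cmdlet:

### Shut down the local machine when the charms elude you
Stop-Computer

### Restart-Computer works the same way for reboots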

BTW – if you don’t already know PowerShell, yesterday was the time to start.  If you manage Windows computers, you MUST learn this.  Here’s a fun Lock Screen image to share this opinion.  Anyhow, on with the Win8 rant…

The move from iPhone to Windows Phone 7.5 took about a day to get over.  The move from Windows 7 to Windows 8 has taken me a few days, but I can forgive that learning curve because I do like the new OS, and enjoy the features it brings.  The shocking part for me is that I consider myself an experienced Windows user, and worry that the learning curve for the average Windows 8 user on computer with a keyboard and mouse will be, well, um, higher.  My hope is that people are delighted by the stability, features and improvements enough to forgive the learning curve this OS requires.  I know I am already happy, but even more excited to see what Surface feels like next month!

Wednesday, September 19, 2012

Group Expansion in the FIM Service

The FIM product is full of little features that can be fun to discover.  Just when you thought you knew it all…  Somebody asked me about using FIM Groups as owners of Groups, and I didn’t have the answer so I tried it.
#FAIL.  Turns out you cannot use the FIM Portal to add a Group as an owner of a Group. 
Then I dug a little deeper and tried it directly against the FIM Service (bypassing the FIM Portal, which is really just a client to the FIM Service).  The FIM Service happily accepted the request.  The script and resulting request are shown below.
### Add a Group as the Owner of another Group
New-FimImportObject -State Put -ObjectType Group -AnchorPairs @{DisplayName='testGroup828'} -Changes @(
    New-FimImportChange -Operation Add -AttributeName Owner -AttributeValue ('Group','DisplayName','GroupOwnersGroup')   
) -ApplyNow

The interesting thing about the request above is that it shows the type of object in the reference, indicating that we have added a Group to the Owner attribute.
The next thing I wondered is, what happens when the FIM Approval Activity processes a request where approval is required?  We know the Approval Activity refers to the Group owner attribute with XPath syntax, so what is it expecting when it finds a Group there instead of a Person?  Will it freak out and end this experiment?

### Add a user to the Group
New-FimImportObject -State Put -ObjectType Group -AnchorPairs @{DisplayName='testGroup828'} -Changes @(
    New-FimImportChange -Operation Add -AttributeName ExplicitMember -AttributeValue ('Person','DisplayName','newGroupMember1')   
) -ApplyNow
The result of this script is actually quite interesting.  The script just adds a member to a Group that requires owner-approval for new members.  This triggers an MPR that fires a Workflow with the Approval Activity.  The Group in question has two owners listed:
1. groupOwner1 (a person object)
2. GroupOwnersGroup (a group object)

The Approval object created by the Approval Activity shows that FIM is expanding the members of GroupOwnersGroup and placing them into the Approval as Approvers.  Pretty cool.  So the FIM Service expands groups for approval activities similar to how Exchange expands groups for mail delivery.
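A quick way to see the expansion for yourself is to pull back the Approval objects and look at their Approver values.  The filter below is an illustration, not the exact query from this experiment:

### Fetch Approval objects and inspect their Approver reference values
Export-FIMConfig -OnlyBaseResources -CustomConfig '/Approval' |
    Convert-FimExportToPSObject |
    Select-Object DisplayName, Approver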

The moral of the story here is that the FIM Portal is really just a client to the FIM Service, and that scenarios may be closer to working than the portal will have you believe.  In this case, I bet the Group RCDC can be modified to allow you to select Groups as owners, and the whole thing will probably work.  Probably.  The pessimist might wonder why it isn’t this way already in the product?  Is there a bug lurking in there that my little experiment dodged somehow?  Time will tell.  I know I’ll be looking into this a little deeper and writing a few test cases for it.

Does it work with Nesting?

Somebody responded and asked if this still works with group nesting, so I tried that too.  I added another group with two members, then added the new group to the scenario above and the Approval Activity still expanded all the way to the nested group’s members.

Wednesday, September 12, 2012

FIM Ducks the Axe

Phew!  That was a close one!  Seems Microsoft pruned heavily in the Forefront bushes today:

Microsoft axes many of its Forefront enterprise security products

The FIM suite survived, which is good given the momentum added to it with the R2 release and BHOLD.  Also good because I enjoy being a FIM MVP and imagine they wouldn’t need many of those if FIM got the axe.

Tuesday, September 04, 2012

Does Code Coverage Really Matter for FIM Deployments?

There is tremendous value in treating a FIM deployment like a software development project, but there is a balance to strike between dev and ops (devops anyone?).  Keep in mind, I qualify myself as more ops than dev!

Code coverage is good at measuring how much of a project’s code is covered by tests.  FIM as a product aims to be highly declarative, in theory making it easier to deploy without writing much code.  A FIM deployment (even one with lots of code) consists mostly of configuration.  Unfortunately there is no ‘configuration coverage’ tool for FIM, which makes it difficult to measure how much of a deployment’s configuration is covered by tests.

Code Coverage Doesn’t Provide Enough Value for the Typical FIM Deployment

My opinion is that code coverage by itself doesn’t provide enough value or raise quality enough for the average FIM deployment because most of the functionality is accomplished through configuration supported by some code.

Test Plans Become More Important

With or without code coverage, we’re supposed to have a nice little document that describes all of the tests we will perform on our deployment, and how we are going to do them.  The test plan needs to have tests for all of the functionality that is supposed to be in the solution.  The lack of code coverage puts more pressure on this document (and the resulting tests) to be more complete.  My suspicion here is that very few deployments see quality measured this way, but I bet most get a rubber stamp on testing!

Code Coverage for Finding Dead Code

A FIM deployment with lots of code has probably been produced and maintained by several developers (and even non-developers) over time.  Code coverage can be used in this scenario to simply show how much code is executed.  For example, you can instrument your DLLs, deploy them to your server and let it run for a day or two, then produce the code coverage report.  It will show code that is most likely not used, and can likely be removed.  Some of it will be obvious (properties and methods that are never called) but some of it could be code for use cases that rarely happen in production.
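As a sketch of that workflow with the Visual Studio profiling tools (the paths and file names below are examples; vsinstr.exe and vsperfcmd.exe ship with the Premium/Ultimate editions):

### Instrument the assembly for coverage, then start the coverage monitor
& vsinstr.exe /coverage 'D:\FimActivities\MyWorkflowActivities.dll'
& vsperfcmd.exe /start:coverage /output:'D:\Coverage\FimService.coverage'

### ...let the FIM Service run for a day or two, then stop collection
& vsperfcmd.exe /shutdown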

Can a Configuration Coverage Tool Be Built?

I thought about doing this for the Sync Engine and for the FIM Service a while ago, but decided against actually building it because there are probably very few people who would actually use it.  Maybe as FIM integrators move more towards devops it will become viable.  Hopefully so, since configuration coverage would be a really fun tool to build. 

Tuesday, August 28, 2012

Get the Email Address of the Current User

Ran into an issue with a scheduled job that was sending emails.  Our SMTP host does not allow messages where the FROM address does not belong to the user submitting the messages (so I can’t spoof my boss’s email).  The script I was scheduling will run on different servers using different credentials.  V1 of my script hard coded the FROM address, and those messages were being rejected by our SMTP host.  Instead of updating the scripts so that the FROM address matched the credentials of the Scheduled Job, I used this script snippet to discover the email address:


### Get the Email address of the current user
try
{
    ### Get the Distinguished Name of the current user
    $userFqdn = (whoami /fqdn)

    ### Use ADSI and the DN to get the AD object
    $adsiUser = [adsi]("LDAP://{0}" -f $userFqdn)

    ### Get the email address of the user
    $senderEmailAddress = $adsiUser.mail[0]
}
catch
{
    Throw ("Unable to get the Email Address for the current user. '{0}'" -f $userFqdn)
}



Tuesday, August 14, 2012

FIM Service PowerShell Module Samples

Finally had time to catch up on some documentation and added some samples to the documentation for the FIM Service PowerShell module.

These samples are meant to be more scenario based than the examples that are included in the function examples (and there are lots of those too BTW!).

Monday, August 13, 2012

Hidden Gem: FimSyncPowerShellModule.psm1

The interesting player statistic for me this year has been the low number of blog posts and the high number of CodePlex check-ins.  This seems to demonstrate that I am talking less and sharing more.  The catalyst here has been PowerShell.  I’ve become a huge fan of building tools with PowerShell, and perversely enjoy it.  It affords IT Pros like me the opportunity to produce and share useful code without TOO much serious development effort.

The past few years have been very busy and I’ve been able to post some very useful PowerShell functions.  Hopefully I’ll be able to catch up with documentation soon enough, because I don’t think staring at the script is the best way to learn how to use somebody else’s functions.  Luckily I’ve been pretty good about providing comment-based help as I’ve been posting the functions, so generating the documentation is actually quite simple.

This first batch of examples contains some really cool functions that taught me how to use PowerShell features that keep them rather compact.  Reading through the module you can probably tell that I’ve learned some lessons as I wrote it.  Many of the functions have been re-written a few times, and there are many more functions that are just too gnarly to post because I haven’t found a way to make them simple enough to share.

Anyhow, here is the first batch of documentation pages:

Updated FIM PowerShell WF Documentation

For those willing to RTFM, I’ve made some updates to the documentation for the FIM PowerShell Activity.  I’ve been using this thing exhaustively in my deployments, and have even delivered training on it at the TEC conference, so I figured it was time to lower the bar a little by providing some documentation.

Some highlights include:

  • Accessing FIM Request details from your PowerShell script
  • Adding items to the FIM WorkflowDictionary from your PowerShell script
  • Saving the Request details to XML
  • Dumping the Request details to the Event Log

Hey, Where’d the PowerShell Deep Dive Go?

A little bummed that the PowerShell Deep Dive no longer appears on the TEC Europe site.  I’ve been lucky enough to attend a lot of the sessions from the Deep Dive and was really impressed by the energy and enthusiasm, not to mention the great showing from the PowerShell product group and PowerShell community.  Looking forward to seeing where it pops up next!

Monday, July 23, 2012

Chasing References in FIM

Polyarchy has been one of my favourite tools that never shipped from MMS 3.0.  I was happy to see it mentioned by Kim Cameron in a really neat post lately:

Yes to SCIM.  Yes to Graph

Kim talks of the value of references in the data, mainly to show us the importance of the new Graph API but it also pertains to FIM 2010.  Today in FIM 2010 we can query the web service using XPath as the filter dialect.  It turns out to be pretty powerful in its ability to traverse references.  When the schema is designed correctly, the references in the objects can make easy work of rather sophisticated queries.  Unfortunately some of that sophistication is tamed for performance reasons in FIM Set definitions – but otherwise you can still issue some pretty useful queries.

For example, suppose a Request was submitted to create a Group object.  FIM gets busy creating objects to track the workflow and approvals.  On completion, we can use the relationships to easily get detail about the actors, such as this query to get the approver of the request:

$XPathFilter = @"
/Person[ObjectID = /Approval/Approver]
"@

Export-FIMConfig -only -CustomConfig $XPathFilter | Convert-FimExportToPSObject

The output of the above command is the Person object that approved the Request.

ObjectID    : urn:uuid:d51a311e-ehaa-eheh-98c3-c788b4b55154
AccountName : hoofHearted
CreatedTime : 7/23/2012 6:09:24 PM
Creator     : urn:uuid:306f4a58-ec2c-4a6b-aa9a-6b34ee7588d3
DisplayName : Hoof Hearted?
Domain      : IceMelted
Email       :
ObjectType  : Person
As much as I’ve enjoyed learning about XPath, I can’t believe it has an enduring future in the product.  If FIM follows AD then I believe the time spent with WS.* will be short lived as the Graph API is all about OData/REST.  It seems the journey has been LDAP –> XPath/WS.* –> OData/REST.

Monday, July 16, 2012

APIs are the new integration platform

There is huge momentum behind ‘APIs’ which seems obvious from a software point of view, but APIs are kinda like the new ODBC in that you can easily connect to any number of things with common protocols and encoding. 

From a metadirectory standpoint this is really neat because the integration/reach story has always been the management agent for me.  Today the API plays a key role in the integration/reach story (How many things can you connect to?  Do you have an MA for X?).  This should shift the value of the metadirectory to the integration features it provides, instead of just the number of things it can connect to.

Anyhow, here’s a neat sample of using an API to connect to Bing to provide search results.  There are a huge number of public/free APIs to choose from, including some that come from Microsoft server products like Windows and Active Directory.

Using PowerShell to Query the Bing API

Monday, July 02, 2012

Dell to Acquire Quest Software

I am a HUGE fan of the TEC conference put on by NetPro, then Quest and, hopefully in the future, Dell:

Dell to Acquire Quest Software

Quest is a special company because of their cool products and their contributions to the PowerShell community.  Looking forward to seeing Dell make this even better in the future.

Microsoft Announces Winners and Finalists of the 2012 Partner of the Year Awards

Microsoft partners work hard to get nominated, so congrats to all.  I’ve been watching these for a few years now and this year the results seem different (except for Oxford of course).  As a metadirectory historian I find this year’s results interesting for a few reasons:

  1. There are a lot of systems integrators (sometimes unsung/unknown) doing FIM work (yay!)
  2. There is a trend for LARs to do SI work (who knew CDW would deploy FIM for you?)
  3. FIM is leaving its incubation/niche nest

From the press release:

Identity and Security Partner of the Year

· Winner: Itergy

· Finalist: CDW Corp.

· Finalist: Oxford Computer Group Ltd.

The Identity and Security Partner of the Year Award recognizes the partner that has delivered end-to-end security, identity and access solutions enabling customers to achieve their business goals while managing risk and helping to ensure that the right people always have secure access to the information they need to get their jobs done. The winning solution used Microsoft’s security, identity and access products, technologies and solution accelerators, including, but not limited to, the following:

· Microsoft Forefront Endpoint Protection

· Microsoft Forefront Protection for Exchange Server

· Microsoft Forefront Online Protection for Exchange

· Microsoft Forefront Protection for SharePoint

· Microsoft Forefront Security for Office Communications Server

· Microsoft Forefront Threat Management Gateway

· Microsoft Forefront Unified Access Gateway

· Microsoft Forefront Identity Manager

The winner has dramatically transformed the security of a customer’s IT infrastructure resulting in higher levels of protection and compliance, reduced IT labor or hardware costs, or streamlined overall operational efficiency.

Neat Article on Hierarchy in SQL Server

Here is a neat article on Hierarchy in SQL Server:

Hierarchies: Convert Adjacency List to Nested Sets

It is relevant to FIM because both the FIM Service and FIM Synchronization Engine deal with sets and hierarchy, and sometimes we do this hierarchy processing outside of FIM (pre-processing) because we need to handle hierarchy differently than the engines do today.

A really great FIM blog post would be to share how FIM processes hierarchy internally, but I’m just not that deep in SQL to go digging for the answer.
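To make the adjacency-list model concrete, here is a small PowerShell sketch (sample data invented) that walks parent/child rows recursively, the same shape of hierarchy the article converts into nested sets for faster set-based querying:

```powershell
# Adjacency list: each row points at its parent (sample data, invented).
$rows = @(
    [pscustomobject]@{ Id = 1; ParentId = $null; Name = 'All Staff' }
    [pscustomobject]@{ Id = 2; ParentId = 1;     Name = 'Engineering' }
    [pscustomobject]@{ Id = 3; ParentId = 1;     Name = 'Sales' }
    [pscustomobject]@{ Id = 4; ParentId = 2;     Name = 'Quality' }
)

# Depth-first walk: emit each node indented by depth, then recurse into children.
function Show-Tree ($ParentId, $Depth = 0) {
    $rows | Where-Object { $_.ParentId -eq $ParentId } | ForEach-Object {
        ('  ' * $Depth) + $_.Name
        Show-Tree -ParentId $_.Id -Depth ($Depth + 1)
    }
}

Show-Tree -ParentId $null
```

This is the per-row recursion the nested-sets technique avoids: by precomputing left/right bounds, an entire subtree comes back from SQL in a single range query instead of one query per level.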

This also makes me wonder what the relationship is to .NET 4.0 and the DLR (Dynamic Language Runtime).  We see some great uses of this in PowerShell 3.0 in the form of Intellisense (freakishly awesome!).  While I don’t have a lot of hierarchy challenges in front of me today, I would be really curious to solve them with the ‘blinding speed’ of SQL and the awesomeness of PowerShell.

Tuesday, June 26, 2012

Identity and Access Partner Summit

Attended some of the Microsoft IDA Partner Summit today, and am excited about the level of change coming from Microsoft.  NT4 to Windows 2000 brought us huge leaps forward, and demanded us IT Pros ramp up on more technologies and protocols.  Server 2008 to Azure looks like this decade’s huge leap forward.  In fact I think it was mentioned in jest that they are interested in talking to anybody interested in port 389 on Azure Active Directory.

Many problems are solved in the next releases from Microsoft, but there are always new problems created and fun gaps to fill with integration glue.  

I expect most of what was presented today is also available from recorded TechEd presentations.  Unfortunately I don’t have that link handy, but Bing should make quick work of it!

Monday, June 04, 2012

PowerShell 3.0 Release Candidate

There is some really exciting integration glue in PowerShell (including Workflow!) and we now have the Release Candidate available for download:

Windows Management Framework 3.0 – RC

I’ve been running the CTP on my laptop, and some servers and have been really happy.  Looking forward to installing the RC now to see some bugs fixed, and some of the newer functionality we saw from TEC 2012 in San Diego.

Saturday, June 02, 2012

FIM 2010 R2 and BHOLD Now Available on MSDN

Very cool, new toys!  It’s been tough getting access to the BHOLD stuff from Microsoft, but now it is shipped and available on MSDN.

I’m now busy planning on removing workarounds for FIM 2010 issues that are now solved in R2 (yay!  less Craig-hacks!) and salivating over the new Filter-Based OSRs, which is going to allow me to do some configuration clean up, and say good-bye to a lot of EREs.  Not to mention the whole new product now available, BHOLD.  Gonna be a fun summer!

Wednesday, May 09, 2012


Gotta love the discoverability of PowerShell.  It is so easy to find out what the platform can do for you, all without looking at the documentation.

Today I was importing a FIM Sync configuration using the trusty MIIS.MA.Config PowerShell snap-in (that has been around since ILM 2007, and still works with FIM despite the legacy name!).  I tend to press TAB instead of typing full command names in PowerShell, and this morning I pressed TAB too many times and to my surprise there was a new command!
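That TAB-driven discovery can also be done explicitly.  A quick sketch, assuming the snap-in loads under its shipped name, that lists every command the snap-in provides:

```powershell
# Load the Sync engine configuration snap-in (legacy name, still valid).
Add-PSSnapin MIIS.MA.Config

# List every command the snap-in provides; new cmdlets show up here too.
Get-Command -PSSnapin MIIS.MA.Config

# Then read the built-in help for anything unfamiliar.
Get-Help Set-MIISFIMMAConfiguration -Full
```

Running Get-Command after a product upgrade is a cheap way to spot cmdlets that arrived without fanfare, exactly the situation described above.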

This new cmdlet allows scripted configuration of a FIM MA, including setting the password.  This is one of the new things that I had not been automating in my deployments, and now I can get to 100% automation for more scenarios.


BTW – I assume this cmdlet is only in the R2 version of the snap-in. 

    Set-MIISFIMMAConfiguration [-AuthenticationMode <String>] -Credentials <PSCredential> [-DatabaseName <String>]
    [-DatabaseServer <String>] [-FIMServiceBaseAddress <String>] [-MAName] <String> [-WarningAction <ActionPreference>]
    [-WarningVariable <String>] [-WhatIf] [-Confirm] [<CommonParameters>]



    -AuthenticationMode <String>

        The authentication mode for the FIM service database connection.
        Required?                    false
        Position?                    named
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -Credentials <PSCredential>

        Credentials for authenticating with the FIMService database.
        Required?                    true
        Position?                    named
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -DatabaseName <String>

        The FIM Service database name.
        Required?                    false
        Position?                    named
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -DatabaseServer <String>

        The FIM Service database server.
        Required?                    false
        Position?                    named
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -FIMServiceBaseAddress <String>

        The FIM service base uri.
        Required?                    false
        Position?                    named
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -MAName <String>

        The name of the management agent to update. Get this name from the MIIS tool.
        Required?                    true
        Position?                    1
        Default value               
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -WarningAction <ActionPreference>

        Required?                    false
        Position?                    named
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -WarningVariable <String>

        Required?                    false
        Position?                    named
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -WhatIf [<SwitchParameter>]

        Required?                    false
        Position?                    named
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -Confirm [<SwitchParameter>]

        Required?                    false
        Position?                    named
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false


        This cmdlet supports the common parameters: Verbose, Debug,
        ErrorAction, ErrorVariable, WarningAction, WarningVariable,
        OutBuffer and OutVariable. For more information, type,
        "get-help about_commonparameters".






    --------------  EXAMPLE 1 --------------

    C:\PS>Set-MIISFIMMAConfiguration -AuthenticationMode integrated -Credentials <credential object> -DatabaseName FIMService -DatabaseServer localhost -FIMServiceBaseAddress http://localhost:5725 -MAName "Fabrikam FIM MA"

    Configures the FIM MA database connection info and FIM service URI.


Monday, May 07, 2012

FIM PowerShell Modules Released

Recently I delivered a workshop at The Experts Conference in San Diego.  The delivery lasted four hours; all involved are healthy and the proud recipients of some PowerShell modules.  The modules are now released on CodePlex:

The star of the show is really the FIM Service module.  It depends heavily on the FimAutomation PowerShell Snap-In which ships with FIM.  This buys the module a nice support story since the cmdlets in that snap-in are released and supported by Microsoft.  While the cmdlets are intended for configuration migration, the functions in the module demonstrate that they can also be used to accomplish CRUD operations against the FIM Service, and be combined for simple yet powerful automation scenarios in FIM.
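For example, the “R” in CRUD can be sketched with nothing but the stock snap-in (the XPath filter here is just an illustrative query):

```powershell
# Load the snap-in that ships with the FIM Service.
Add-PSSnapin FIMAutomation

# Read: export objects matching an XPath filter from the FIM Service.
$people = Export-FIMConfig -OnlyBaseResources -CustomConfig "/Person[AccountName='hoofHearted']"

# Each exported object carries its attributes as ResourceManagementAttributes,
# so they unwrap into name/value pairs for display or further processing.
$people | ForEach-Object {
    $_.ResourceManagementObject.ResourceManagementAttributes |
        Select-Object AttributeName, Value
}
```

The module’s functions wrap this unwrapping step so callers get friendly objects instead of the raw export schema, but underneath it is exactly this supported cmdlet doing the work.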

Everybody using FIM should be doing automation like this, and I hope the modules illustrate how PowerShell amplifies skill sets and extends the reach and extensibility of products like FIM.

If you like the module then please file a review on CodePlex.  If you don’t like the module, then maybe skip the review and instead please file bugs for any issues you find.  Happy automating!

Monday, April 23, 2012

FIM Reporting

Very interesting how a feature gap can spur partners into action.  Here is a neat looking solution from some experts (including a fellow MVP!) in Poland:

Predica: FIM Reporting

Friday, April 13, 2012

FIM Scripting Workshop at TEC Sold Out!

Wow, I’m frankly shocked that so many people are interested in automation and extensibility.  Guess it is too late to up the bribe – I’m bringing TWO books for each workshop attendee:

I decided to bring eBooks because I don’t trust myself to NOT mess up the shipping!

Using PowerShell to Query for FIM Requests with DateTime

Just posted this to the TechNet Wiki:

How to Use PowerShell to Export Requests Since a Given DateTime
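The general shape of that wiki article can be sketched like this (the one-day cutoff is an arbitrary example; note that FIM’s XPath dialect expects dateTime literals with milliseconds):

```powershell
Add-PSSnapin FIMAutomation

# FIM XPath dateTime literals use this exact format, including milliseconds.
$since = (Get-Date).AddDays(-1).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.000")

# Export all Request objects created after the cutoff.
$filter = "/Request[CreatedTime > '$since']"
$requests = Export-FIMConfig -OnlyBaseResources -CustomConfig $filter
```

Getting the dateTime string format right is the trick; a malformed literal fails the whole query rather than just returning zero results.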