Wednesday, November 23, 2011

FIM 2010 R2 RC is Available

Fresh off the press:

Forefront Identity Manager 2010 R2 Release Candidate Now Available

FIM 2010 R2 includes the big new reporting feature based on System Center (I’m more partial to PowerShell reporting of course).  This release also includes fixes to the FIM PowerShell snap-in, such as the ability to import DateTime attributes.

Wednesday, October 26, 2011

Quit Blocking My Pipeline!

A really cool feature of PowerShell is the async nature of the pipeline, whereby the first command outputs to the next command as soon as it has any results (so BEFORE it is done collecting ALL of its results).

Sometimes because of filtering or selection you only want a subset of the results from the previous command.  In these cases it is nice to not have to wait for that first command to finish, but if that command blocks the pipeline you get to wait for it to complete.

This is one of the annoyances of Export-FimConfig.  Though it is a very useful cmdlet it does block the pipeline.  You can see from my script below that I am only looking for the first item from the output.  I can only get that first item AFTER all the results are returned.
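The shape of the problem looks something like this (the query here is just an example): even though only the first object is wanted, nothing reaches Select-Object until Export-FimConfig has finished collecting everything.

### Only the first Person is wanted, but Export-FimConfig blocks the pipeline,
### so Select-Object sees nothing until the entire result set has been collected
Export-FimConfig -CustomConfig "/Person" |
    Select-Object -First 1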

If this is an annoyance for you too then feel free to vote for the feedback on Connect.

A reasonable workaround is to instead write your own cmdlet to replace Export-FimConfig (such as the FIM cmdlets on CodePlex thanks to Quest), but it sure would be nice for this to just work.


Monday, October 24, 2011

FIM Protocol Docs

I seem to mention these things a lot, but never show where to find them.

The first hint was from Joe Schulman (ex-FIM PM)

As a small part of the announcement about Office 2010 Technical Preview, Microsoft published the Office 2010 protocol documents. A small and dedicated group of us worked hard and long for these past months to release documents related to FIM’s protocols on time. Of particular interest to this audience may be the FIM web service protocol document.

The protocol documents are part of Microsoft’s earnest commitment to interoperability. More information about this program can be found here.

There happen to be quite a few protocol documents for FIM; each document name starts with “User Profile Synchronization (UPS)”.  Why THAT name?  My guess is that these were published because SharePoint took a dependency on FIM when they delivered their component for User Profile Synchronization.  Anyhow, here is the list:

The most interesting protocol document in terms of deployment automation is MS-UPSCDS (Configuration Data Structure) since it details the ma-data and mv-data XML.  Get those right and you can manage the sync engine via the FIM Service, that’s the theory anyway…

Tuesday, October 11, 2011

TEC 2012 in San Diego!

I am a huge fan of Vegas, but am equally excited about TEC in San Diego next year.  While I’m sure there’s great mountain biking in the area, I’m also pretty excited to see what’s in store for the PowerShell Deep Dive; by then we might even get some PowerShell 3.0 sessions.

Thursday, October 06, 2011

Are You Master of Your FIM Domain?

FIM enjoys this strange place somewhere between the skillset of an IT Pro and the skillset of a Developer.

The product used to require great efforts (and maybe some goats) for a successful deployment.  FIM 2010 promised to reduce the effort by adding in declarative provisioning, but the result has to be weighed against the added complexity of the new components in FIM 2010 (workflow, portal, SSPR, and soon SSRS and System Center).

A comment from this blog post indicates what might happen when the weight of a product’s complexity is just too much:

Deep domain knowledge is frequently transferred. It takes a little while, but a team’s existing experts are usually pleased to instruct an outstanding new team member. If they aren’t or an area is far too complex for even sharp people to comprehend, then you’ve got other serious problems.

If you’ve mastered FIM then you are indeed an expert with deep domain knowledge.  For over a decade that expertise has been in high demand.  You could almost relate the complexity of the product to the demand for our expertise.  The comment above is a reminder that there is a limit to just how complex a solution can be before it becomes a problem itself.

The moral of the story is that we need to apply practices such as those in the blog author’s book in order to reduce the complexity of FIM deployments, and also improve the stability of these deployments. 

Thursday, September 29, 2011

Can’t Use XPath Contains Function to Query Sets in FIM

This kinda surprised me.  I tried to do this and it failed:

###
### Find all Sets where the Filter uses 'myAttribute'
###

Export-FIMConfig -CustomConfig "/Set[contains(Filter, 'myAttributeName')]"

Thinking I had a problem with my syntax (the usual suspect) I changed the attribute from Filter to DisplayName and the query succeeded.  It wasn’t the query that I wanted, but it demonstrated that my syntax was correct.  I guess this means you can’t make this query in FIM.  This was a bit annoying, but there is a simple workaround.

The workaround is to just export ALL the Set objects, then use the PowerShell Where-Object cmdlet to do the filter.  Way more expensive, but who cares?  I’m not doing this all that often and I take slight pleasure in punishing the service for not accepting my leaner query to start with.

Here is the workaround script:

###
### Find all Sets where the Filter uses 'myAttribute'
###

Export-FIMConfig -CustomConfig "/Set" |
    Convert-FimExportToPSObject |
    Where-Object {$_.Filter -ilike "*myAttribute*"}

NOTE: find the Convert-FimExportToPSObject here.

Tuesday, September 27, 2011

Debugging FIM Workflows

Most blog posts enjoy the theme of sharing something that was hard to learn.  This blog post shares nothing but failure, except for the small glimmer of hope that somebody will point out that I am just missing something obvious.

Windows Workflow Foundation does a nice job of providing a canvas for drawing code.  It is a visual representation of what we would otherwise have done in code.  This visual representation is very convenient when debugging (or just learning) workflows.  The ability to set a breakpoint on an item on the workflow canvas is a very intuitive experience.

This is the thing I have not been able to figure out.

The FIM Debugging Guidance on MSDN only shows how to debug managed code in a FIM WF.  What I’m really after is what Bahram asked for back in the RDP: debugging the WF code type.

What I really want to do is watch the debugger through the WF designer, as opposed to setting a breakpoint in the code then trying to correlate.

If anybody has been able to figure this out, I’d love to know the secret!

BTW – it is not a case of missing symbols or selecting the wrong code type.  I can demonstrate breakpoints working in the same class inside code activities, but failing for the actual WF breakpoint. Arg!

Using RegEx to Validate FIM Service GUIDs

Sometimes it is useful to validate a GUID before using it to script against FIM.  I’ve been using the regex pattern below in PowerShell scripts, and parameter validation scripts.

###
### Regex pattern to test a UUID from FIM
###

$regExPatternForFimUuid = "^(urn:uuid:){0,1}[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"

### Test 1 - should return TRUE when GUID is prepended with 'urn:uuid:'
'urn:uuid:fea6a1cc-0ee3-4aa6-aba7-ad339d6cab5f' -match $regExPatternForFimUuid

### Test 2 - should return TRUE when GUID is not prepended with 'urn:uuid:'
'fea6a1cc-0ee3-4aa6-aba7-ad339d6cab5f' -match $regExPatternForFimUuid

Monday, September 26, 2011

debug.MakeCurrentUserAdministrator

<WARNING>

!!! DO NOT DO THIS ON A PRODUCTION FIM SERVER !!!

</WARNING>

If you have accidentally whacked your FIM administrator object in the FIM Service, there is a rescue utility in the FIM Service database in the form of a stored procedure named ‘debug.MakeCurrentUserAdministrator’.

From what I can see, this procedure will add the current user to the Administrator Set in the FIM Service database.  It does not require any parameters, so is quite easy to execute. 
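A minimal sketch of running it from PowerShell (the server name is a placeholder, FIMService is just the default database name, and Invoke-Sqlcmd assumes the SQL Server PowerShell bits are installed):

### Lab-only rescue: run the stored procedure as the user who needs admin rights restored
Invoke-Sqlcmd -ServerInstance 'FIMSQL01' -Database 'FIMService' -Query 'EXEC [debug].[MakeCurrentUserAdministrator]'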

Caveats:

1. The current user must exist in the FIM Service database already

2. The operation is not logged in the FIM Service request history

This is a good rescue utility for a lab environment, but I would not use it on a production server because I’m pretty sure Microsoft doesn’t support it.

PowerShell at the Core of Microsoft Server 8

‘Windows’ was intentionally dropped from the title here.  This article is a good read, and a demonstration of how serious Microsoft is about PowerShell.  If you were waiting to learn PowerShell, the time to stop waiting is NOW.

Windows Server 8: The Microsoft Server Fork

Some interesting points from the article:

  • Microsoft’s lead server architect is also the “inventor” of the PowerShell scripting methodology, whose command list will exceed 2300 native commandlets in Windows 8
  • Each Windows 8 version can be strongly PowerShell-controlled, and optionally with traditional GUI
  • Windows Server administration was encouraged to be PowerShell-driven, rather than through the maze of administrative GUIs that have been the mainstay of Windows Server versions for nearly two decades

Friday, September 23, 2011

Microsoft Adds BHOLD Technology Assets

Pretty neat announcement, and I’m excited to see what happens.  If the BHOLD stuff just gets lobbed into a FIM box with a new splash screen then I think that would be a fail.  In fact, since GRC is already growing in System Center you could argue FIM + BHOLD is the wrong pairing.

When Zoomit was acquired back in 1999, the initial release of Via amounted to some rebranding over mostly the same code; it wasn’t until the next major release in 2003 that we saw true integration into the Microsoft platform.

When Alacris was acquired and added to the ILM box we had the same kind of initial release (mostly branding changes), but the next major release did not enjoy major changes or integration (just ask Brian Komar to spot the differences…).  Instead FIM today is composed of three products (Sync, Service and Certificate Management), each with their own set of APIs, databases, user interfaces, policy engines, etc.  While the number of features is compelling, the cohesion could be better (FWIW – I am NOT saying that competitor products are any better, just that I have a higher expectation of software from Bill’s software hut).

My expectation in the short term is that BHOLD will simply be rebranded.  Longer term I hope the technology is truly integrated into FIM or System Center, or into some fascinating Azure concoction.

For good TV, think about what happens to the other ISV partners that provide the same technology to Microsoft customers.  What happens when the 800 pound gorilla makes THIS step?

The TEC conference in Germany next month just got a little more interesting…

Update: Kinda neat, this blog gets a mention in BHOLD's press release.

Wednesday, September 21, 2011

PowerShell 3.0 Workflows

An early version of PowerShell 3.0 is available for download:

The next version of PowerShell is exciting enough, but THIS is just too much:

Windows PowerShell 3.0
Some of the new features in Windows PowerShell 3.0 include:

  • Workflows
    Workflows that run long-running activities (in sequence or in parallel) to perform complex, larger management tasks, such as multi-machine application provisioning. Using the Windows Workflow Foundation at the command line, Windows PowerShell workflows are repeatable, parallelizable, interruptible, and recoverable.

I really, really love PowerShell because it is a strong automation and integration platform, which is basically what we use FIM for.  Oh, and it is actually FUN to use.  Sometimes I joke that FIM is just a few good PowerShell cmdlets away from being replaced entirely by PowerShell.  That sounded funny to me when I first said it a couple years ago, but it seems a little more real now that PowerShell supports workflows.
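To make that a little more concrete, here is a trivial sketch of what a workflow looks like in the CTP (the names are made up, and the syntax may well shift before release):

### A workflow is declared like a function but runs on Windows Workflow Foundation;
### activities inside a parallel block run concurrently
workflow Get-LabServerState
{
    parallel
    {
        Get-Service -Name FIMService
        Get-Process -Name miiserver
    }
}

Get-LabServerState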

Hopefully I’ll be able to demonstrate this at TEC in Frankfurt next month during my AD PowerShell session, which is basically about using PowerShell for AD automation and integration when you can’t use FIM.  Today that might be the poor man’s replacement for FIM.  Tomorrow that might just be the replacement for FIM, who knows…

Tuesday, September 13, 2011

Find Permission Granting MPRs

When troubleshooting sometimes I need to find the MPR that grants permission to an attribute.  The script below just issues a query to FIM to find the MPRs that grant access to the attribute.

‘ActionParameter’ is an interesting case because on the surface it looks like it should be a reference, because the UI provides a dialog that resembles the identity picker.  The attribute is not a reference though, as you can see in the output below it comes out as a string.  Compare this to the other attributes in the MPR that are indeed references, such as Creator and PrincipalSet.

In the sample below I use an extra variable to stretch out the XPath filter.  I find this much easier to read, instead of cramming the filter into a one-liner.

$filter = @"
/ManagementPolicyRule
[
  ActionParameter = 'HasAccessToStuff'
  and
  GrantRight = 'True'
]
"@

Export-FIMConfig -OnlyBaseResources -CustomConfig $filter |
    Convert-FimExportToPSObject

ObjectID                 : urn:uuid:7a797e38-ad64-4001-8c24-9a872826c2d4
ActionParameter          : {AccountName, HasAccessToStuff, HoofHearted}
ActionType               : {Modify}
CreatedTime              : 9/8/2011 4:35:26 PM
Creator                  : urn:uuid:7fb2b853-24f0-4498-9534-4e10589723c4
Description              : This MPRS grants permission to IceMelted
DisplayName              : HoofHearted can Modify Access to stuff and things
GrantRight               : True
ObjectType               : ManagementPolicyRule
PrincipalSet             : urn:uuid:25a42597-1b6b-4221-b7d4-63a0a8b6a2b0
ResourceCurrentSet       : urn:uuid:8887df8e-6e84-49f2-a794-f9e9802077e0
ResourceFinalSet         : urn:uuid:8887df8e-6e84-49f2-a794-f9e9802077e0
ManagementPolicyRuleType : Request

Tuesday, September 06, 2011

PowerShell Deep Dive Content from TEC 2011 in Vegas

The most exciting part about TEC this year is the new PowerShell Deep Dive track.  The cult-like following once enjoyed by the metadirectory seems to be alive and well in the PowerShell community, and was finely gathered at TEC in Vegas.  For those going to TEC in Frankfurt, you’re in luck as there will again be a PowerShell Deep Dive track.

Anyhow, Dmitry has been kind enough to post the PowerShell Deep Dive Content for those that missed it.

Oh, also posted are the abstracts for the PowerShell Deep Dive track in Frankfurt.

Tuesday, August 30, 2011

Using PowerShell to Modify a FIM RCDC

Ever need to automate the deployment of a new UI control to a FIM Portal?

A quick search of the FIM wiki for Resource Control Display Configuration will show you the pre-requisites you need to appreciate before fully grasping the challenge at hand :-|

In short: RCDCs are a handy way of allowing FIM Portal customization by adding controls as XML stuffed into an attribute on an object that resides in the FIM Service.  One challenge here is that the XML can be rather large, and it is not validated on import, which leads to some pretty fun troubleshooting.  Using the script approach below you can keep people from doing this manually by handing them a script that automates the process.

The approach I took with this script was to edit an RCDC in place by adding the new control’s XML to the existing RCDC XML.  I could have finished a lot sooner with a heavier hand if I’d just replaced the whole RCDC XML, instead of grafting my single control in.  There’s something to be said for finishing earlier but I wanted to see if this could easily be done, and that is what I’m trying to show in this blog post.

Working with Namespaces
The main challenge I found in this script was working with namespaces.  RCDC schema uses a few namespaces declared at the top of the document.  Adding a new control by hand you don’t typically include these namespaces because they are already at the top of the document.  I couldn’t find an easy way to do this, so I had to include the namespace declaration in my new control’s XML.  It is redundant but AFAIK it is not incorrect (at least it doesn’t seem to make the server angry).  Skipping this little trick took the simplicity out of the script and made it pretty ugly because I really wanted to use the XML functionality in .NET instead of doing string manipulation.

There’s one trick/function in the script below that I haven’t posted about yet.  It is basically another wrapper for Import-FimConfig.  Look for more on that later as I want to share and demo it at TEC Europe.

Anyhow, on with the script.
###
### Update the RCDC for User Edit
###

### Get the existing User Edit RCDC

$userEditRcdc = Export-FIMConfig -OnlyBaseResources -CustomConfig "/ObjectVisualizationConfiguration[DisplayName='Configuration for User Editing']" | Convert-FimExportToPSObject
[XML]$rcdcXml = $userEditRcdc.ConfigurationData

### This is the new control we want to add
$newRcdcControl = 
@"
<my:Control
xmlns:my="http://schemas.microsoft.com/2006/11/ResourceManagement"
    my:Name='MyNewCheckBox'
    my:TypeName='UocCheckBox'
    my:Caption='HoofHearted?'
    my:Description=''
    my:RightsLevel='{Binding Source=rights, Path=isStinky}'>
    <my:Properties>
        <my:Property
            my:Name='Checked'
            my:Value='{Binding Source=object, Path=isStinky, Mode=TwoWay}'/>
    </my:Properties>
</my:Control>
"@


### Put the new Control into an XML fragment
$fraggle = $rcdcXml.CreateDocumentFragment()
$fraggle.InnerXML = $newRcdcControl

### Find the Tab where we want to place this Control
$WorkInfoTab = $rcdcXml.ObjectControlConfiguration.Panel.Grouping | Where-Object {$_.Name -eq 'WorkInfo'}

### Find the Control to place this one AFTER
$namespace = @{my="http://schemas.microsoft.com/2006/11/ResourceManagement"}
$EmployeeIDControl = Select-Xml $rcdcXml -XPath "//my:Control[@my:Name='EmployeeID']" -Namespace $namespace

### Insert our new control
[Void]$WorkInfoTab.InsertAfter($fraggle,$EmployeeIDControl.Node) 

### Update the RCDC in FIM with our updated XML
$rcdcUpdate = New-FimImportObject -ObjectType ObjectVisualizationConfiguration -State Put `
    -AnchorPairs @{DisplayName='Configuration for User Editing'} `
    -Changes @(
        New-FimImportChange -Operation Replace -AttributeName ConfigurationData -AttributeValue ($rcdcXml.OuterXml)
    )

$rcdcUpdate | Import-FIMConfig

###
### Cycle FIM and IIS
###

Restart-Service fimservice
iisreset

Before



After (notice the new HoofHearted item)


Going to TEC 2011 Europe!

Woo-hoo!  This time I’m giving talks on:

  • Extending and Automating FIM with PowerShell
  • Managing Active Directory with PowerShell
  • Using SQL Reporting Services to Expose PowerShell Script Output

Needless to say I’m pretty excited about the PowerShell Deep Dive track, and hope to spend a lot of time learning from the extremely high concentration of PowerShell talent and excitement. 

If you’re a FIM integrator in Europe, then this is a must-attend conference.  You won’t find a more concentrated bunch of integration and automation folks in such an accessible setting.

Found this great testimonial on the event site:

“The best analogy I have for TEC is the video for “No Rain” from Blind Melon where that funky little bee-girl runs around seemingly confusing people as she dances around dressed like a bee in tap shoes. Identity and Access is just like that, we spend all year telling people about it, customers eventually get it, relatives just smile, and spouses do their best, but at TEC we find ourselves surrounded by people speaking the same jargon even if their native language is different, our acronyms are harmonic.”

- Craig Martin

Wednesday, August 17, 2011

Linking to an Excellent Post on Testing

I really like this post so thought I’d pass it on…

Challenges in Test–Proving your feature

Testing FIM for customer deployments is near and dear to my heart because it is a target rich environment for automation (PowerShell!!!).  I find it fascinatingly difficult to motivate anybody to spend time on it until things are foo-bar.

Friday, July 15, 2011

Querying for Pending or Finished FIM Requests

Just a follow-up to a previous post.  Here are two sample FIM XPath filters based on the RequestStatus values.

###
### Find pending requests
###

$xpathFilter = 
@"
/Request
    [
            RequestStatus != 'Denied'
        and RequestStatus != 'Failed'
        and RequestStatus != 'Canceled'
        and RequestStatus != 'CanceledPostProcessing'
        and RequestStatus != 'PostProcessingError'
        and RequestStatus != 'Completed'
    ]
"@

$requests = Export-FIMConfig -CustomConfig $xpathFilter

###
### Find finished requests
###

$xpathFilter = 
@"
/Request
    [
           RequestStatus = 'Denied'
        or RequestStatus = 'Failed'
        or RequestStatus = 'Canceled'
        or RequestStatus = 'CanceledPostProcessing'
        or RequestStatus = 'PostProcessingError'
        or RequestStatus = 'Completed'
    ]
"@

$requests = Export-FIMConfig -CustomConfig $xpathFilter

News on FIM Plans for Cloud Integration

At its core, FIM is just an integration tool, and identity management has always been about the applications.  This is an exciting time for FIM and identity management because ‘to the cloud’ seems to be great motivation to drive change into applications that are normally really hard to change.

My overall take is:

  1. Federate where you can
  2. Synchronize where you must

But it is never this simple, and tools like FIM will always be required so it is good to hear that Microsoft has plans for FIM to make cloud integration easier.

To me this is analogous to FIM’s early days when Active Directory was the big push, and FIM (then called MMS) was used to accelerate the adoption of Active Directory by getting application data into the DS, thereby making it a reasonable target for other applications (remember directory-enabled applications?).  If I ruled the world (or at least a significant amount of Microsoft) then I’d again position FIM in this way, only this time using it to accelerate adoption of Azure/Office365 by easing (or outright eliminating) the pain of the hybrid enterprise.

Anyhow, maybe the new article above is an indication of Microsoft moving FIM in this direction.  Time will tell, at least until I rule the world.

FIM Service RequestStatus – What Are All the Possible Values?

I’m writing a small function to tell me when a FIM Request has completed, so need to know the possible status values of a FIM Request.  Some digging around in Visual Studio using Class View unearthed an Enum for this in the FIM DLL.

The PowerShell one-liner below will display all the possible Request status values (called RequestStatusType).

 

[Enum]::GetNames([Microsoft.ResourceManagement.WebServices.WSResourceManagement.RequestStatusType])
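Note that the type only resolves once the FIM assembly is loaded into the session; a minimal sketch (the install path below is the default and is an assumption for your environment):

### Load the FIM Service assembly so the enum type resolves (default install path - adjust as needed)
Add-Type -Path 'C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\Microsoft.ResourceManagement.dll'

[Enum]::GetNames([Microsoft.ResourceManagement.WebServices.WSResourceManagement.RequestStatusType])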

Below are the results of the one-liner, showing all the possible FIM Request status values.  HINT: six of these (Denied, Failed, Canceled, CanceledPostProcessing, PostProcessingError and Completed) represent a FIM Request that is done (it can’t reach any other status).

  • Denied
  • Validating
  • Canceling
  • Validated
  • Authenticating
  • Authenticated
  • Authorizing
  • Authorized
  • Failed
  • Canceled
  • Committed
  • CanceledPostProcessing
  • PostProcessing
  • PostProcessingError
  • Completed
  • NotFound

Thursday, July 07, 2011

Convert a FIM ExportObject to a PowerShell PSObject

Working with the output from Export-FimConfig is not always fun because you have to dig hard to get attributes off an object.

This function I just whipped up will convert a FIM ExportObject to a PowerShell PSObject.  The advantage here is that you can then use dot notation to dig out the attributes.

For example the Export-FimConfig used like this produces the output below.

Export-FIMConfig -CustomConfig "/Person[AccountName='hi']"

Source : http://localhost:5725/ResourceManagementService
ResourceManagementObject : Microsoft.ResourceManagement.Automation.ObjectModel.ResourceManagementObject

Source : http://localhost:5725/ResourceManagementService
ResourceManagementObject : Microsoft.ResourceManagement.Automation.ObjectModel.ResourceManagementObject

Now using the fancy new function we get:

Export-FIMConfig -CustomConfig "/Person[AccountName='hi']" | Convert-FimExportToPSObject

ObjectID            : urn:uuid:caf0178b-b8c1-41b2-bd71-f2f48b1fdf3b
AccountName         : HI
CreatedTime         : 7/8/2011 6:02:50 AM
Creator             : urn:uuid:7fb2b853-24f0-4498-9534-4e10589723c4
DisplayName         : HoofHearted IceMelted
Domain              : africa
FirstName           : HoofHearted
IsRASEnabled        : True
JobTitle            : Sheller
LastName            : IceMelted
MailNickname        : HI
ObjectType          : Person

In addition, it is easier to get at the attributes for each object:

$person = Export-FIMConfig -CustomConfig "/Person[AccountName='hi']" |
    Convert-FimExportToPSObject
$person.DisplayName
$person.AccountName

HoofHearted IceMelted

HI

Finally, here is the function:

Function Convert-FimExportToPSObject
{
    Param
    (
        [parameter(Mandatory=$true, ValueFromPipeline = $true)]
        [Microsoft.ResourceManagement.Automation.ObjectModel.ExportObject]
        $ExportObject
    )
    Process
    {
        ### Start with an empty PSObject, then add one property per FIM attribute
        $psObject = New-Object PSObject
        $ExportObject.ResourceManagementObject.ResourceManagementAttributes | ForEach-Object {
            if ($_.Value -ne $null)
            {
                ### Single-valued attribute
                $value = $_.Value
            }
            elseif ($_.Values -ne $null)
            {
                ### Multi-valued attribute
                $value = $_.Values
            }
            else
            {
                $value = $null
            }
            $psObject | Add-Member -MemberType NoteProperty -Name $_.AttributeName -Value $value
        }
        Write-Output $psObject
    }
}

Thursday, June 23, 2011

New-WebServiceProxy and 401: Unauthorized

Feeling mildly dumb about this one: I was using New-WebServiceProxy recently and spent too much time troubleshooting the 401 errors it was throwing at me.

After fiddling with UAC and proxy settings I finally looked at the New-WebServiceProxy cmdlet help.  The handy ‘UseDefaultCredential’ switch of course allowed the cmdlet to use my security context, and got me access.  Duh.
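For reference, a minimal sketch (the URL is just a placeholder for whatever Windows-authenticated service you are calling):

### Without -UseDefaultCredential the proxy sends no credentials, so a
### Windows-authenticated endpoint answers with 401: Unauthorized
$proxy = New-WebServiceProxy -Uri 'http://myserver/MyService.asmx?WSDL' -UseDefaultCredential

### The proxy now calls the web service under the current user's security context
$proxy | Get-Member -MemberType Method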

Always a Bridesmaid…

For the second year in a row Edgile (the company tolerant enough to call me their employee) has achieved ‘Finalist’ status for the Microsoft Identity and Security Partner of the Year.

It has been a great ride working at Edgile and I’m very impressed with the teams we have assembled.  There’s always next year!

Tuesday, April 26, 2011

Learn PowerShell in a Month of Lunches

I was lucky enough to attend PowerShell training by Don Jones at the TEC conference.  Also I talked a lot about PowerShell in my FIM sessions at the conference.

If you are looking for a good resource to ramp up on PowerShell I suggest his latest book:

Learn Windows PowerShell in a Month of Lunches.

Reading this book will not only teach you how to use PowerShell, it will make you better at designing and deploying FIM.

Thursday, April 21, 2011

Holy Kittens!! PowerShell at TEC

TEC adopted a new track this year, PowerShell!  This was an exciting addition, for a number of reasons:

  1. It was driven by the PowerShell folks at Microsoft (Jeffrey Snover and several other PowerShell people from Microsoft).
  2. PowerShell draws a cult-like following of really freaking smart IT people
  3. Quest has some serious PowerShell chops, so hosting a PowerShell deep-dive is a perfect fit

I was lucky enough to attend the pre-conference training by Don Jones.  If you have time for ANY training, I say take PowerShell training.  It will make you better at what you do.

Wednesday, April 20, 2011

Logging in FIM Workflow Activities

The .NET Framework provides an awesome facility for logging.  Employing this logging facility in FIM Workflow Activities enables us to change the logging behaviour of our workflow DLLs by changing the FIM service config file (or some other way – it is really up to you at design time). 

By default this logging facility is configured in a .config file for the hosting process (in this case the FIM Service), but it is also possible to have the logging configured in the workflow itself.

Anyhow, I’ve employed this logging facility in the PowerShell WF Activity for FIM and I think it is a good example of how to do logging in a FIM workflow activity.  If you’re keen on logging, then please take a look and give me some feedback!

The XMA Grows Up

At TEC this week we learned about a new MA framework coming to a FIM server near you!

The nickname for the new framework is ‘EZMA’.  My first take on this nickname was that it was ironic, since the framework provides way more functionality than the XMA SDK.  This raises the bar for the development expertise required to develop an MA using the new framework (is it REALLY easier?).  Raising this bar is probably a good thing IMHO since it makes it less enticing for non-developers to produce MAs.

After going through the new SDK I realized it really is an EZMA, because the added functionality means I no longer need to do a lot of things/hacks in my MA code.  For example, the call-based import functionality relieves me of the need to transform my objects into a file.  I have a lot of DSML code that now gets to retire!  Another example: on import I can now provide an update (partial object) instead of having to supply the complete attribute set.  This means that a lot of code in the OpenLDAP XMA can also retire, since it has to chase changeLog entries into the actual directory entry to get the full attribute set.

Brian Desmond did an awesome job explaining this in his session today.  He’s done the early adopter work of building some MAs using the new framework and was able to tell us about it at the conference today.

Reporting for FIM (and EVERYTHING else using SSRS and PowerShell)

Just got home from TEC 2011 in Las Vegas, and whoa what a great time!  There were lots of FIM gurus there, but new this year was a track dedicated to PowerShell.  I’m pretty excited about PowerShell, so the additional track created a LOT of conflicts for me.  My ideal TEC would be a couple days longer so I could see ALL the awesome talks, but I’m not sure my liver or wallet could handle it.

REPORTING!!!!

This is a fun feature gap in FIM to address because it helps us realize that we’re really just DBAs ;-)

My first approach to this challenge was to extend SSRS using a Data Processing Extension (DPE) to query the FIM web service.  We have customers using this successfully in production today, but I wanted to add functionality to it, including enabling reporting against FIM Sync.  Turns out PowerShell is REALLY good at getting objects, allowing you to flatten them down into something consumable by a report.  After some prototyping I was able to get a DPE working with PowerShell, so any PowerShell pipeline can populate an SSRS dataset.

I’ve posted this prototype onto CodePlex, and am using it for some reporting challenges (including FIM).  Please feel free to download it, try it out, and provide some feedback!

http://psdpe.codeplex.com

Tuesday, April 05, 2011

Coming to TEC? You have to see my Reporting Session!

It’s obvious you have to go to TEC this year; each year the conference gets better, and this year is huge with the addition of the PowerShell track.

Reporting is a rather dull topic to cover but I think I’ve got an angle that is really freaking exciting.

The original sin was FIM shipping without a reporting feature.  In response to this I put a solution together to tie SQL Server Reporting Services (SSRS) to FIM via the FIM web service.  Basically it would allow SSRS to get data directly from the FIM web service without requiring any intermediary database staging.  This was all good but I’ve been looking to solve some of the problems with that approach and my solution is the topic of my reporting session at TEC on Wednesday morning.

To hint at what I’ll be presenting; it has more to do with PowerShell than FIM, but I use it all over the place to do FIM reporting.  The SSRS component and some of the reports themselves will be posted to CodePlex this month, mostly because I anticipate working on them feverishly until Wednesday morning in Vegas ;-)

Goodbye [reflection.Assembly]::LoadFrom, Hello Add-Type!

One of the things I love about PowerShell is that almost everyday I learn something new that makes me better at what I do.  Not just better at PowerShell, but better at getting my stuff done (testing, deploying, automating, integrating, etc).  It’s like that excitement of finding $5 on the sidewalk.  You’re both happy to be a little richer but also harbour that sneaky feeling that you’ve stolen something, in this case you’ve stolen time, and maybe money too.

Anyhow, my $5 bill today is the discovery of Add-Type.

For too long I’ve been using [reflection.Assembly]::LoadFrom to load assemblies, which is analogous to adding references to a C# project.  It is a bit tedious and I don’t feel like I’ve ever mastered it, but now I have found Add-Type to be way more user friendly and dependable.

### Before
[reflection.Assembly]::LoadFrom('C:\Windows\….<path shortened>\System.Security.dll')

### After
Add-Type -AssemblyName System.Security

Tuesday, March 22, 2011

Pipelining in the Export-FIMConfig Cmdlet

It’s no secret I’m a big fan of PowerShell so when FIM ships with some PowerShell cmdlets you know I’m gonna be all over them.  Something kinda surprised me today while toiling around with the Export-FIMConfig cmdlet; it blocks the pipeline.  Cmdlets can return objects to the pipeline immediately, which can be a big performance gain for your script.

I was considering using this cmdlet for the FIM SSRS Data Processing Extension (DPE) because it would buy me all the query functionality I needed in a nice supportable client from Microsoft.  Before even trying this in the DPE I’ve ruled it out because waiting for all the query results would make for a bad interactive report experience – the user would have to wait for all the results before seeing any report results; boo.  Also I don’t know of a way to use the cmdlet to return a subset of attributes which is also important for performance on large reports.

Probably an easy problem to fix.

Also, not really a dead end.  The FIM cmdlets contributed by Quest do not have the same problem.  By looking at the source for their Get-FIMResource cmdlet you can see they’ve implemented the ProcessRecord() method.
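The same idea is easy to see in script; this is not the Quest source, just a sketch of the difference that streaming (ProcessRecord/WriteObject per item) makes versus collecting everything before returning:

### Emits each item as soon as it is produced (the script analog of WriteObject in ProcessRecord)
function Get-StreamingResult
{
    1..5 | ForEach-Object { Start-Sleep -Seconds 1; $_ }
}

### Collects everything into an array first, then emits it all at the end (blocks the pipeline)
function Get-BlockingResult
{
    $results = @()
    foreach ($i in 1..5) { Start-Sleep -Seconds 1; $results += $i }
    $results
}

### The downstream command sees its first object after about one second here...
Get-StreamingResult | Select-Object -First 1

### ...but has to wait the full five seconds here
Get-BlockingResult | Select-Object -First 1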

Wednesday, February 16, 2011

SPNs Broke My WinRM

Preparing for a demo of FIM Service to FIM CM integration, I planned on using PowerShell and WinRM as the stars of the show.  FIM CM still relies on .NET Remoting (hasn’t changed since pre-CLM days, just ask Brian…).  .NET Remoting is easy enough, but I was keen on using WinRM because:

  • we get it for free on Windows 2008
  • PowerShell is much easier to troubleshoot than compiled .NET code
  • WinRM requires no client side proxies or DLLs

This worked GREAT on a one-box environment, then I stretched it out a little.

In my demo I have FIM Service and FIM Certificate Management running on separate servers in the same forest.  Both FIM CM and WinRM use HTTP as their transport.  Both FIM CM and WinRM enjoy the use of Kerberos to protect said transport.  Only ONE of them can have an SPN identifying the computer account and the transport.

For the life of me I can’t figure out how to trick WinRM into using a different SPN.  The only way I have been able to un-break WinRM is to break FIM CM by deleting its SPNs.
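For anyone poking at the same problem, setspn.exe is the tool for seeing (and, in a lab, removing) the conflicting registration; the account and host names below are placeholders:

### List the SPNs registered to the FIM CM service account (placeholder names)
setspn -L CONTOSO\svcFimCm

### Check which account currently owns the HTTP SPN for the host
setspn -Q HTTP/fimcm01.contoso.com

### Deleting the FIM CM HTTP SPN "un-breaks" WinRM, but breaks Kerberos for FIM CM
setspn -D HTTP/fimcm01.contoso.com CONTOSO\svcFimCm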

The same issue is described in this post.  It is also nicely described here.

So at the end of the day I’ve had to work around this heartbreak by using .NET Remoting instead of WinRM.  At least it won’t be much work to switch back to WinRM once I figure out a workaround.  I’ve structured my PowerShell module to hide the .NET Remoting crap in a neat little function.

Wish me luck on the demo!  I’ll be posting this stuff to CodePlex this week, win or lose :-|

Friday, February 04, 2011

FIM Sync Engine–More PowerShell Added

Back in November I posted about the ILM Sync Engine PowerShell cmdlets.

Turns out they were added to FIM in a hotfix last year (I’d assumed they were in RTM).

From the KB Article

Features in Sync Engine
Feature 1
A limited set of PowerShell cmdlets are added to allow you to perform some limited editing of the Sync Service configuration.
For more information about these PowerShell cmdlets, visit the following Microsoft Website:

General information about PowerShell cmdlets that let you edit the Sync Service configuration 

So if you are running FIM Sync with a build of at least 4.0.3547.2 then all you need to do is add the snap-in and you’re off to the PowerShell races.  No need to download the MSI!
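A minimal sketch of that (the snap-in name below is an assumption — use whatever shows up as registered on your sync server):

### See what is registered on the FIM Sync box, then load the sync engine snap-in
Get-PSSnapin -Registered

### The snap-in name is an assumption - use whatever the previous command reports
Add-PSSnapin MIIS.MA.Config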

How Delta Sync and Full Sync Compare During Provisioning

A while back I posted on an obscure bug that the vast majority of customers will never see due to the scale required for it to repro: Reference Depth.

Turns out this taught me a lesson about a key difference between Full Synchronization and Delta Synchronization.

The scenario is this: you ran a sync and have objects pending export in a connector space.  Those objects are pending because you have not exported them to the target system yet, they’re just sitting there waiting to be exported.

If you were to Preview one of those objects from the source system again, using a Full Sync Preview you would see the following in the preview results:

“Auto-Deleted” followed by “Added”

So during a Full Sync of that object, the pending export is whacked, then replaced by a fresh new pending export.  This made total sense to me; something could have changed, our rules fired again, so the new object is created. 

Now if you were to Preview the same object using a Delta Sync Preview you would NOT see this behavior.  This is where I got confused, because I had a breakpoint in my rules code and watched as all the code was called, all the way down to CommitNewConnector().  But after watching that happen, there was no “Auto-Deleted” and no “Added”, there was no status reported for provisioning at all, like it never happened.

Turns out the rules extension IS called on both Delta and Full in this scenario, but on Delta the action is just thrown away silently by the server.  Well, it is only silent if you haven’t littered your code with trace logging, and if you’re not nosing around with a debugger.

The moral of the story here is that I’ve been working with this engine for a decade now and STILL learn new things about it – so one could argue the moral of the story is that I am not the sharpest tool in the shed, but most certainly a tool.

The real morals of the story are:

  • If you need a pending export deleted and then re-added, you do not get this for free on delta synchronizations, only on full.  So code accordingly.
  • Test, test, test.  I found this when I was writing test scripts against my MV code.

Had I not been scripting this test case, I probably would have finished the engagement and nobody would have ever noticed my bug.  As much as it sucked to admit the mistake, I did learn something new about the engine, but most importantly it showed me that better test coverage increases deployment quality (duh).  In the end my tests were smarter than me, providing me with both humility and relief. 

Saturday, January 29, 2011

Going to TechReady in February? Come See My FIM Sessions!

I’ve been lucky enough to snag a couple speaker slots at TechReady12.  If you’re attending then come throw ripe fruit!

  • Thursday, February 17th – PowerShell and Forefront Identity Manager 2010
  • Friday, February 18th – Forefront Identity Manager CM Integration with the Forefront Identity Manager Service

Both sessions will be littered with PowerShell rants, demos and samples: first as PowerShell pertains to FIM extensibility, testing and automation (Thursday), then as it pertains to the integration story between the three FIM engines (FIM Service, FIM Sync and FIM CM) on Friday.

PowerShell is a critical IT Pro skill (especially if you are in the habit of deploying FIM):

  • PowerShell is critical to productivity
  • PowerShell is critical to quality

We’re past the point where it is just a cool new technology, so come see how FIM is an example where PowerShell can be used to improve your productivity as well as the quality of your deployments.

Thursday, January 20, 2011

VirtualBox for FIM Development Environments

I prefer to run Win7 on my laptop, and also like to run VMs there too.  Unfortunately this is a pain because there is no desktop virtualization support for x64 guests (at least not that I know of).  This drove me to VMWare, then to VirtualBox.

VirtualBox works like a charm with one exception.  Any time I try to use the guest additions, all hell breaks loose.  It brings the host (my laptop) to its knees until I can manage to stop the offending VM.

Disabling the guest additions solves this issue, which I'm fine with since I'd rather use RDP to access the VMs anyhow.

IE9 Adventure - Promising but short

Got IE9 pushed onto my machine today and it was pretty cool!  Seems a lot quicker than IE8, as promised, but after trying every compatibility option I was unable to get it to work with the TechNet forums.

Uninstalling took just a minute, but it introduced an odd quirk whereby every new tab or window would auto-start the IE Developer Tools in a separate window (annoying!).

The trick was to go into the registry and manually set the 'Pinned' key to 1.  Doing this in the browser itself just wouldn't work.

Wednesday, January 12, 2011

Using the Compare-Object Cmdlet

Just a short post on this cmdlet: Compare-Object.

I've found it incredibly useful in a lot of my scripts where I'm diff'ing FIM configurations, but sometimes the results just didn't seem right, which made me lose confidence in the cmdlet until I found this post at PowerShell.com:
Tipps & Tricks Using Compare-Object

Before I read that post I was resorting to boiling my objects down to simple String arrays, which seemed to make the results better.  Thanks to that post I can now compare arrays of objects again, which is a lot simpler in scripts, and a LOT more powerful.

Just today I used it to compare the IAF and EAF rules on a FIM Sync box where each config had ~300 rules.  In seconds the Compare-Object cmdlet was able to tell me that only 30 rules were different between the two configs, and what the specific differences were.
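For the curious, that kind of comparison boils down to something like this (the variable and property names are illustrative, not from the actual script):

### Compare two arrays of flow rule objects on the properties that matter,
### instead of flattening them to strings first (names below are illustrative)
Compare-Object -ReferenceObject $devConfigRules -DifferenceObject $prodConfigRules -Property MADisplayName, AttributeName, FlowType |
    Sort-Object MADisplayName, AttributeName |
    Format-Table MADisplayName, AttributeName, FlowType, SideIndicator -AutoSize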

COOL!

Friday, January 07, 2011

Use PowerShell to Create FIM ActivityInformationConfiguration Objects

FIM 2010 can be extended with custom WF activities. Try it out using the TechNet guide:
How to: Create a Custom Logging Activity and Deploy it to the FIM Portal

One of the challenges with creating them (for me at least) is getting the details of the Activity Information Configuration object correct.

The script below takes the details directly from the WF DLL that you've built. Just point it to the DLL file and it will figure out the details then create the object in FIM for you.
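The general idea is something like this (a sketch only — the DLL path, the base-type check, the Create state and the attribute names are assumptions that should be verified against the TechNet guide; New-FimImportObject and New-FimImportChange are the helpers used elsewhere on this blog):

### Sketch: read the activity details out of the compiled WF DLL, then create the
### ActivityInformationConfiguration object in FIM.
### The DLL path, base-type check and attribute names are assumptions - verify them
### against the TechNet guide for your activity.
$assembly     = [Reflection.Assembly]::LoadFrom('C:\MyActivities\MyLoggingActivity.dll')
$activityType = $assembly.GetTypes() |
    Where-Object { $_.IsPublic -and $_.BaseType.Name -eq 'SequenceActivity' } |
    Select-Object -First 1

$aic = New-FimImportObject -ObjectType ActivityInformationConfiguration -State Create -Changes @(
    New-FimImportChange -Operation Replace -AttributeName DisplayName      -AttributeValue $activityType.Name
    New-FimImportChange -Operation Replace -AttributeName ActivityName     -AttributeValue $activityType.FullName
    New-FimImportChange -Operation Replace -AttributeName AssemblyName     -AttributeValue $assembly.FullName
    New-FimImportChange -Operation Replace -AttributeName IsActionActivity -AttributeValue 'True'
)
$aic | Import-FIMConfig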

Trying something new here, cross posting to TechNet Wiki

Thursday, January 06, 2011

MVP'd for 2011!

Thanks to Microsoft for renewing my MVP status for 2011!! It's an honour to be recognized for being a geek, and I have a blast saying MVP-ness ;-)

The FIM MVPs are a small bunch of tenacious contributors I am happy to be part of, and am really looking forward to the MVP summit in February.