Friday, August 20, 2010

Sync Performance Counters

I’m a huge fan of Perfmon counters, and it has always irked me that they sometimes fail.  This is the error I sometimes get when ILM starts:

The server encountered an unexpected error creating performance counters for management agent "HoofHearted".
Performance counters will not be available for this management agent.

This command line tries to install them again, which I have had good luck with recently:

lodctr.exe "C:\Program Files\Microsoft Identity Integration Server\Bin\mmsperf.ini"

It works on my lab boxes, but I haven’t been able to get a good repro of the failure, so unfortunately I can’t say whether this will work every time.
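
A quick way to check whether the counters made it in is to ask .NET whether the performance counter category exists.  This is just a sketch; the category name "MIIS Service" is an assumption based on a default ILM install, so verify it against Perfmon on your box:

```powershell
# Returns True if the sync engine's performance counter category is registered.
# "MIIS Service" is an assumed category name - check Perfmon for the real one.
[System.Diagnostics.PerformanceCounterCategory]::Exists("MIIS Service")
```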

I Sync I Can, I Sync I Can

Working on a large-ish sync deployment lately and getting to use some pretty beefy hardware:

A few ProLiant DL580 boxes with 128GB RAM and a RAID Controller full of disks.

Been running some fun sync topologies, including:

  • SQL on the host, Sync in a Guest VM on the same box
  • Sync and SQL in 1 VM with 60GB RAM
  • Sync and SQL both on the same host

So far I’ve only been able to prove that I can’t repro the crash that some people have seen while running sync in VMs.  Hopefully I’ll have time to share some performance stats when I’m done.

Thursday, August 19, 2010

Sync as an Appliance

It’s no secret that I’m a huge fan of PowerShell.  The story behind PowerShell is a bit of a Cinderella story, with Jeffrey Snover defying the odds to create something incredible not only for Microsoft, but for the IT Pro users of its software.

Anyhow… I think the huge success of PowerShell isn’t the feature set itself but the huge list of teams inside Microsoft using PowerShell to do cool things with their products.  Exchange is an obvious example.  For these teams there are enticing reasons to adopt PowerShell, and for us IT Pros there are huge benefits when more teams ship cmdlets, thanks to the composable nature of PowerShell.

FIM should probably also be measured by the success of its internal partners.  I’m obviously a fan of FIM, but I really feel that the bar was raised by PowerShell, and I hope to see better FIM-based solutions in the future.

Unfortunately it is easy to find customers unhappy with a FIM solution.  This one is an interesting experience where the blog author has a funny take on the internal partnering experience between SharePoint and FIM:

More User Profile Sync issues in SP2010: Certificate Provisioning Fun

I think the re-org announced at TEC in LA this year is a good step in that direction and am pretty curious to see what shape the product takes as a result.

Hiding Tabs in FIM 2010 RCDCs

Usually I’m fairly critical of the limited flexibility of RCDCs, but they do present some fun challenges if you decide to employ them in your deployment.

In the FIM Portal you can configure the RCDCs to add or remove tabs (called ‘Groupings’ in RCDC). Here is the snippet from the RCDC XML Reference explaining this functionality:

Visible: You can hide an RCDC page tab or its heading by setting this attribute to false. By default, this optional, Boolean-type attribute is set to true. This attribute is functional only on a Content Grouping.

Here is an RCDC XML snippet showing an example of a grouping:

<!--Sample for a Content Grouping-->
<my:Grouping my:Name="ContentGroupingSample" my:Caption="Sample Content Grouping" my:Description="Some description for content grouping" my:Visible="true">
  <my:Control my:Name="DisplayName" my:TypeName="UocTextBox" my:Caption="Display name" my:Description="This is the display name of the set.">
    <my:Properties>
      <my:Property my:Name="Required" my:Value="True"/>
      <my:Property my:Name="MaxLength" my:Value="128"/>
      <my:Property my:Name="Text" my:Value="{Binding Source=object, Path=DisplayName, Mode=TwoWay}"/>
    </my:Properties>
  </my:Control>
</my:Grouping>
<!--End of Content Grouping Sample-->


Hiding this tab is as easy as setting Visible="false", but what if you want to do it dynamically?

One way to do this is to bind the attribute to a Boolean attribute on the object instead of specifying ‘true’ or ‘false’.  This can be an attribute you don’t even show to the user; just treat it as a system attribute that controls how the object is rendered in the RCDC.

Here’s an example:

my:Visible="{Binding Source=object, Path=ShowContentTabA, Mode=TwoWay}"

Now when the RCDC loads it will look at the ‘ShowContentTabA’ attribute to decide if the tab should be displayed or not.
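
For completeness, here is roughly what the Grouping element looks like with the binding in place.  This is a sketch, assuming you have created a Boolean attribute named ‘ShowContentTabA’ on the object type:

```xml
<!--Content Grouping with a dynamically bound Visible attribute-->
<my:Grouping my:Name="ContentGroupingSample" my:Caption="Sample Content Grouping" my:Visible="{Binding Source=object, Path=ShowContentTabA, Mode=TwoWay}">
  <!--Controls for this tab go here-->
</my:Grouping>
```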

Monday, August 16, 2010

Unexpected-Error (reference depth has reached exceeded maximum of 100)

Chances are you will never run into this, but it’s Monday so I did.

Error

Unexpected-Error per object on a Full Sync.  Details of the error are:

The operation failed because the number of reference depth has reached
exceeded maximum of 100.

Repro Scenario

Running several management agents to populate one CS.  All MAs are importing, and only one is exporting.  Running the first MA is fine; running successive MAs eventually produces this error.  Again, this is a large server (>3M objects) with lots and lots of references – not a reflection of the average scale, so I wouldn’t expect many repros of this.

Workaround

I suspect this is caused by making the sync engine disconnect then reprovision the objects in the CS.  Instead of running Full Sync on all of the ‘import MAs’, I run Full Sync on the biggest one, then do an Export and Confirming Import on the ‘export MA’.  This allows the other MAs to get through Full Sync without the provisioning-disconnects.
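
The run order from the workaround can be scripted against the sync engine’s WMI provider.  A sketch; the MA names and run profile names are placeholders for your environment:

```powershell
# Run Full Sync only on the biggest import MA, then Export and
# Confirming Import on the export MA, so the other MAs avoid the
# provisioning-disconnect churn.
$ns = "root\MicrosoftIdentityIntegrationServer"

$importMa = Get-WmiObject -Namespace $ns -Class MIIS_ManagementAgent -Filter "Name='BigImportMA'"
$importMa.Execute("Full Synchronization")

$exportMa = Get-WmiObject -Namespace $ns -Class MIIS_ManagementAgent -Filter "Name='ExportMA'"
$exportMa.Execute("Export")
$exportMa.Execute("Confirming Import")
```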

Moral of the Story

Arrange your run profiles to avoid large numbers of Provisioning-Disconnects.

FIM Sync References vs. PowerShell Identity Attributes

Just thinking out loud here, but reference attributes in the Sync engine have long been one of my favourite features.  Unfortunately with great power comes some pretty long sync cycles on larger scale systems (this is a non-issue for most deployments).

Reference attributes are powerful because they abstract not only the schema from a connected system, but also the relationships, such as group memberships and reporting structures.  The sync engine makes it pretty easy to flow these relationships across systems without a single line of code.  This used to be quite a bit of zScript.

How much complexity is enough to warrant a Sync Engine deployment?

If you’re only synchronizing two systems, I might argue that the Sync Engine is overkill for your solution.  If, however, you had lots of tricky transformations to handle, including reference attributes, then I might fall back to my usual tool of choice (the sync engine).  But what if it was easy to do reference translation in a tool other than the sync engine?  Well, there isn’t a great answer here yet, but the point of this post is to think about alternatives.

PowerShell Identity Attributes

The Exchange team has cmdlets that take ‘Identity’ parameters.  These parameters can be a pain if you’re writing an XMA that expects exports to be re-imported, but once I forgave that I started to wonder if this was not a pain in the ass, but maybe a really cool feature.  The sync engine will figure out the references for you, and export them to the connected system in a format it understands.  Identity parameters in the Exchange PowerShell cmdlets act more like late-binding whereby they take an input and try to match it to the referenced object whenever you run the cmdlet.  Pretty cool, but of course this isn’t common across PowerShell cmdlets.  AFAIK it is specific to Exchange so the usefulness is quite limited, but the potential is there (yes, I’m an optimist).
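
To make the late-binding behaviour concrete, here is what the Identity parameter accepts on a real Exchange cmdlet.  The values are placeholders; Exchange resolves whichever form you give it to the referenced object at run time:

```powershell
# All three calls resolve to the same mailbox - the reference is
# matched when the cmdlet runs, not when the value is exported.
Get-Mailbox -Identity "HoofHearted"                                # alias or display name
Get-Mailbox -Identity "hoofhearted@contoso.com"                    # SMTP address
Get-Mailbox -Identity "CN=HoofHearted,CN=Users,DC=contoso,DC=com"  # distinguished name
```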

Does this even matter today?

The answer in MOST cases will be no.  Craig is simply dreaming of alternatives and speaking in the third person again.  However, if I were tasked with synchronizing anything to Exchange and Exchange only, I might consider this approach.  As soon as you have to export to ANYTHING else, this approach loses applicability FAST.

The sync engine is still a solid, solid engine for integrating identity repositories but that doesn’t mean it isn’t fun to entertain other approaches.  If your only tool is a hammer…

Wednesday, August 11, 2010

RCDC Troubleshooting

The FIM Portal can be extended by modifying XML files called RCDCs. There are a host of controls you can add to the FIM Portal pages, but they have to be hand-crafted by modifying the XML and then uploading it back to the FIM Service. There are decent resources on MSDN that show how to work with RCDCs.

FIM will let you upload XML files with errors; in return, it will treat you to errors in the FIM Portal when you try to open a resource that uses that RCDC.

The FIM team provides an XSD to help troubleshoot this, and there are a number of ways to apply that XSD. Here is how to do it with Visual Studio.

  1. Save your RCDC XML to a file, then open it in Visual Studio
  2. Copy the XSD from the FIM MSDN site (see Appendix A: Default XSD Schema), save it to a file, then attach it in Visual Studio:
    1. From the XML menu, select ‘Schemas’
    2. Find your saved XSD
    3. Set the ‘Use’ column to ‘Use this Schema’
  3. Bingo!

Now that Visual Studio has the XML file open and the RCDC XSD attached, it will flag any issues with your RCDC in the Error List.

BTW – I really wanted this to be a cool PowerShell sample.  There is a cmdlet available to do just this but unfortunately the list of instructions was longer.  If you’re interested the Test-XML cmdlet can be downloaded as part of the PowerShell Community Extensions project on CodePlex.
It would look like this:

Test-Xml RCDC.xml -SchemaPath .\RCDC.xsd

This returns true or false, indicating whether RCDC.xml is well-formed and conforms to the schema defined in RCDC.xsd.

Friday, August 06, 2010

Gearing Up for TEC Europe!

I’m honoured to be speaking at TEC Europe this October.  We’ll see if I can manage to reduce my rate of speech so that somebody besides my wife can decipher what I am trying to say.  Interesting to note that we’re expecting a new baby shortly after TEC so I’ll be hurrying home ;-)

Topics I’ll be covering are:

Automating FIM Deployments with Microsoft PowerShell

In a FIM deployment of any size, administrators will want to automate the management and maintenance of their servers and configuration as much as possible. Come to this interactive and demo-filled session to see real-world examples of PowerShell automation and scripts that you can use to improve your FIM maintenance experience. Whether you are new to the PowerShell “game” or a seasoned pro, you will find tips, tricks, and advice that you can start using right away within your environment.

Developer Tools for the IT Pro

You’ve sat through presentations telling you how to diagnose a failing project, but how do you revive it?  Turns out as an industry we are very bad at what we do, so sit in and hear tips for successfully deploying FIM projects.  This session balances stodgy methodology coverage with interesting tools and techniques for deployment and test automation.

Running ILM Rules Extensions in FIM 2010

The FIM team did some work to make ILM to FIM upgrades easy.  You can actually copy your ILM rules extension DLLs from an ILM box running Windows x86 to a FIM box running Windows x64 and run the same DLL without recompiling.

This works really well for both Rules Extensions, and my favourite, ECMA Extensions.

I found a slight twist on this lately where the error message was less than helpful.

A C# Project referenced by a Rules Extension project was configured with a Platform Target of x64.  In Visual Studio the project options for Target Platform are:

  • x86
  • x64
  • Any CPU (the default)

Anyhow, when ILM tried to load the rules extension it failed with:

System.IO.FileNotFoundException: Could not load file or assembly 'HoofHearted, Version=1.2.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified.

Turning on Fusion logging revealed that it was looking for a file that clearly existed!  The file DID exist, but it was compiled with the wrong target, so ILM could not load it.  While FIM would have loaded it just fine, ILM is still only able to load x86 assemblies (which is quite fair).

So the moral of the story is: leave the ‘Target Platform’ as the default (Any CPU) unless you have a good reason to do otherwise.  And if you do specify x64 then keep in mind that you cannot move that DLL back to ILM without re-compiling.
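
Before copying a DLL between ILM and FIM, you can check what it was compiled for without cracking open Visual Studio.  A sketch; the path and DLL name are placeholders:

```powershell
# MSIL means Any CPU (safe on both ILM x86 and FIM x64);
# X86 and Amd64 tie the DLL to one platform.
$assembly = [System.Reflection.AssemblyName]::GetAssemblyName("C:\Temp\HoofHearted.dll")
$assembly.ProcessorArchitecture
```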

BTW – this has very interesting ramifications for teams that ship ILM DLLs but don’t officially support FIM, but you didn’t hear it here ;-)

Thursday, August 05, 2010

Using the FIM Cmdlets to Copy an Object

The FIM Configuration Migration Cmdlets are great for migrating FIM Service configuration between environments.

What if you only want to copy an object or two?  You shouldn’t have to go through the whole song and dance of Export1, Export2, Join, Compare, Import.

One way to do this is to use the Quest FIM Cmdlets that are based on the FIM Web Service.  They are more general purpose, and most importantly the output of the Get- Cmdlet will work as input to the Set- Cmdlets.

Can’t or won’t have the Quest FIM Cmdlets installed? You can do the same thing (mostly) with the FIM Configuration Migration Cmdlets.  The trick is instead of comparing two FIM objects with Join-FIMConfig, just compare to $Null.  It looks something like this:

1. $exportedObject = Export-FIMConfig -CustomConfig "/Person='HoofHearted'"

2. $matches = Join-FIMConfig -Source $exportedObject -Target $Null -DefaultJoin "DisplayName"

3. $objectToImport = $matches | Compare-FimConfig

4. Import-FimConfig $objectToImport

Feed Memory to ILM by Enabling PAE and AWE

In the days when memory was expensive and limited the best way to get ILM to perform was to get really fast storage. 

Now we have servers with incredible amounts of memory (at least I do lately, yee-haw!) so getting a huge performance boost can be as easy as enabling PAE in Windows, and AWE in SQL.

This only pertains to ILM, since it runs only on x86 and needs these things to use all that memory.  In FIM we get this for free since it only runs on x64.
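
If you do find yourself tuning an x86 SQL box for ILM, enabling AWE boils down to a couple of standard sp_configure calls (plus the /PAE boot switch in Windows, and granting the SQL service account the ‘Lock Pages in Memory’ privilege).  A sketch using sqlcmd against the local default instance:

```powershell
# Enable AWE so SQL can address memory above the x86 4GB limit.
sqlcmd -S . -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE;"
sqlcmd -S . -Q "EXEC sp_configure 'awe enabled', 1; RECONFIGURE;"
```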

Tuesday, August 03, 2010

Sync and Merge

Doing a lot of Synchronization consolidation lately. 

It makes me really appreciate the FIM Configuration Migration PowerShell Cmdlets, which do the following:

  1. Import (Source)
  2. Import (Target)
  3. Diff (Join/Match)
  4. Merge (Export)

The FIM Configuration Migration Cmdlets are a lot more work than what we have in FIM Sync, which boils down to:

  1. Export
  2. Import

The problem is that there is a lot less functionality in the FIM Sync configuration tools (basically just Import/Export).  To fill the gaps I’ve had to do a lot of PowerShell scripting to perform the Sync configuration diff/merge work.  It has been really interesting, and the most challenging part has revolved around the use of the PowerShell Compare-Object cmdlet. 

The PowerShell Compare-Object cmdlet is REALLY useful but doesn’t treat all object types equally.  When simply comparing string arrays it works really well, but venture into other object types such as XML nodes and the results haven’t been predictable for me.  I’ve resorted to comparing String arrays, then using Select-XML to get the things I’m trying to merge.  A couple more steps, but the scripts work a lot better.
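
The string-arrays-then-Select-Xml approach looks roughly like this.  The file names and the XPath are placeholders for whatever piece of the sync configuration you’re merging:

```powershell
# Compare the two exported config files as plain string arrays -
# Compare-Object behaves predictably here.
$source = Get-Content .\SourceMA.xml
$target = Get-Content .\TargetMA.xml
$diff = Compare-Object -ReferenceObject $source -DifferenceObject $target

# Lines that exist only in the source configuration
$diff | Where-Object { $_.SideIndicator -eq '<=' } |
    ForEach-Object { $_.InputObject }

# Then go back to the real XML with Select-Xml to grab the nodes to merge
Select-Xml -Path .\SourceMA.xml -XPath "//attribute-flow" |
    ForEach-Object { $_.Node }
```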

The moral of the story is: at first the FIM Configuration Migration Cmdlets seem to be a pain but the extra functionality will go a LONG way, especially when they are able to reach deeper into the FIM Sync Service.