Tuesday, April 26, 2011

Learn PowerShell in a Month of Lunches

I was lucky enough to attend PowerShell training by Don Jones at the TEC conference.  I also talked a lot about PowerShell in my FIM sessions at the conference.

If you are looking for a good resource to ramp up on PowerShell I suggest his latest book:

Learn Windows PowerShell in a Month of Lunches.

Reading this book will not only teach you how to use PowerShell, it will make you better at designing and deploying FIM.

Thursday, April 21, 2011

Holy Kittens!! PowerShell at TEC

TEC adopted a new track this year, PowerShell!  This was an exciting addition, for a number of reasons:

  1. It was driven by the PowerShell folks at Microsoft (Jeffrey Snover and several others from the PowerShell team).
  2. PowerShell draws a cult-like following of really freaking smart IT people.
  3. Quest has some serious PowerShell chops, so hosting a PowerShell deep-dive is a perfect fit.

I was lucky enough to attend the pre-conference training by Don Jones.  If you have time for ANY training, I say take PowerShell training.  It will make you better at what you do.

Wednesday, April 20, 2011

Logging in FIM Workflow Activities

The .NET Framework provides an awesome facility for logging.  Employing this logging facility in FIM Workflow Activities enables us to change the logging behaviour of our workflow DLLs by changing the FIM service config file (or some other way – it is really up to you at design time). 

By default this logging facility is configured in a .config file for the hosting process (in this case the FIM service), but it is also possible to configure the logging in the workflow itself.

Anyhow, I’ve employed this logging facility in the PowerShell WF Activity for FIM and I think it is a good example of how to do logging in a FIM workflow activity.  If you’re keen on logging, then please take a look and give me some feedback!
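To make the idea concrete, here is a sketch of what such a trace source entry could look like in the FIM service .config file (the source name and file path are hypothetical; the source name just has to match the TraceSource name used in your activity code):

```xml
<!-- Hypothetical fragment for Microsoft.ResourceManagement.Service.exe.config -->
<system.diagnostics>
  <sources>
    <!-- Name must match the TraceSource created in the workflow activity -->
    <source name="MyWorkflowActivity" switchValue="Verbose">
      <listeners>
        <add name="file"
             type="System.Diagnostics.TextWriterTraceListener"
             initializeData="C:\Logs\MyWorkflowActivity.log" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
```

Flipping switchValue between Off, Error, Warning, Information, and Verbose changes what the activity logs, with no recompile and no code change.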

The XMA Grows Up

At TEC this week we learned about a new MA framework coming to a FIM server near you!

The nickname for the new framework is ‘EZMA’.  My first take on this nickname was that it was ironic, since the framework provides way more functionality than the XMA SDK, which actually raises the bar for the development expertise required to develop an MA (is it REALLY easier?).  Raising that bar is a good thing IMHO, since it makes MA development less enticing for non-developers.

After going through the new SDK I realized it really is an EZMA, because the added functionality means I no longer need a lot of the things/hacks in my MA code.  For example, the call-based import functionality relieves me of the need to transform my objects into a file; I have a lot of DSML code that now gets to retire!  As another example, on import I can now provide an update (a partial object) instead of having to supply the complete attribute set.  This means a lot of code in the OpenLDAP XMA can also retire, since it had to chase changelog entries into the actual directory entry to get the full attribute set.

Brian Desmond did an awesome job explaining this in his session today.  He’s done the early adopter work of building some MAs using the new framework and was able to tell us about it at the conference today.

Reporting for FIM (and EVERYTHING else using SSRS and PowerShell)

Just got home from TEC 2011 in Las Vegas, and whoa what a great time!  There were lots of FIM gurus there, but new this year was a track dedicated to PowerShell.  I’m pretty excited about PowerShell, so the additional track created a LOT of conflicts for me.  My ideal TEC would be a couple days longer so I could see ALL the awesome talks, but I’m not sure my liver or wallet could handle it.


Reporting is a fun feature gap in FIM to address because it helps us realize that we’re really just DBAs ;-)

My first approach to this challenge was to extend SSRS using a Data Processing Extension (DPE) to query the FIM web service.  We have customers using this successfully in production today, but I wanted to add functionality to it, including enabling reporting against FIM Sync.  Turns out PowerShell is REALLY good at getting objects and flattening them down into something consumable by a report.  After some prototyping I was able to get a DPE working with PowerShell, so any PowerShell pipeline can populate an SSRS dataset.
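To give a feel for the idea (the DPE itself is the real integration point; this pipeline is just an illustrative sketch using a built-in cmdlet), any pipeline that emits objects with uniform property names could feed a report dataset:

```powershell
# Illustrative sketch: flatten objects into report-friendly rows.
# Select-Object trims each object down to a flat set of named columns,
# which is exactly the shape an SSRS dataset expects.
Get-EventLog -LogName Application -Newest 50 |
    Select-Object -Property TimeGenerated, Source, EntryType |
    Sort-Object -Property TimeGenerated
```

Swap the first cmdlet for anything that returns objects (FIM web service queries, FIM Sync run history, WMI) and the report side doesn’t change.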

I’ve posted this prototype to CodePlex, and am using it for some reporting challenges (including FIM).  Please feel free to download it, try it out, and provide some feedback!


Tuesday, April 05, 2011

Coming to TEC? You have to see my Reporting Session!

It’s obvious you have to go to TEC this year; each year the conference gets better, and this year is huge with the addition of the PowerShell track.

Reporting is a rather dull topic to cover but I think I’ve got an angle that is really freaking exciting.

The original sin was FIM shipping without a reporting feature.  In response to this I put a solution together to tie SQL Server Reporting Services (SSRS) to FIM via the FIM web service.  Basically it would allow SSRS to get data directly from the FIM web service without requiring any intermediary database staging.  This was all good but I’ve been looking to solve some of the problems with that approach and my solution is the topic of my reporting session at TEC on Wednesday morning.

To hint at what I’ll be presenting: it has more to do with PowerShell than FIM, but I use it all over the place to do FIM reporting.  The SSRS component and some of the reports themselves will be posted to CodePlex this month, mostly because I anticipate working on them feverishly until Wednesday morning in Vegas ;-)

Goodbye [reflection.Assembly]::LoadFrom, Hello Add-Type!

One of the things I love about PowerShell is that almost every day I learn something new that makes me better at what I do.  Not just better at PowerShell, but better at getting my stuff done (testing, deploying, automating, integrating, etc.).  It’s like that excitement of finding $5 on the sidewalk.  You’re happy to be a little richer, but you also harbour that sneaky feeling that you’ve stolen something; in this case you’ve stolen time, and maybe money too.

Anyhow, my $5 bill today is the discovery of Add-Type.

For too long I’ve been using [reflection.Assembly]::LoadFrom to load assemblies, which is analogous to adding references to a C# project.  It is a bit tedious and I don’t feel like I’ve ever mastered it, but I’ve now found Add-Type to be way more user friendly and dependable.


### Before
[reflection.Assembly]::LoadFrom('C:\Windows\….<path shortened>\System.Security.dll')

### After
Add-Type -AssemblyName System.Security
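Either way, once the assembly is loaded its types are available in the session.  For example (just a quick sketch), after loading System.Security you can call its DPAPI wrapper directly:

```powershell
# Load the assembly by name -- no path hunting required
Add-Type -AssemblyName System.Security

# Protect a byte array with DPAPI, scoped to the current user
$bytes     = [System.Text.Encoding]::UTF8.GetBytes('secret')
$protected = [System.Security.Cryptography.ProtectedData]::Protect(
                 $bytes, $null,
                 [System.Security.Cryptography.DataProtectionScope]::CurrentUser)
```

Add-Type also accepts -Path for assemblies that aren’t in the GAC, and -TypeDefinition for compiling C# source on the fly.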