.\Matthew Long

{An unsorted collection of thoughts}

Designing Operations Manager management pack Discovery models to best support derived classes.

Posted by Matthew on March 6, 2012

Inspired by a previous post and some of the recent activities of friends and colleagues, I thought I’d share some recommendations on an important consideration to keep in mind when designing the discovery model of your System Center Operations Manager 2007/2012 management pack.  This post should help you author your MPs in a way that is much friendlier to future expansion and customisation, whether by you or your customers.

Class Inheritance

As you may or may not be aware, in Operations Manager objects (known as classes) can be “extended” in the GUI to support additional custom attributes that may be useful to your organisation.  You may also have noticed that many management packs implement a common object for a technology (such as an IIS website or a SQL database) that is then further specialised into every version you might encounter (IIS 6, IIS 7, etc.).  You can construct a view or group that displays a version-specific object (IIS2008Website), or any object based on that common ancestor (IISWebsite), which is then version-agnostic and displays all IIS websites regardless of version.

This all happens because of class inheritance.  Every class in Operations Manager has a single base class from which it stems, going all the way up to the common ancestor of all classes, System.Entity.  Every class inherits all class properties from its parent class (and any properties of its parents’ base classes, known as ancestor classes) all the way up the chain to System.Entity.  This is why every object in SCOM has a Display Name property: System.Entity defines it, and every class is sooner or later based on System.Entity.

Operations Manager automatically considers any object to be an instance of itself, but also an instance of any of its base classes.  So a SQL 2008 instance is a SQL instance, which is in turn a Windows server role.  When you extend a class in the GUI (say, Windows Computer), what you are actually doing is deriving a new custom class, based upon that parent class, with your custom properties added.
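In MP XML terms, deriving a class looks like the sketch below (a minimal, hypothetical example: the class ID and property name are invented, and `Windows!` is assumed to alias the Microsoft.Windows.Library reference in your manifest).

```xml
<TypeDefinitions>
  <EntityTypes>
    <ClassTypes>
      <!-- Hypothetical derived class: it inherits every property of
           Microsoft.Windows.Computer (and of its ancestors, all the way
           up to System.Entity) and adds one custom property of its own. -->
      <ClassType ID="MyMP.OwnedWindowsComputer"
                 Base="Windows!Microsoft.Windows.Computer"
                 Accessibility="Public" Abstract="false"
                 Hosted="false" Singleton="false">
        <Property ID="SystemOwner" Type="string" Key="false" />
      </ClassType>
    </ClassTypes>
  </EntityTypes>
</TypeDefinitions>
```

Extending a class through the GUI produces essentially the same thing behind the scenes, stored in a generated management pack.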

Discoveries and Targeting

OK, so with that out of the way, why would you want to extend a class (or, if you are authoring an MP, derive from an existing class)? Reasons may include:

  1. Adding an attribute (class property) that would benefit your organisation, such as owner contact details for a server.
  2. Providing support for a version of an object that wasn’t previously included by the MP vendor/author.
  3. Speeding up creation of your own MP by removing the need to define common properties over and over again.
  4. Allowing monitoring workflows, relationships and views to be targeted at a group of objects, regardless of their specific version or implementation.

That last one is a critical point for MP authors: if I use the server role class as the base class for my server object, it automatically inherits all the relationships that insert it into the health rollup model for a Computer object.

Right, enough rambling; now to the crux of the matter.  When you derive or extend an existing class, SCOM automatically gives it all the property definitions of its parent classes, but it doesn’t get the values of those properties automatically.  If you decide to go and make a SQL 2012 MP, then unless the discoveries for the existing objects have been set up in a certain way, all the inherited properties (such as Database Name) will be blank, and it will be up to you to implement a discovery for them.

This is because discoveries are usually targeted at the component that hosts the object (server role discoveries are usually targeted at the computer that runs them, database discoveries at the DB engine that hosts them), and they create the object with all of its properties discovered in one go.  When you extend a class or derive a new one, those discoveries have no idea your new class exists, so they just leave it be.

The better option here (see caveats and considerations below) is to target one discovery at the component that hosts your class, and a second discovery at the class itself to discover its properties.  That way, when you or someone else derives your class into a new version, your expertise at finding and populating the original properties is put to work, because the discovery targeting sees the new class definition as an instance of the base class it was designed to populate.
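As a rough MP XML skeleton (all IDs here are hypothetical, and the data source bodies are elided), the pattern is a pair of discoveries:

```xml
<Discoveries>
  <!-- Discovery 1: targeted at the hosting component. It creates the
       instance, submitting only the key properties needed to identify it. -->
  <Discovery ID="MyMP.MyRole.Discovery" Enabled="true"
             Target="Windows!Microsoft.Windows.Computer">
    <Category>Discovery</Category>
    <DiscoveryTypes>
      <DiscoveryClass TypeID="MyMP.MyRole" />
    </DiscoveryTypes>
    <DataSource ID="DS" TypeID="...">
      <!-- existence test only -->
    </DataSource>
  </Discovery>

  <!-- Discovery 2: targeted at the class itself. Because a derived class
       is also an instance of its base, this discovery automatically runs
       against any future class derived from MyMP.MyRole as well. -->
  <Discovery ID="MyMP.MyRole.PropertyDiscovery" Enabled="true"
             Target="MyMP.MyRole">
    <Category>Discovery</Category>
    <DiscoveryTypes>
      <DiscoveryClass TypeID="MyMP.MyRole">
        <Property TypeID="MyMP.MyRole" PropertyID="Version" />
      </DiscoveryClass>
    </DiscoveryTypes>
    <DataSource ID="DS" TypeID="...">
      <!-- property lookup keyed on the existing instance -->
    </DataSource>
  </Discovery>
</Discoveries>
```

The second discovery is the one that future derived classes inherit for free.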

Sample discovery model

No doubt some of you have taken issue with my claim, because in your experience deriving or extending a class does automatically populate all the existing properties with values.  More than likely, this is because you’ve worked with a discovery configuration like the one I’ve described above without knowing it, such as with the Windows Computer objects.  There is a series of discoveries targeted at Microsoft.Windows.Computer in the core Operations Manager MPs that are responsible for discovering properties such as logical CPU count and whether or not the computer is actually a virtual (cluster) instance.  Since pictures are better than words (and I’m not far off a thousand already), here is a diagram that explains what I’m talking about.

Default Implementation of Microsoft.Windows.Computer discovery model

The diagram above actually rolls several different property discoveries into one object, but hopefully you get the idea.  Now, if I were to extend (via the SCOM GUI) or derive (MP authoring) the Computer class with my own custom version containing a new property, I would only be responsible for discovering the existence of my class and any new custom properties I’d added.  Indeed, if I were following this approach, I’d implement my property discovery as a second, separate discovery so that any class that extends or derives from my class in the future also benefits from it.

The diagram below shows this: I have added a discovery (which perhaps targets the computer and looks for the presence of a “System Owner” registry key) that is responsible for creating my custom class, and another that discovers the custom attributes I’ve added (it reads the above registry key and populates that value onto the object).  It might look like a lot of work in the diagram, but honestly, in the authoring console this is very simple to do.
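As a concrete sketch of that second (property) discovery, here is roughly what it might look like using the standard registry discovery module from the Windows library.  All the class, property and registry path names are hypothetical, and I’m assuming the custom class was derived from Microsoft.Windows.Computer, so PrincipalName is its key property:

```xml
<Discovery ID="MyMP.OwnedWindowsComputer.PropertyDiscovery" Enabled="true"
           Target="MyMP.OwnedWindowsComputer" ConfirmDelivery="false"
           Remotable="true" Priority="Normal">
  <Category>Discovery</Category>
  <DiscoveryTypes>
    <DiscoveryClass TypeID="MyMP.OwnedWindowsComputer">
      <Property TypeID="MyMP.OwnedWindowsComputer" PropertyID="SystemOwner" />
    </DiscoveryClass>
  </DiscoveryTypes>
  <DataSource ID="DS" TypeID="Windows!Microsoft.Windows.RegistryDiscoveryProvider">
    <ComputerName>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$</ComputerName>
    <RegistryAttributeDefinitions>
      <!-- Read the (hypothetical) SystemOwner registry value -->
      <RegistryAttributeDefinition>
        <AttributeName>SystemOwner</AttributeName>
        <Path>SOFTWARE\Contoso\SystemOwner</Path>
        <PathType>1</PathType>           <!-- registry value (0 = key) -->
        <AttributeType>1</AttributeType> <!-- string -->
      </RegistryAttributeDefinition>
    </RegistryAttributeDefinitions>
    <Frequency>86400</Frequency> <!-- once a day is plenty for property discovery -->
    <ClassId>$MPElement[Name="MyMP.OwnedWindowsComputer"]$</ClassId>
    <InstanceSettings>
      <Settings>
        <!-- Re-submit the key property so this maps onto the existing
             instance, then add the newly discovered property. -->
        <Setting>
          <Name>$MPElement[Name="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Name>
          <Value>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Value>
        </Setting>
        <Setting>
          <Name>$MPElement[Name="MyMP.OwnedWindowsComputer"]/SystemOwner$</Name>
          <Value>$Data/Values/SystemOwner$</Value>
        </Setting>
      </Settings>
    </InstanceSettings>
  </DataSource>
</Discovery>
```

The first discovery (the one that creates the class by testing for the registry key) would look much the same, but targeted at the computer and submitting only the key property.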

Custom Attribute extension and discovery

For those of you wondering how to make a discovery submit property information for an existing object, it’s extremely simple.  Just discover your class with its key properties again (you don’t need to re-test for your object; you know it exists already), along with all your newly found properties, and reference counting (something I talked about in my previous blog post) will take care of the rest.  Don’t worry about blanking out any existing properties that you don’t include; SCOM will leave those intact.  During un-discovery, SCOM is also smart enough to handle your self-referential discovery and make sure the object isn’t perpetually discovered once your component ceases to exist.

As an added bonus, implementing the discovery model in this fashion also allows you to separate out the discovery interval of your class from the discovery interval of your properties.  This can help reduce/prevent discovery churn and will allow your MP users to further customize their monitoring experience.

Caveats and Considerations

As always, there are some considerations you should take into account before doing this.  The first and most important is performance.  While performing two sets of queries (one to discover the object, one to populate its properties) generally isn’t that taxing on most data sources, you might want to think twice if you are using a remote data source that isn’t very well optimised.  If you are doing WMI or SQL queries remotely, remember that your second query will usually be much cheaper, since rather than searching for a set of matching criteria you are only looking for records that match your object’s ID.  Likewise, your first query, which establishes the existence of the object, can be optimised not to request columns/properties that only the second discovery needs.

As I mentioned above, if performance is a concern, you can control the intervals of your two discoveries and set them to something suitable.  Remember, this is discovery, not monitoring; you don’t need to update properties every 10 minutes.

The second consideration is complexity.  If you are implementing a management pack with dozens of objects using custom data sources, you may not want to implement an extra set of discoveries, especially if your objects only have a handful of properties.  That’s fine; you just have to balance the extra effort against the rewards of taking the above model on board.  If you don’t see yourself deriving lots of classes, or your customers wanting to extend your classes with your support, then you’re just saving yourself unnecessary effort.

In my opinion, though, it’s nearly always worth it.  Feel free to leave a comment if you’d like to see a specific example of this kind of thing implemented (either using one of the SCOM built-in modules, or a custom script).

