Friday, December 18, 2009

Diagnosing side by side errors on x64 Windows 2008 R2

We all have software that we can't do without.  These packages are always the first ones installed when building a new machine or rebuilding an old one.  At the office, I have two computers and three monitors.  Gaze upon the awesomeness that is my setup:



In such a situation, the most common way to work is to use a hardware KVM switch without plugging the monitors into it.  That isn't ideal for me, as I like being able to move between the screens a bit more easily.  I also need, from time to time, to cut and paste between the two systems, and a KVM doesn't give me that option.  Historically, I'd used an open source package called Synergy.  This is a software "KM" switch that allows one mouse and keyboard to control two computers.  You simply move the mouse between the two screens, and it works very slickly.  Unfortunately, the original project hasn't been updated in quite some time and hasn't kept up with modern OSes.  One issue I had for a while was that one of my machines was 64-bit, and Synergy doesn't support 64-bit very well.  Fortunately, there is a fork called Synergy+.  The current version is primarily a bugfix release aimed at eliminating the worst issues of the old code base and getting it working on modern OSes. 

I recently migrated the machine on the right from Windows Server 2003 to Windows Server 2008 R2, 64-bit Edition.  Since Synergy+ actually had a 64-bit build, I figured I'd give it a shot.  My left machine already runs Windows 2008, 64-bit Edition, and Synergy+ works just fine there.  Unfortunately, it did not work on my new build.  For those who don't know, 2008 R2 is not just the latest release of 2008; it's essentially a whole new OS, akin to Windows 7.  We have run into a couple of packages that simply won't work with the R2 release because of this, and I thought Synergy+ was one of them. 

On launching the configurator, I'd get the error "The application has failed to start because its side-by-side configuration is incorrect".  A quick Google of this error led to some good information on what causes it and how to fix it.  The Wikipedia article on SxS gives a good overview of how Side-by-Side works.  The error I was seeing is generated when the manifest for the Synergy+ executable is wrong or points to the wrong location.  So, all I had to do was find out what it was pointing to and I should be able to resolve it. 
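For context, that manifest is a small piece of XML embedded in (or shipped alongside) the executable.  Here's a minimal sketch of the relevant dependency section, reconstructed from the assembly identity that appears in the trace further down (the outer elements are standard manifest-schema boilerplate, not copied from Synergy+ itself):

```xml
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <dependency>
    <dependentAssembly>
      <!-- The VC 9 CRT reference that SxS tries (and here, fails) to resolve -->
      <assemblyIdentity type="win32" name="Microsoft.VC90.CRT"
                        version="9.0.21022.8" processorArchitecture="amd64"
                        publicKeyToken="1fc8b3b9a1e18e3b" />
    </dependentAssembly>
  </dependency>
</assembly>
```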

The OS provides a tool called sxstrace.exe for diagnosing these kinds of issues.  First, we run it in trace mode and have it capture SxS information:

sxstrace trace -logfile:trace.log

Then, launch the offending executable (in this case, the Synergy+ launcher) and, once it has failed, press Enter in the sxstrace window to stop the trace.  This produces a binary log file that we have to translate into a text file so we can read it:

sxstrace parse -logfile:trace.log -outfile:trace.txt

The trace output below shows what we're missing: the reference to Microsoft.VC90.CRT that can't be resolved is the VC 9 runtime redistributable.


=================
Begin Activation Context Generation.
Input Parameter:
    Flags = 0
    ProcessorArchitecture = AMD64
    CultureFallBacks = en-US;en
    ManifestPath = C:\Program Files (x86)\Synergy+\bin\launcher.exe
    AssemblyDirectory = C:\Program Files (x86)\Synergy+\bin\
    Application Config File =
-----------------
INFO: Parsing Manifest File C:\Program Files (x86)\Synergy+\bin\launcher.exe.
    INFO: Manifest Definition Identity is (null).
    INFO: Reference: Microsoft.VC90.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.21022.8"
INFO: Resolving reference Microsoft.VC90.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.21022.8".
    INFO: Resolving reference for ProcessorArchitecture amd64.
        INFO: Resolving reference for culture Neutral.
            INFO: Applying Binding Policy.
                INFO: No publisher policy found.
                INFO: No binding policy redirect found.
            INFO: Begin assembly probing.
                INFO: Did not find the assembly in WinSxS.
                INFO: Attempt to probe manifest at C:\Windows\assembly\GAC_64\Microsoft.VC90.CRT\9.0.21022.8__1fc8b3b9a1e18e3b\Microsoft.VC90.CRT.DLL.
                INFO: Attempt to probe manifest at C:\Program Files (x86)\Synergy+\bin\Microsoft.VC90.CRT.DLL.
                INFO: Attempt to probe manifest at C:\Program Files (x86)\Synergy+\bin\Microsoft.VC90.CRT.MANIFEST.
                INFO: Attempt to probe manifest at C:\Program Files (x86)\Synergy+\bin\Microsoft.VC90.CRT\Microsoft.VC90.CRT.DLL.
                INFO: Attempt to probe manifest at C:\Program Files (x86)\Synergy+\bin\Microsoft.VC90.CRT\Microsoft.VC90.CRT.MANIFEST.
                INFO: Did not find manifest for culture Neutral.
            INFO: End assembly probing.
    ERROR: Cannot resolve reference Microsoft.VC90.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.21022.8".
ERROR: Activation Context generation failed.
End Activation Context Generation.


At this point, all I had left to do was head over to Microsoft's site and download the SP1 version of the 64-bit redistributable, and Synergy+ launched right up.  A quick solution to a simple problem. 

Sunday, November 15, 2009

It's all a series of tubes, Part 3: I'll bet you think this Pipe is about you

In Part 1 of this series, I discussed how I was looking to simplify managing my social networking content, both incoming and outgoing.  I talked about how Ping.fm made updating statuses for multiple sites as easy as it had been for just one.  In Part 2, I talked about using Yahoo Pipes as a way to aggregate incoming RSS streams from my Twitter follows into one feed that I could then enter into a feed reader.  This part is all about my favorite subject: ME!  I'm going to create what I've been calling a "vanity feed". 

 

As mentioned in Part 1, I write quite a bit of material that ends up on the Interwebs.  Every time I write something new, I will generally post a link on Facebook so my friends and family can bask in my brilliance.  I also want it posted to places like Linked In so my professional contacts can be amazed with my talents as well.  What?  It's called a vanity feed for a reason!  :)  A problem I have, though, is that Facebook only allows you to subscribe to one RSS feed.  Also, with respect to my fellow Tuners, while I want all of my posts from iPhoneTunes.net to show up, I don't want anything else.  So, let's get started.

 

As before, we're going to start with a Fetch Feed source.   But, I don't want the whole iPhoneTunes.net feed, so we're going to need to filter that.   Expand Operators and looky there...a Filter tool!   Just like most other tasks in Pipes, this tool is fairly straightforward to use.  The site owner of iPhoneTunes.net always puts our names at the top of any reviews we do.  So, if I click the Fetch Feed tool, I see the whole feed in the debugger.  But, if I pipe the output of the Fetch Feed into the Filter, which I've told to permit only those items that have my name, we see the only thing to come out of this Pipe is my latest review.  One feed down, and this whole process took less than two minutes to do.  Not too shabby, right?

 


Now we turn to the other feeds.  Those are a bit more straightforward, as I really just want everything from them.  So, add another Fetch Feed source and enter the URLs of the other two feeds into it.  But now we have two sources, and if you look at the Pipe Output tool, you'll see it's only got one input!  Fortunately, there's an easy solution: the Union tool.  This simple tool provides multiple inputs and a single output.  If we attach it to the Pipe Output and then highlight that, we see we've got my one iPhoneTunes.net entry and all of the entries from this blog.  It pulled everything, so the TEDxRochester blog items don't fit in the debugger window, but they're there.



But this does present an issue.  Aside from the posts in this "Series of tubes" series, I've already posted all of the above items to the places I want to feed with this mashup.  If I publish this feed as-is, it's going to repost everything I've already posted and annoy a lot of people.  That's not a good thing!  Here I am trying to show off my tech skills, and I go and do a multi-post.  What I need to do is filter even further and drop anything that was posted prior to today (11/15/2009).  In theory, once I'm done, the only things in my debugger window should be these three blog posts.  So, let's add another filter box:

 


 

As you can see, we need another special tool, the Date Builder.  I honestly don't know why I can't just type a date into that filter field, but that's the way Pipes does it.  The output from this source doesn't go into the input of another tool; it goes to a data connection.  It's not clear in the picture above, but this field is connected to the little dot to the right of the rule in the bottom filter.  For the heck of it, I added a Sort operator so that multiple items posted in one day will show up in the order posted, and that's it.  The only output I get is the two blog entries for today...three as soon as I publish this one.  Going forward, if I post a new review, an update for next year's TEDxRochester, or just want to share some knowledge here, it'll go to all of my social networking sites without my having to do a thing.
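Conceptually, the Filter plus Date Builder pair boils down to a simple date comparison.  Here's a toy Python stand-in (the item titles and dates are made up for illustration; this is not how Pipes runs internally):

```python
from datetime import date

# A couple of pretend feed items, one old, one from today.
items = [
    {"title": "An older review",         "posted": date(2009, 10, 2)},
    {"title": "Series of tubes, Part 3", "posted": date(2009, 11, 15)},
]
cutoff = date(2009, 11, 15)

# Keep only items posted on or after the cutoff, then sort oldest-first,
# mirroring the Filter and Sort operators in the Pipe.
kept = sorted(
    (item for item in items if item["posted"] >= cutoff),
    key=lambda item: item["posted"],
)
for item in kept:
    print(item["title"])  # -> Series of tubes, Part 3
```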

 

Knowledge management is always easiest when you don't actually have to do anything. :)

 

 

It's all a series of tubes, Part 2: And what is a Pipe if not a tube?

In Part 1 of this series, I talked about how I've started putting out a lot more information to different groups and how I'm using Ping.fm to make my outgoing flow easier to manage.  Now I needed to take care of everything coming in.  As mentioned previously, I'm using an RSS reader (Google Reader, to be specific) to manage news, and I wanted to try and leverage that technology.  If for no other reason than I look at my reader multiple times per day to keep up, it seemed to be my best option. 

 

Twitter seems to concur since they export everyone's tweets as an RSS feed.  I decided that making these feeds easier to manage would give me the opportunity to try out a technology I'd read about a few years back, but never really put any effort into: Yahoo Pipes.  Yahoo describes Pipes as "a powerful composition tool to aggregate, manipulate, and mashup content from around the web".  With no programming, you can use Pipes to aggregate or filter data from the web in a way that's meaningful to or works for you. 

 

Clicking the Create Pipe button brings us into the Pipe editor.  Like I said, there's no programming involved in the creation of a Pipe.  You simply drag operators or inputs from the left onto the canvas and you can visually see how your data will flow and be manipulated.  At the bottom is the debugger which will show you what your data looks like at any step in the process.   Since this first Pipe I'm creating is just to aggregate multiple Twitter feeds into one, I've dragged the Fetch Feed tool onto the editor page.  At the bottom, another operator appears called Pipe Output.  This represents your finished product.

 


 

Now that I've got my fetcher added, I need to populate it.  I head over to Twitter.com and visit the pages of those people I'm interested in following most closely.  On everyone's page, you'll find a link to an RSS feed.  Simply right-click it and choose "Copy link location".  Paste the first one into the URL field of the Fetch Feed bubble.  To add additional feeds, click the "+" above the URL box to add an additional field. 

 


 


If I click on the Fetch Feed source, it turns orange.  The really interesting bit happens in the debugger, though:



The fetcher, once highlighted, does its job and you see the result.  It has fetched all of the items from the RSS feeds and is showing me what my output would be if I ended the chain here.  Unfortunately, at the moment, only one person has tweeted recently enough to show you.  But were there tweets from other people, you'd see all of them intermingled into one continuous stream. 


Since this is just supposed to be a simple aggregator, I'm going to end it here.  To do so, simply click the little bubble on the bottom of the Fetch Feed source.  You'll see the bubble on the top of the Pipe Output tool turns orange.  This is to let you know that it's an acceptable place to terminate the pipe that's now coming out of the bottom of the feed fetcher.  Simply drag that pipe down and drop it onto the output tool and a link is formed.  If you then click on the Pipe Output tool, you'll see the debugger refresh and the same data as before will display.  That's it, we've now got a simple aggregator for our Twitter feeds.  Hit the Save button, give it a name and we're done!
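Under the hood, the aggregator is doing nothing fancier than merging lists of dated items into one stream.  A toy Python equivalent (the feeds and timestamps here are invented):

```python
from datetime import datetime

# Two pretend per-person Twitter feeds, each a list of (title, posted) pairs.
feed_a = [("tweet from @alice", datetime(2009, 11, 15, 9, 30))]
feed_b = [("tweet from @bob",   datetime(2009, 11, 15, 10, 5))]

# Union the feeds and interleave the items newest-first, the way the
# combined stream shows up in the Pipes debugger.
merged = sorted(feed_a + feed_b, key=lambda item: item[1], reverse=True)
for title, posted in merged:
    print(title)
```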



 So, now what do we do with this?  Well, click on My Pipes at the top of the page.  Your new Pipe will show in the list.  Click it, and you should see the tweet feed.  All you need to do now is right click the "Get as RSS" link, choose "Copy link location" and paste it into your feed reader.  If you use Yahoo or Google Reader, it's even easier: just click the button.


Now, this was just a simple first Pipe to get our feet wet, but it really shows how powerful this tool can be.  It gets even more powerful when you start to take advantage of the filtering capabilities as well.  But, that's for Part 3...


 


 

It's all a series of tubes, Part 1: Ping 'em all and let THEM sort it out.

Recently, I became the licensee for an exciting event called TEDxRochester.  As a result of this event, I've begun networking with a lot of the great folks in my area.  Prior to this, I'd used social networking sites primarily as a vehicle to share pictures of my daughter with friends and family.  I had a Linked In account, but it barely got any use.  Now that I've got a lot of folks to keep up with, and who are interested in keeping up with me, I'm working on changing that. 

 

One aspect that's taken a bit of my time is aggregation.   I've started following a bunch of people on Twitter and I've ramped up my connections on Linked In.  Aside from this blog and the one at TEDxRochester.com, I also write reviews for iPhoneTunes.net.  And, on top of that, I've still got my private connections on Facebook to keep track of.  Information overload, here I come!

 

I use an RSS reader every day, so it only made sense to find a way to leverage what I'm already using to keep track of people.  When I first created this blog, I added it to my Linked In profile via its RSS feed.  After all, as the first entry stated, this blog was to serve as a living resume for potential future employers should I ever have need of one.  (Yes, I know, I should probably update it more often, then.)  At the time, you could only add one blog or RSS feed, and I wanted the ability to aggregate everything I write into one easy-to-use feed.  I also want to keep these many different groups informed of my doings, but it has to be easy to manage.  After all, I've already got a full day; adding more "work" just to keep track of people wouldn't fit.

 

Since I knew that all of these activities would result in my needing to update my statuses a bit more frequently, I decided to start by simplifying that task.  For example, if I wrote a new review at iPhoneTunes.net, I'd typically post a link to Facebook & Linked In.  But, now that I've actually become active on Twitter,  I would need to tweet this new review as well.  If I wrote a new article for the TEDxRochester blog, I'd have to post it to Linked In, Facebook, the TEDxRochester fan page on Facebook AND Twitter.  I just knew there had to be an easier way to do all of this, and there is...it's called Ping.fm.

 

Ping.fm acts as a central dashboard for the status boxes of all of your social networking sites.  At last count, they support some 60-ish services: the big ones like Facebook and Linked In, but also some esoteric ones I've never even heard of.  For me, flexibility was key, and Ping.fm provides it in the form of posting groups.  These allow you to tie different social networking sites together so you can have granular control over who sees what.  There are some things that I only want to share with a very limited, private group, and there are some that the whole world can see.  I accomplished this by creating the following groups:

 

Professional: This group is where I post things that I want to be available to everyone.  Anything posted to this group is visible to the world, so family vacation pictures wouldn't go here, but if I found an interesting article about some new technology, I could post the link in this group.  Anything posted here goes to Linked In, Twitter and Facebook.  After all, being a geek, I have a lot of geek friends who might not pay attention to my Linked In profile.

 

Private:  For the moment, this one just goes to Facebook.  I could do Facebook by itself, but I'm getting myself in the habit of using the group instead, just in case I decide in the future to add something else. 

 

TEDxRochester:  This is where I post news about next year's event.  I also post random things that might in some way be related.  For example, my wife and I were watching The History Channel's "The Universe" last night, and Adam Frank, who is regularly on the show, was explaining pulsars and quasars.  I watch shows like this all the time where they bring on experts, but since Adam was our first speaker at the inaugural event, he's the first one of those experts I've ever actually met!  So, I posted a quick item about it.  This one's my biggest group; it posts to Facebook, the Facebook fan page, Linked In and Twitter. 

 

I also have a del.icio.us account, so any time I post a link to any of the above groups, it's automatically added to my bookmarks.  How cool is that?

 

It didn't take long before I fell completely in love with Ping.fm for updating my statuses (statusii?).  It's very well laid out, very easy to use, and free.  I can now update up to five different sites as easily as I could any one of them, if not more easily.  If you've got a lot of social sites to update, you owe it to yourself to check it out. 

 

Okay, now that I've got outgoing information organized and easy to control, it was time to tackle incoming.  Being only one person, though, outgoing was a lot easier...

Monday, March 9, 2009

Are you free, Mr. Kay?

At the office, we use Exchange.  I've been a fan of Exchange since the first time I administered an Exchange 5.0 server for 200 users.  Now, I work on a team that supports 15,000 users!  Being the third level support for Exchange, the members of my group use Outlook a little more intensely than your typical worker.  But...not everyone. :)

For example, if we're going to be on PTO, aside from letting our supervisor and HR know, we have to let the team know.  We could easily set up a team calendar to handle time-off information, but we find it's just easier to send each other all-day appointments with titles like "Tony - PTO".  What we normally do is set the busy information to "Free" before we send it out so that it doesn't block off our coworkers' days.  Sometimes, though, folks forget to do that, and I'll end up with entire weeks blocked off and project managers screaming because they can't find any free time in my schedule.  Deciding today I'd had enough, I wrote a little program to take care of this for me.  What the VB below does is open my Outlook calendar and iterate through all of the items in it.  It checks the subject line, and if it contains "PTO", it then checks to make sure the item is set to free (a BusyStatus of 0).  If not, it changes it, saves it, and then writes the subject and free/busy status again to confirm the change. 

Imports Outlook = Microsoft.Office.Interop.Outlook

Module Module1

    Sub Main()
        Dim objOLApp As Outlook.Application
        Dim objOLNS As Outlook.NameSpace
        Dim objCalendar As Outlook.Folder
        Dim colItems As Outlook.Items
        Dim objAppointment As Object

        ' Connect to the running Outlook instance and open the default calendar.
        objOLApp = New Outlook.Application
        objOLNS = objOLApp.GetNamespace("MAPI")
        objOLNS.Logon("", "", False, True)
        objCalendar = objOLNS.GetDefaultFolder(Outlook.OlDefaultFolders.olFolderCalendar)
        colItems = objCalendar.Items

        ' Find every "PTO" appointment that isn't already marked Free,
        ' flip it to Free, and echo it before and after as confirmation.
        For Each objAppointment In colItems
            If InStr(objAppointment.Subject, "PTO") Then
                If objAppointment.BusyStatus <> Outlook.OlBusyStatus.olFree Then
                    Console.WriteLine(objAppointment.Subject & " " & objAppointment.BusyStatus)
                    objAppointment.BusyStatus = Outlook.OlBusyStatus.olFree
                    objAppointment.Save()
                    Console.WriteLine(objAppointment.Subject & " " & objAppointment.BusyStatus)
                End If
            End If
        Next
    End Sub

End Module


It's not the most efficient code, I know.  But, it went through my calendar in about 30 seconds and changed a few hundred entries.  I had planned on using it just this once, but it might be a good thing to schedule to run once a week or so.  Now, I have LOTS of free time for my PMs to fill up!  Oh, wait, that might not be a good thing.... :)


Thursday, March 5, 2009

Mining event logs for useful information

We're having an issue where users will randomly be disconnected from their Citrix sessions.  It doesn't happen a lot, but in the aggregate it's becoming a nuisance for our user community.  Unfortunately, we don't get good information from our users to help pinpoint the issue.  When we ask them to try and track the times it happens, we'll get one or two notices for a day or two and then it dwindles off.  In perusing the event logs, I found that each disconnect is actually logged as an event from the source "Metaframe" with an EventID of 9007.  The message of the event contains the user name. 

So, if I could write a utility to cull the useful data from the logs, I wouldn't need to rely on users.  I could find the exact date and time of each disconnect, and with the help of our Network team, we might be able to pinpoint where the issue actually lies.  To that end, I did a little research.  I'm familiar with reading from the event logs with VBScript, but I'm trying to do more with VB.NET since I can use VB 2008 Express Edition for free.  The first issue I encountered was that pulling EventIDs is deprecated, and I had to use the entry's InstanceID property instead.  The value received from this property needs to have its high-order bits (the severity and customer flags) masked off in order to get the EventID.  That's what's going on in the "intEventID = objentry.InstanceID And &HFFFFFF" line:

    Public Sub ListEventLog()
        Dim objLog = New EventLog()
        Dim intEventID As Integer
        Dim strUsername(), strEventMessage, strDate, strTime, strDateTime() As String

        ' Walk the System log looking for Metaframe disconnects (EventID 9007).
        objLog.Log = "System"
        For Each objentry As EventLogEntry In objLog.Entries
            ' Mask off the high-order flag bits to recover the classic EventID.
            intEventID = objentry.InstanceID And &HFFFFFF
            If intEventID = 9007 Then
                ' The message contains "DOMAIN\username"; split out the username.
                strEventMessage = objentry.Message
                strUsername = strEventMessage.Split("\"c)
                strDateTime = Split(objentry.TimeGenerated.ToString(), " ")
                strDate = strDateTime(0)
                strTime = strDateTime(1) & " " & strDateTime(2)
                Console.WriteLine(strDate & "," & strTime & "," & strUsername(1))
            End If
        Next
    End Sub

Once compiled, you simply run this utility, piping the output to a text file.  You then have a nice CSV you can pull up in Excel to process the data.
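The masking step is worth seeing in isolation.  Here's a quick Python sketch of the same operation (the full 32-bit instance ID below is made up for illustration, but its low bits, 0x232F, really are decimal 9007):

```python
# The NT "instance ID" packs severity/customer flag bits into the high
# byte; masking with 0xFFFFFF (the VB code's &HFFFFFF) keeps only the
# low-order bits, which correspond to the classic EventID.
def event_id(instance_id):
    return instance_id & 0xFFFFFF

# 0x4000232F is a made-up instance ID with a flag bit set; its low
# 24 bits are 0x232F, i.e. the Metaframe disconnect event.
print(event_id(0x4000232F))  # -> 9007
```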

Monday, February 9, 2009

Some basic MFCOM scripting

In the environment I support, we have production servers and BCP (Business Continuity Planning) servers. The BCP servers are duplicates of the production servers, and both share SAN disk via an SRDF link. When, for example, we have a hardware problem with a production server, we fail over the SRDF link, repoint the Citrix published applications to the BCP server and voila! Service restored. The problem is, in this environment, each server has about 8 published applications, and changing the defined servers manually used to take 2-3 minutes each due to the slow nature of the old Citrix Management Console. 8 x 3 = up to 24 minutes just to repoint applications. Further complicate that with the fact that each server can serve up to 3 hubs (24 x 3 = you get the idea). So, after doing this once or twice, I decided it was time to learn how to script using MFCOM, Citrix's COM interface to XenApp. Below is a script I cobbled together quickly.

Each server serves one or more hub locations in our environment, and the hubs are consistently numbered with a four-digit number that begins with 0. So, the below script (which is very inefficient, I know) simply finds every application in the published application list that has that hub number in the title. Once it finds these apps, it looks at the server list from which the app is published, and if it's a production server, replaces the server with the BCP and vice versa. What used to take 16-24 minutes now takes 16-24 seconds.


Set objFarm = CreateObject("MetaFrameCOM.MetaFrameFarm")
objFarm.Initialize(1)

strHubNumber = Wscript.Arguments.Item(0)

For Each objApp In objFarm.Applications
    objApp.LoadData 1

    ' Only touch published apps whose name contains the hub number.
    If InStr(objApp.AppName, strHubNumber) Then
        strAppDN = objApp.DistinguishedName
        Wscript.Echo strAppDN

        For Each server In objApp.Servers
            ' Work out the partner server: production <-> BCP.
            If InStr(server.ServerName, "MMSPC") Then
                strNewServer = Replace(server.ServerName, "MMSPC", "MMSBC")
            ElseIf InStr(server.ServerName, "MMSBC") Then
                strNewServer = Replace(server.ServerName, "MMSBC", "MMSPC")
            Else
                Wscript.Echo "Server doesn't match naming convention"
                strNewServer = ""
            End If

            If strNewServer <> "" Then
                Wscript.Echo vbTab & "Replacing " & server.ServerName & " With " & strNewServer & vbCRLF

                ' Bind the replacement server to the app, then drop the old one.
                Set objAppSrvBind = CreateObject("MetaFrameCOM.MetaFrameAppSrvBinding")
                objAppSrvBind.InitializeByName strNewServer, objApp.BrowserName
                objApp.AddServer objAppSrvBind

                objApp.RemoveServer(server.ServerName)
                objApp.SaveData
            End If
        Next
    End If
Next
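The heart of that script is just a prefix swap on the server name.  As a sanity check, here's the same rule as a small Python function (Python rather than VBScript, purely for illustration), using the MMSPC/MMSBC naming convention from the script above:

```python
def failover_name(server):
    """Return the partner server name: production <-> BCP."""
    if "MMSPC" in server:
        return server.replace("MMSPC", "MMSBC")
    if "MMSBC" in server:
        return server.replace("MMSBC", "MMSPC")
    raise ValueError("Server doesn't match naming convention")

# The swap is symmetric, so running it twice gets you back where you started.
print(failover_name("MMSPC0123"))  # -> MMSBC0123
print(failover_name("MMSBC0123"))  # -> MMSPC0123
```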

Thursday, January 29, 2009

Third party root certificates on the iPhone

For years, I've run my own web server on one of my home systems.  I've got a home automation system with a web interface, I use the web interface in Azureus, etc.  Since I tend toward security, I have it all piped through SSL.  But when I first started, the big vendors weren't offering cheap certs for individuals.  Fortunately, along came CACert, a provider of free certs usable by anyone.  The only drawback to a CACert cert is that their root certificate isn't installed in any mainstream browser.  So, while the actual communications will be encrypted, you'll get a message when first entering your site that the cert can't be trusted.  It's a minor inconvenience, but the fix is even simpler than ignoring the problem: on your desktop, simply browse to CACert's site and click the link for "Root Certificate".  Click the link for "PEM Format" and the browser will ask if you're sure you want to trust certs from CACert.  Select the applications you want to trust and hit OK. 

When I tried to do this in Mobile Safari on my iPhone, I kept being told it couldn't download the PEM file.  So, I did some searching and found the solution: e-mail yourself the cert.  Follow the same steps as above, but instead of clicking the "PEM Format" link, right-click the "DER Format" link and use your browser's method of sending links.  When you get the e-mail, open the attachment, accept that you trust the cert, and you'll be good to go. 

While the solution was small, can you believe I had to compile it from three different sources to get it to work right? :)


Thursday, January 15, 2009

Panic mode: Missing VMDK files for an ESX guest

At my POE, we have a huge VMware infrastructure.  About 30% of our servers are actually VMs, and that's not just test servers, but production as well.  Last night, I was performing some routine maintenance on a VM owned by one of our support and test groups: adding a second NIC so they'd be able to get on the TAN and start restoring production environments to duplicate issues.  A relatively simple task; I powered down the VM, added the NIC and powered back on.

 

Or, should I say...hit the power button.  The machine wouldn't come back up; I kept getting a "File not found" error.  So, I tried the logical thing: I removed the NIC and tried again.  Same results.  Hmmm.  So now I needed to do some real investigating.  Since it was a test machine and it was getting very late (and I've got a bad cold), I put it off until this morning.  Taking a closer look at the event log, I found the problem: "VMware server cannot find the virtual disk /vmfs/blahblahblah".  Uh-oh.  That disk is the first image in a group of disks that makes up a 450G volume in the VM.  Did I mention this machine wasn't on the TAN, and so not backed up?

 

I fired up PuTTY and did some poking around.  This machine actually has 4 disks in it.  And sure enough, all of my flat files were there, but the descriptor files were missing.  For those unfamiliar with the specifics: in newer versions of VMware, each virtual disk is actually made up of two files.  The first, disk.vmdk, is just a text file called a descriptor, containing the specifics of the disk: size, type, geometry, etc.  The second, disk-flat.vmdk, is the flat file that holds your actual data.  Personally, I think it would make more sense if they used different extensions, but that's just me.
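To give a sense of what went missing, here's a sketch of what a descriptor for one of these disks might look like (all of the values here, such as the CID, geometry and extent size, are illustrative for a 250G disk; note that the extent size is counted in 512-byte sectors, not bytes):

```
# Disk DescriptorFile
version=1
CID=fffffffe
parentCID=ffffffff
createType="vmfs"

# Extent description: sector count, type, and the flat file it points to
RW 524288000 VMFS "disk-flat.vmdk"

# The Disk Data Base
ddb.geometry.cylinders = "32635"
ddb.geometry.heads = "255"
ddb.geometry.sectors = "63"
ddb.adapterType = "lsilogic"
ddb.toolsVersion = "7299"
```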

 

I did some poking around and found that disappearing descriptors are a relatively common problem these days.  The solution is relatively easy, but not something *I* could do easily: create a new virtual disk, copy the descriptor from it and just edit it to point to the right flat file.  The problem is, the first disk that was missing its descriptor is 250G, and I've only got 50G free on this standalone host.  I'd have to recreate it on another host, copy it over, etc.  But, at the last moment, I found the esXpress generator:

 


 

Simply enter the size in bytes of the flat file (copy...paste) and the name of the flat file (copy...paste), and it generates a new descriptor stub that you can...you guessed it...copy and paste into vi on the host.  I took a moment to recreate a couple of vmdks that I did still have, just to make sure I got exactly the same results using this tool.  I'm not sure if it makes a huge difference or not, but the esXpress generator always populates the descriptor with the line:

 

ddb.toolsVersion = "0"

 

And, all of my descriptors have this one:

 

ddb.toolsVersion = "7299"

 

I don't think it would've made any difference, but I edited all of mine so the version number was 7299.  Fingers crossed, I powered up the VM and it came right up.  Chkdsk found no errors on the volume, and we're back in business.   Thanks so much to esXpress.  You saved me a ton of work this morning!

 

 

Tuesday, January 13, 2009

Automated Active Directory Distribution List Creation

One of the skills that I've found to be most helpful in my day-to-day as a Windows admin is knowing how to script well.  One of the running gags from most Unix guys is that Windows is completely unscriptable.  They forget, of course, that any scripting language that runs on their OS of choice most likely works on mine. ;)  That being the case, I generally stick to one of the VB variants when doing what I need to do.  I stick to them for a couple of reasons, most particularly that they're very easy to use and learn.  I've also found that most situations I might encounter have already been solved by someone else, and I can count on one hand the number of times I've found a solution that wasn't written in VB or VBScript (most often it's C# or PowerShell).

The latest little project to cross my desk was a need to create almost four dozen distribution lists in AD.  I could've done them all by hand, but it's usually easier to write a script.  To that end, here's my solution, with a bit of info on how it works. 

' Setup some variables for use later on.
strParentDN  = "ou=DistributionGroups,ou=Exchange,dc=domain,dc=com"
strGroupName = "THE_" & Wscript.Arguments.Item(0) & "_DAEL"
strSMTP = strGroupName & "@domain.com"
Const ADS_GROUP_TYPE_UNIVERSAL_GROUP = &h8

set objOU = GetObject("LDAP://" & strParentDN)
set objGroup = objOU.Create("group","cn=" & strGroupName)
objGroup.Put "groupType", ADS_GROUP_TYPE_UNIVERSAL_GROUP
objGroup.Put "sAMAccountName", strGroupName
objGroup.MailEnable

' Here I'm going to "finalize" some of the settings for the DL.  The reason for this
' is I've found that until a group is created and mail enabled, you can't set an
' SMTP address for it.  Guess it makes sense...
objGroup.SetInfo

' But, since the group's already opened as an object, I can simply continue
' to add attributes to it.
strDN = "cn=DL Management,ou=Groups,dc=domain,dc=com"
objGroup.Put "mail", strSMTP
objGroup.put "targetAddress", strSMTP

' This says who can add/remove members from the group.  But, read on...
objGroup.Put "managedBy", strDN
objGroup.SetInfo

Set objShell = CreateObject("Wscript.Shell")

' If you look on the Managed By tab on the DL's Properties sheet, you'll see a checkbox
' "Manager can update membership list".  Just because you've given a user (or in this case a group)
' management of that list doesn't mean they can manage the membership.  Now, we could do this
' the right and approved way, which involves setting up DACLs and ACLs for the object
' and all kinds of other voodoo.  I find it's easier to just use some of the tools provided by MS.

strCMD = "dsacls " & "cn=" & strGroupName & "," & strParentDN & " /G domain\dlmanage:WP;member"
objShell.Run strCMD

' Let the user know it worked!
Wscript.Echo "Successfully created mail-enabled DL."

Not much to it, but someone might find it useful.  And, it certainly demonstrates how easy it really is to script AD.   19 lines of actual code to create 47 distribution lists.  I guess that's a fair trade!