My First Windows Update Gone Bad

I don’t think I’ve ever had a Windows Update go bad – until this week.

I recently upgraded to Office 2013 and late Monday afternoon, decided to check in with Windows Update prior to our company’s normal monthly patching to see how bad the damage would be. Nearly 1 GB of updates, thanks to my fresh Office install. But there were also a couple of optional updates, including a .NET Framework update. So I figured I may as well go ahead and do everything while I was at it. This way I could control the reboot instead of being forced into one in the middle of important tasks.

Tuesday morning, I got a call asking if we were having any issues with one of our key systems. I fired up my trusty SQL Sentry client to check things out. And failed. I couldn’t connect to the server to start monitoring. Never a good sign. Then I tried SSMS 2012. Again, couldn’t connect to any servers. I got the following error:

Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

That sounds pretty ominous. Try a few more times, no luck. Reboot, no better. Uninstall SQL Sentry Client and attempt to reinstall – still nothing. Things were going from bad to worse in a hurry.

I bounced the error messages off Kendal Van Dyke (b | t) and he suggested that it might be an issue with the SQL Native Client. So I reinstalled that from the SQL Server 2012 Feature Pack. And still, I couldn’t connect. I even tried PowerShell (v3), both the SQLPS module and SqlServerCmdletSnapin100 Snap-In and got the same errors SSMS 2012 threw out.

Taking a deep breath and stepping back, I started reviewing the situation. What do all of these have in common? They’re all using the latest (or at least a fairly recent) version of the .NET Framework. Let’s take a step back and try something older. SSMS 2008 R2 – works fine. Start up PowerShell v2 – that works too. Now we’re onto something!

The Framework update I installed was KB2858725, bringing the .NET Framework to version 4.5.1. My first thought was that maybe the installation was botched somehow, so I downloaded the installer and tried again. But to no avail. So I uninstalled it entirely and reinstalled 4.5. This finally did the trick.

Due to other commitments, I haven’t had a chance yet to re-try the 4.5.1 update, but I’m also in no rush. While it won’t take me 4+ hours to troubleshoot and fix if it breaks things the next time around, I need my system stable for the next few weeks so I won’t be trying again until 2014.

T-SQL Tuesday #39 – Here’s what my PoSH is cooking

My first official entry for T-SQL Tuesday (my actual first was a guest post hosted by Kendal Van Dyke (b|t), so I’m not really counting it) is brought to you by PowerShell, or PoSH. Ever since I discovered PoSH and really dove into learning it a couple of years ago, my co-workers have gotten a bit annoyed by my insistence upon using it for everything. It is my favorite hammer, and around me I see nothing but acres and acres of nails.

I’m not a DBA, so I don’t do as much managing of databases with it as most people joining this party, but I still use PoSH with SQL Server pretty often. I spend a lot of time pulling data from SQL Server and crunching/analyzing it, sometimes merging in filesystem data as well.

So what have I done lately?

  • Space usage analysis. I work with a system that generates PDF documents, saves them to the server filesystem, and records the details to a table. But how quickly are we consuming space? When will we run out? I whipped up a quick PoSH script to pull the details from the table, then locate the files on the filesystem, and record everything to another table for slicing & dicing.
  • Quick ad-hoc data dumps. Sometimes we just need to pull some data out of the database, crunch a few numbers, and send it off to someone upstairs. Before PoSH, I’d run the query in SSMS, copy the data, paste it into Excel, drop in a few formulas and maybe a graph, and be done. But I’d spend more time fighting Excel on formatting & getting the columns right than I did getting the data into it. Invoke-SQLCmd piped to Export-CSV solves that really quickly.
  • I’ve been working on upgrading a system we purchased from a vendor and migrating everything to a new server at the same time. Moving & updating XML configuration files, thousands of check images, restarting services, migrating databases. And this isn’t a one-time event – we have to do this over 200 times! The SQL Server portion of this process isn’t particularly involved, but it’s vitally important:
    • After most of the heavy lifting of moving things around is complete, one table has to be updated to point at the new path for the images that were migrated.
    • When all of the migrations are finished, we have to validate that everything moved over properly. A few carefully-selected queries to compare critical tables between the old version of the database and the new version and minds are put at ease that all of our data has come over cleanly. Those queries, along with the other components of the validation, are run via the PoSH script & output to a file for review.
  • For reporting purposes, we load a small subset of data from Active Directory into a pair of SQL Server tables on a nightly basis. Previously, it was only 2 fields but recently this has been expanded to about 8. Once again, PoSH to the rescue! Pull the AD user accounts, select the properties we need, and insert it all into the tables. Originally I used my old standby Invoke-SQLCmd, but with the addition of new fields I got concerned about building queries via string concatenation using arbitrary data retrieved from another system. System.Data.SqlClient & prepared statements to the rescue! It’s more code, but it’s a lot safer.
  • Dave Ballantyne reminded me that I have two PowerShell scripts for SSRS.
    • The first is based upon this script from Randy Aldrich Paulo and deploys reports into SSRS. My addition was the option to pull an RDL file straight from our Subversion repository instead of requiring the person doing the deployment to check it out first.
    • The second is based upon this StackOverflow post, and is used to render reports to file. In one case, I then SFTP the generated report to a vendor immediately after.
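The space-usage script described above can be sketched roughly like this. The server, database, and table/column names (`dbo.Documents`, `dbo.SpaceUsage`, etc.) are all hypothetical stand-ins, not the actual schema:

```powershell
# Sketch of the space-usage analysis: pull recorded file paths from one table,
# look each file up on the filesystem, and log the sizes to another table.
# Assumes the SQLPS module (or the newer SqlServer module) is available.
Import-Module SQLPS -DisableNameChecking

$rows = Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Database 'DocStore' `
    -Query 'SELECT DocumentId, FilePath FROM dbo.Documents'

foreach ($row in $rows) {
    # Skip rows whose file has gone missing rather than erroring out
    $file = Get-Item -LiteralPath $row.FilePath -ErrorAction SilentlyContinue
    if ($file) {
        Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Database 'DocStore' -Query @"
INSERT INTO dbo.SpaceUsage (DocumentId, SizeBytes, Captured)
VALUES ($($row.DocumentId), $($file.Length), SYSDATETIME());
"@
    }
}
```

With the sizes and capture dates accumulating in `dbo.SpaceUsage`, projecting when the disk runs out is a simple growth-rate query.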
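The ad-hoc data dump pattern is a one-liner. Again, the instance, database, and query are made-up examples; the point is the pipeline from `Invoke-Sqlcmd` to `Export-Csv`:

```powershell
# Ad-hoc dump: run a query and land the results in a CSV ready for Excel,
# with none of the copy/paste/formatting fight.
Import-Module SQLPS -DisableNameChecking

Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Database 'Sales' -Query @"
SELECT OrderDate, COUNT(*) AS Orders, SUM(Total) AS Revenue
FROM dbo.Orders
GROUP BY OrderDate
ORDER BY OrderDate;
"@ | Export-Csv -Path 'C:\Reports\OrderSummary.csv' -NoTypeInformation
</code>
```

`-NoTypeInformation` suppresses the `#TYPE` header line that `Export-Csv` otherwise writes, which would confuse anyone opening the file in Excel.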
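The safer approach to the AD load mentioned above – parameterized statements via `System.Data.SqlClient` instead of concatenated query strings – looks roughly like this. The table and column names are hypothetical, and only two of the properties are shown for brevity:

```powershell
# Parameterized insert: AD property values travel as SqlParameters,
# never concatenated into the SQL text, so odd characters in another
# system's data can't break (or inject into) the statement.
$conn = New-Object System.Data.SqlClient.SqlConnection(
    'Server=SQLPROD01;Database=Reporting;Integrated Security=True')
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'INSERT INTO dbo.ADUsers (SamAccountName, DisplayName) ' +
                       'VALUES (@sam, @name)'

    foreach ($user in (Get-ADUser -Filter * -Properties DisplayName)) {
        $cmd.Parameters.Clear()
        [void]$cmd.Parameters.AddWithValue('@sam',  $user.SamAccountName)
        [void]$cmd.Parameters.AddWithValue('@name', [string]$user.DisplayName)
        [void]$cmd.ExecuteNonQuery()
    }
}
finally {
    $conn.Close()
}
```

It’s more code than a concatenated `Invoke-Sqlcmd` call, but the parameters make it safe regardless of what the AD data contains.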

These are pretty mundane compared to what I’m sure a lot of people participating will be posting, but the key is this: PoSH is saving me time and by scripting everything I can, I’m significantly reducing the chance for errors. Of course, when there is an error, it’s magnified tremendously – so it all gets tested against a non-production server first.