T-SQL Tuesday #104 - Code You Would Hate To Live Without

This month’s T-SQL Tuesday is hosted by Bert Wagner, who asks us to write about code we’ve written that we would hate to live without.

First off, “hate” is a pretty strong word so let’s go with “code you couldn’t bear to live without”. The first bit of code I couldn’t live without is reviled in some circles. More often it’s misunderstood and lamented, or…well, I’ll just show it to you.
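
Here it is, more or less; the cursor name, the driving query, and the PRINT are placeholder examples I’ve filled in for illustration, since the snippet leaves those pieces for me to replace:

    DECLARE @DatabaseName sysname;

    DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT [name]
        FROM sys.databases
        WHERE database_id > 4;   -- placeholder query; skip the system databases

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @DatabaseName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT @DatabaseName;     -- the per-row work goes here

        FETCH NEXT FROM db_cursor INTO @DatabaseName;
    END;

    CLOSE db_cursor;
    DEALLOCATE db_cursor;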

Yes, you read that right. It’s an SSMS Snippet that generates a cursor. I use cursors so often that I decided to create a snippet so I don’t have to rewrite them or copy/paste from a template file all the time.

Yes, really. Don’t @ me, come at me bro, whatever it is the kids are saying these days. I am dependent upon cursors every day and would be lost without them.

Wow Andy, you must be pretty bad at your job if you’re running cursors all the time! Don’t you know that’s terrible for performance? What’s up with that?

If we’ve met, either in person or virtually on the SQL Community Slack, you probably know that I manage an instance hosting mumblemumble thousand databases. I don’t mean “a couple” thousand databases; we’re looking at Michael Swart’s 10 Percent Rule in the rearview mirror. I regularly have to look for a particular value in the same table across a few dozen/hundred/thousand databases, or pull a report across as many databases, or run the same data change for a big list of databases. Most often, I’ll be given a list of databases or be asked to “run this for all the databases that meet these criteria.” The only way to do that easily is with a cursor, because I first have to collect the list of databases from another table. There are third-party tools I could use, but setting them up to run against an arbitrary list of databases is tedious and error-prone, and I haven’t quite worked out a way to improve on it yet.

Processing a table or result set RBAR (row by agonizing row) is a legitimate performance concern. But for cranking through a long list of databases and executing the same query against each, it’s the only way to go, as far as I know. sp_msforeachdb doesn’t cut it for my purposes because I don’t want to hit every database on my instance.
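
In practice, a run looks something like the sketch below. The dbo.DatabaseList table, dbo.SomeTable, and the search itself are hypothetical stand-ins; the shape is what matters: cursor over the list of databases, build the dynamic SQL for each one, execute it, move on.

    DECLARE @DatabaseName sysname,
            @sql          nvarchar(max);

    DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT DatabaseName
        FROM dbo.DatabaseList          -- hypothetical table holding the requested databases
        WHERE IncludeInRun = 1;        -- hypothetical criteria

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @DatabaseName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- QUOTENAME keeps unusual database names from breaking (or hijacking) the statement
        SET @sql = N'SELECT ' + QUOTENAME(@DatabaseName, '''') + N' AS DatabaseName, COUNT(*) AS MatchingRows'
                 + N' FROM ' + QUOTENAME(@DatabaseName) + N'.dbo.SomeTable'
                 + N' WHERE SomeColumn = @SearchValue;';

        EXEC sys.sp_executesql @sql, N'@SearchValue int', @SearchValue = 42;

        FETCH NEXT FROM db_cursor INTO @DatabaseName;
    END;

    CLOSE db_cursor;
    DEALLOCATE db_cursor;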

My second piece of code is more of a technique or design pattern. In a stored procedure or large script with dynamic SQL, I’ll often create two variables (they’re parameters, in the case of stored procedures - choose a sensible default!) called @Debug and @ImSure. They’re just bit types, but I use them to control the output of debugging information and whether the code actually executes.

By doing this, I don’t have to comment and uncomment sections of code all the time just to see what dynamic SQL I’m generating. I also have a failsafe that prevents changes from being executed until I’ve made sure everything is solid.
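
Stripped down, the pattern looks like this; the procedure name and the UPDATE are made-up examples, but the two parameters and the two IF checks are the whole trick:

    CREATE OR ALTER PROCEDURE dbo.FixSomeData    -- hypothetical procedure
        @DatabaseName sysname,
        @Debug        bit = 1,    -- default to showing the SQL
        @ImSure       bit = 0     -- nothing executes until this is explicitly 1
    AS
    BEGIN
        DECLARE @sql nvarchar(max);

        SET @sql = N'UPDATE ' + QUOTENAME(@DatabaseName) + N'.dbo.SomeTable'
                 + N' SET SomeColumn = 1 WHERE SomeOtherColumn = 0;';

        IF @Debug = 1
            PRINT @sql;                      -- show me exactly what would run

        IF @ImSure = 1
            EXEC sys.sp_executesql @sql;     -- only make the change once I say so
        ELSE
            PRINT 'Nothing executed - set @ImSure = 1 to run the change.';
    END;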

Those are probably the two pieces of code I can share that I couldn’t live without. Honorable mentions that I didn’t write but find indispensable:

  • QUOTENAME - pretty basic T-SQL function, but with all the dynamic SQL I’m writing, I need it to keep my SQL clean and safe (there’s a quick example after this list).
  • dbatools - I’ve written about this PowerShell module quite a bit here but suffice to say for doing bulk administrative tasks, collecting metadata about the environment for analysis, and moving databases or entire instances around, it’s a lifesaver.
  • Brent Ozar Unlimited’s First Responder Kit - I run sp_blitzcache & sp_blitzindex daily looking for places we can tweak our code or indexes to make things run better.
  • Adam Machanic’s sp_whoisactive - Gives me a great lightweight snapshot of what’s going on right now.
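
For the QUOTENAME point above, a quick illustration with a made-up (and awkwardly named) database: bracket-delimiting the name for object references, or single-quoting it for string literals, keeps the generated SQL intact.

    DECLARE @DatabaseName sysname = N'Annual Sales [2018]';   -- hypothetical name

    SELECT QUOTENAME(@DatabaseName)       AS BracketDelimited,   -- [Annual Sales [2018]]]
           QUOTENAME(@DatabaseName, '''') AS SingleQuoted;       -- 'Annual Sales [2018]'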

Thanks for joining me on this T-SQL Tuesday!