Tidier PowerShell Scripts with Default Parameter Values

I was recently working on a PowerShell script to set up some new databases for my users and found myself writing the same things over and over again.

Invoke-DbaQuery -SqlInstance MyServer -Database $SrcDB -Query "select field1 from table...";
Invoke-DbaQuery -SqlInstance MyServer -Database $DstDB -Query "update table...";
Invoke-DbaQuery -SqlInstance MyServer -Database $DstDB -Query "insert into table...";

By the 4th Invoke-DbaQuery, I found myself thinking “this repetitive typing kind of sucks.” Then I remembered Chrissy LeMaire’s segment in the first PSPowerHour where she talked about default values, and her accompanying dbatools blog post. Most of the blog posts and demos of this feature focus on using it from the command line, so I had overlooked the fact that I could use it from within a script as well, and even change the values when looping.

As it turns out, it works inside scripts and functions as well, and it can make them a lot easier to read. And you’re not limited to the default parameters every time you call a given function; you can override the defaults by specifying the parameters explicitly when you call it.

$PSDefaultParameterValues is a hashtable, and the name/value pairs take the format of 'FunctionName:ParameterName' = ParameterValue.

Example
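Here’s a minimal sketch of the pattern; the instance name and queries are placeholders, not my real ones:

$PSDefaultParameterValues = @{
    'Invoke-DbaQuery:SqlInstance' = 'MyServer'
    'Invoke-DbaQuery:Database'    = $DstDB
}

# Step 1: both defaults apply, so neither -SqlInstance nor -Database is needed
Invoke-DbaQuery -Query "update table...";

# Step 2: override just the Database default to query the source database
Invoke-DbaQuery -Database $SrcDB -Query "select field1 from table...";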

I set the destination database as the default for Invoke-DbaQuery because there will be a number of additional queries I need to execute against that database. In Step 1, I’m using the parameter defaults for both Database and SqlInstance, so I don’t need to specify them when calling the function. In Step 2, I need to override only Database, so I’m just specifying that one parameter.

Word of warning: Because $PSDefaultParameterValues is a hashtable, you might want to “build up” defaults over the course of a function, script, or session. In that case, get in the habit of appending to the hashtable so that you don’t overwrite the defaults you’ve previously set.
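In practice, that means assigning keys individually instead of assigning a whole new hashtable, which would wipe out anything set earlier:

$PSDefaultParameterValues['Invoke-DbaQuery:SqlInstance'] = 'MyServer'
# later on - this adds a default without disturbing the ones already set
$PSDefaultParameterValues['Invoke-DbaQuery:Database'] = $DstDB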

If you’re already splatting parameters, default parameter values are the next step in tidying up your scripts and making them easier to read & manage.

A Monumental Migration to SQL Server 2016 – Part 1

A bit over a year ago, I blogged about my experience migrating a test SQL Server instance from a VM to a physical machine with a little help from my friends. That migration went well and the instance has been running trouble-free ever since, but it’s small potatoes: a modest instance, only about 5% the size of production. With SQL Server 2008 R2’s end of life looming, it was time to migrate production to SQL Server 2016.

Continue reading “A Monumental Migration to SQL Server 2016 – Part 1”

Thread-safe PowerShell Logging with PSFramework

In my previous post, I mentioned that I wasn’t checking the status of my RSJobs because I had logging built into my function. I originally tried to log to plain text files with Out-File, but quickly hit a wall. In a multi-threaded script, attempting to write to the same text file from multiple threads results in collisions and waits at best, and outright failures at worst. Perplexed, I turned to the PowerShellHelp channel in the SQL Community Slack and asked for suggestions for thread-safe logging to text files.

Continue reading “Thread-safe PowerShell Logging with PSFramework”

PowerShell Multithreading with PoshRSJob

Intro

PowerShell has had a native method for spawning multiple “threads” for as long as I can remember, in the form of the *-Job functions. They work OK, but there are a couple of downsides:

  • Each job runs in its own PowerShell process, so it takes a non-trivial amount of time and memory to spin each one up
  • There’s no built-in method for throttling the number of concurrent jobs

This combination can become an ugly mess if something spins out of control and you spawn dozens or hundreds of jobs; PowerShell jobs are better suited to small-scale asynchronous background processing. You can wrap the functions to limit the number of concurrent jobs but, again, there’s a lot of overhead involved in creating and tearing down jobs. Warren Frame created Invoke-Parallel, which uses runspaces (lighter-weight than jobs) and allows for throttling, but it isn’t quite as full-featured as jobs are.

Enter PoshRSJob

Boe Prox comes to the rescue with his PoshRSJob module. It provides runspace-based counterparts to PowerShell’s native job functions (Get, Receive, Remove, Start, Stop, and Wait), plus additional functionality like importing modules into each runspace (no Import-Module needed inside the scriptblock) and throttling the number of concurrently running jobs.

Code
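A minimal sketch of the script, with the output directory, batch name, and delay length standing in for what I actually used:

Import-Module PoshRSJob

$OutDir = Join-Path $HOME 'PoshRSJobDemo'    # stand-in for my real output directory
New-Item -ItemType Directory -Path $OutDir -Force | Out-Null

1..24 | Start-RSJob -Name { "Download$_" } -Batch 'PoshRSJobDemo' -Throttle 3 -ArgumentList $OutDir -ScriptBlock {
    param($Counter, $OutputDir)
    # The piped number binds to $Counter; -ArgumentList fills $OutputDir.
    # Fetch the module from GitHub and write out an individually-named file.
    Invoke-WebRequest -Uri 'https://github.com/proxb/PoshRSJob/archive/master.zip' -OutFile (Join-Path $OutputDir "PoshRSJob-$Counter.zip")
    # Artificial delay: time for screenshots, and don't flood GitHub
    Start-Sleep -Seconds 15
}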

Explanation

The above is a fairly trivial script, but it serves the purpose of demonstrating multi-threading with this module.

  1. In a loop, I’m creating two dozen RSJobs to fetch the PoshRSJob module from GitHub and write it out to a directory on my computer.
  2. I’ve set a throttle limit of 3; while all 24 jobs will be created, only three will run at any given time and the remainder will be in a NotStarted state, patiently waiting their turn.
  3. I’m passing the value of the loop counter and my output directory into the job’s scriptblock so that I can write out 24 individually-named files.
  4. So that the RSJob system can keep track of the jobs properly, I’m lumping them into a named Batch.
  5. Because the module’s zip file is so small, I introduced a bit of a delay into each job. Why?
    1. To give me time to capture screenshots while the jobs are in various states.
    2. To prevent tripping GitHub’s anti-flooding protections.

So let’s fire this up!

Jobs created, but only three running

I’ve started up the script and it’s created all the RSJobs, but only the first three are running because that’s where I set my throttle limit. Let’s check in on the progress with Get-RSJob – all the same concepts you’re used to with regular PowerShell jobs (receiving, removing, checking for errors, etc.) work with RSJobs too!
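For example, using the batch name from the sketch above:

Get-RSJob -Batch 'PoshRSJobDemo' | Select-Object Id, Name, State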

And we’re done!

You still need to clean up after yourself! Don’t forget to Receive-RSJob and then Remove-RSJob. I have something special for that.

Bonus Round!

Let’s say you’ve got a lot of RSJobs. Or maybe long-running RSJobs. Wouldn’t a nice, user-friendly view of the progress be handy? You bet it would!

This is both dense and verbose, but I wrote it really quickly and it works (and not just for this post; I’ve used it in production), so I’m not tinkering with it too much right now. Once the jobs are created (and a few have started), this script runs a loop watching the output of Get-RSJob and updates a progress bar to reflect where things stand. It also takes care of receiving and removing the completed jobs.
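Here’s a condensed sketch of the idea, reusing the batch name from the earlier example:

# Watch the batch, receiving & removing jobs as they complete and updating a progress bar
$Batch = 'PoshRSJobDemo'
$Total = @(Get-RSJob -Batch $Batch).Count
$Finished = 0
while ($Finished -lt $Total) {
    $Done = @(Get-RSJob -Batch $Batch -State Completed)
    if ($Done.Count -gt 0) {
        $Done | Receive-RSJob
        $Done | Remove-RSJob
        $Finished += $Done.Count
    }
    Write-Progress -Activity 'Running RSJobs' -Status "$Finished of $Total complete" -PercentComplete (100 * $Finished / $Total)
    Start-Sleep -Seconds 1
}
Write-Progress -Activity 'Running RSJobs' -Completed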

It’s true that I’m not checking for any errors in my job output. In the production process that I multithreaded, the Scriptblock inside each thread does its own error logging to a file.

Double Bonus

Did you notice the paths I’m writing to and the terminal theme? Yeah, this works on PowerShell Core too. Everything you see here ran on macOS as I was writing this post.

Conclusion

For light to moderate use, PoshRSJob is a really handy way to multi-thread processes that aren’t natively multi-threaded in PowerShell. It doesn’t seem to scale well to hundreds of threads, though. There is an open issue on GitHub that seems to be related, but I’ll discuss my own experience with it in a near-future post.

Copying Individual Tables with dbatools

@SQLMonkeyNYC asked on Twitter this morning about copying individual tables from one database to another.

Pat Phelan replied, suggesting that dbatools can do it, but after thinking on it for a bit and poking at a few functions, I realized that it’s not possible with a single function (yet). No worries. We can do it in three lines for now.
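Something like this, where the instance, database, and table names are placeholders and both databases live on the same instance (I’m clearing the batch separator so the generated script runs as a single query):

$Table = Get-DbaDbTable -SqlInstance MyServer -Database SourceDB -Table dbo.MyTable;
Invoke-DbaQuery -SqlInstance MyServer -Database DestDB -Query ($Table | Export-DbaScript -Passthru -BatchSeparator '' | Out-String);
Copy-DbaDbTableData -SqlInstance MyServer -Database SourceDB -DestinationDatabase DestDB -Table dbo.MyTable;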

Explanation

  1. Grab the table from the source database and export the create script. I had to use -PassThru because otherwise, Export-DbaScript will create a file. Not the worst thing, but writing the file and then reading it back immediately afterwards is a bit messy. Although now that I write that out, I suppose I could capture the filename that’s output and pass that to -QueryFile in the next step. Gotta love PowerShell – there are always a few ways to do things.
  2. Take the CREATE TABLE script that we just exported from the source database and run it against the destination.
  3. Run Copy-DbaDbTableData to bring the data over from the source to the destination.

This is a really quick & dirty example where I’m copying between two databases on the same instance, but it should give you the idea. You can copy across instances and copy multiple tables as well.

Update

Cláudio Silva (blog | twitter) has written a post that expands on the above code to include scripting out other objects related to the table being copied. The code above overlooks keys, constraints and more. Check it out!

PSPowerHour v1.0 Wrap-Up

The first edition of the PSPowerHour is in the books and it looks like it was a big success. This one was dbatools-heavy but I chalk that up to the dbatools community having lots of free time because we’ve automated so many of our tasks 🙂

Overall Impressions

I signed in about half an hour ahead of the webcast and was the first one there. Shortly thereafter, I was joined by Michael Lombardi (t), then Jess Pomfret (b|t) and Chrissy LeMaire (b|t). After ironing out a few glitches, we got everyone in the right place and kicked off the broadcast. Everything ran very smoothly, especially considering the number of people involved – Michael and Warren F. (b|t) did a terrific job of orchestrating everything.

While watching and listening to Chrissy, Doug, Andrew & Jess give their demos, I ran through my own in my head a couple of times, adding and rearranging a few things as I observed how they were doing theirs. The big dilemma for me was whether to run the camera or exclusively screen share (I ended up going with the screen share only). Having not rehearsed my demo enough in the weeks leading up to the event, I was still not sure where to dip into more detail and where to dial things back, and seeing what others were doing helped quite a bit. Having familiar faces & voices ahead of me in the queue put my nerves to rest.

I wasn’t able to watch the sessions after mine in their entirety due to family commitments. Joshua’s BurntToast module looks like it’ll be fun to experiment with and should add some nice functionality to scripts (I got to see about half of his demo), and I’m really looking forward to catching a replay of Daniel’s demo of PowerShell on the Raspberry Pi – I didn’t realize that it had been ported already!

My Demo

I demoed Invoke-DbaSqlQuery and why one should use it over Invoke-SqlCmd – primarily for protection from SQL injection. Things didn’t go exactly the way I’d practiced; I ran short of time despite feeling like I rushed things and cut back on some of what I had planned to say. The latter was in part because of the lead-ins from Chrissy, Andrew, and Jess: they did such a good job introducing dbatools that I was able to skip over it. But I was able to throw in a teaser for Matt Cushing’s (b|t) demo at the next PSPowerHour.

Running the demos inside a VM and screen-sharing just that VM made things easier for me as opposed to flipping between apps. My scripts will be available on GitHub along with the other presenters’ once the pull request is approved.

I achieved my goals:

  1. I did it
  2. I successfully demonstrated a SQL injection problem and explained why it’s so bad
  3. I demonstrated how to make database queries from PowerShell both more reliable and safer
  4. I learned about some new stuff that I desperately want to experiment with

Next time around, I definitely need to rehearse more and get my timing down better, but overall, I’m happy.

Check it out!

Speaking: PowerHour, August 21st 2018

It’s official! I will be speaking at the inaugural PowerHour online lightning demo event on Tuesday, August 21st at 2200 UTC. I’ll be demoing Better, Safer SQL Queries from PowerShell.

If you’re working with SQL Server from PowerShell – whether as a DBA, an analyst, or anyone else running queries – you’ve probably used Invoke-SqlCmd. But depending on how you’re building your queries, this can be error-prone or even a huge security exposure! With the help of the dbatools module, I’ll show you how to write and run these queries better and safer – and make them easier to work into your scripts to boot.
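To give away a bit of the ending: the key is dbatools’ support for parameterized queries. A hypothetical example (the query and parameter names here are invented for illustration):

# User input is bound as a parameter instead of being concatenated into the SQL string
Invoke-DbaQuery -SqlInstance MyServer -Database MyDB -Query 'select * from dbo.Users where LastName = @LastName' -SqlParameters @{ LastName = $SearchName };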

I’m excited to be a part of this – it’s been far too long since I’ve done a presentation. Please join us on the YouTube channel/stream next Tuesday!

PowerHour – Online PowerShell Lightning Talks!

Earlier this week, the PowerHour was announced. What is it? It’s kind of like a virtual user group: one hour, 6(ish) lightning demos (10 minutes or less each), centered on PowerShell. It’s all community-sourced and driven – anyone can submit a proposal for a demo and, if accepted, you’ll be slotted into an available spot.

They’ve already set up a YouTube Channel so you can either watch live or catch up later on, and the whole deal is being organized and managed through GitHub. Got something you want to show off? Log an issue using the template!

I’ve submitted my first (but hopefully not last) proposal. Yep, it involves dbatools because that’s my jam.

It’ll be fun for speakers and attendees alike! You can even reuse your demo(s) for user group meetings or SQL Saturdays – anywhere lightning talk/demo spots are available. Several SQL Server community folks have tossed proposals in, and with so many DBAs getting hooked on PowerShell, it’s a great way for these two communities to come together.

Quick Table Info Export with PowerShell

This week I had a user come to me asking how the fields were defined on a few tables he was using to write some reports. Long story short, he’s been tasked with writing new reports and updating existing ones, but he doesn’t have visibility into the database itself, so he’s left with the “OK, let’s try this” approach and then reading error messages to debug when things go sideways. Very tedious.

I asked him for a list of the tables he was most interested in, and while he worked on that, I set to work on (of course) a quick & dirty PowerShell script to collect the data he needed: field names, types, and whether they’re nullable.

With the help of dbatools and Doug Finke’s ImportExcel module, the resulting script to retrieve the requisite data and export it to an Excel workbook, one worksheet per table, is very short. On each worksheet, the header row is bold and frozen so it stays in view while scrolling.
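The whole thing boils down to something like this (the file, instance, database, and workbook names are placeholders):

# Fetch the list of tables from the file provided to me
$TableNames = Get-Content -Path .\tables.txt;
# Gather a collection of SMO objects for the tables from the server
$Tables = Get-DbaDbTable -SqlInstance MyServer -Database MyDatabase -Table $TableNames;
foreach ($Table in $Tables) {
    # Extract each field's name, type, length & nullability, then write it
    # out to a worksheet named for the table
    $Table.Columns |
        Select-Object Name,
            @{ Name = 'DataType'; Expression = { $_.DataType.Name } },
            @{ Name = 'Length'; Expression = { $_.Properties['Length'].Value } },
            Nullable |
        Export-Excel -Path .\TableInfo.xlsx -WorksheetName $Table.Name -BoldTopRow -FreezeTopRow -AutoSize;
}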

Line by line:

  • Fetch the list of tables from a file provided to me
  • Gather a collection of SMO objects for the tables from the server
  • Loop over each table, extracting the relevant data about each field
  • Write the table’s information out to a worksheet named for the table in an Excel workbook

The only tricky part was fetching the field lengths for the nvarchar fields. I had to go spelunking into each field’s Properties property to get that. The result:

Export-TableInfo Results

But I can tell this isn’t disco enough for you, so I’m gonna put it right here.
– Justin Hammer, Iron Man 2

You can do this in a one-liner too. It’s not as pretty, but it’s quite doable and the resulting Excel file is identical.
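Something along these lines, with the same placeholder names as above:

Get-DbaDbTable -SqlInstance MyServer -Database MyDatabase -Table (Get-Content -Path .\tables.txt) | ForEach-Object { $_.Columns | Select-Object Name, @{ Name = 'DataType'; Expression = { $_.DataType.Name } }, @{ Name = 'Length'; Expression = { $_.Properties['Length'].Value } }, Nullable | Export-Excel -Path .\TableInfo.xlsx -WorksheetName $_.Name -BoldTopRow -FreezeTopRow -AutoSize }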

End to end, I turned this around in about 20 minutes, and the requestor’s life got easier in a big hurry. Best of all, when he comes back with another batch of tables for his next set of reports, I can turn the request around in less than five minutes.