SQL New Blogger Challenge Digest – Week 4

This week marks the end of Ed Leighton-Dick’s New Blogger Challenge. It’s terrific seeing everyone sticking with the challenge all month and I’m looking forward to catching up with all the posts. Great job, everyone! Keep going!

@MtnDBA: #SQLNewBlogger Week 4 – My 1st SQLSaturday session | DBA With Altitude
@Lance_LT: “MongoDB is the WORST!” | Lance Tidwell the Silent DBA
@ceedubvee: A Insider’s View of the Autism Spectrum: Autism and Information Technology: Big Data for Diagnosis
@Jorriss: A Podcast Is Born
@toddkleinhans: A Tale of SQL Server Disk Space Trials and Tribulations | toddkleinhans.com
@arrowdrive: Anders On SQL: First “real” job with SQL.
@arrowdrive: Anders On SQL: Stupid Stuff I have done. 2/?. Sometimes even a dev server is not a good dev environment
@way0utwest: April Blogger Challenge 4 – Filtered Index Limitations | Voice of the DBA
@ALevyInROC: Are You Backing Everything Up? | The Rest is Just Code
@DesertIsleSQL: Azure Data Lake: Why you might want one |
@EdDebug: BIML is better even for simple packages | the.agilesql.club
@tpet1433: Corruption – The Denmark of SQL Instances – Tim Peters
@eleightondick: Creating a Self-Contained Multi-Subnet Test Environment, Part II – Adding a Domain Controller | The Data Files
@MattBatalon: Creating an Azure SQL Database | Matt Batalon
@pshore73: Database on the Move – Part I | Shore SQL
@pmpjr: Do you wanna build a cluster?! | I have no idea what I’m doing
@DwainCSQL: Excel in T-SQL Part 1 – HARMEAN, GEOMEAN and FREQUENCY | dwaincsql
@AalamRangi: Gotcha – SSIS ImportExport Wizard Can Kill Your Diagrams | SQL Erudition
@toddkleinhans: How Do Blind People Use SQL Server? | toddkleinhans.com
@DBAFromTheCold: In-Memory OLTP: Part 4 – Native Compilation | The DBA Who Came In From The Cold
@AaronBertrand: It’s a Harsh Reality – Listen Up – SQL Sentry Team Blog
@GuruArthur: Looking back at April – Arthur Baan
@nocentino: Moving SQL Server data between filegroups – Part 2 – The implementation – Centino Systems Blog
@MyHumbleSQLTips: My Humble SQL Tips: Tracking Query Plan Changes
@m82labs: Reduce SQL Agent Job Overlaps · m82labs
@fade2blackuk: Rob Sewell on Twitter: “Instances and Ports with PowerShell http://t.co/kwN2KwVDOS”
@DwainCSQL: Ruminations on Writing Great T-SQL | dwaincsql
@sqlsanctum: Security of PWDCOMPARE and SQL Hashing | SQL Sanctum
@Pittfurg: SQL Server Backup and Restores with PowerShell Part 1: Setting up – Port 1433
@cjsommer: Using PowerShell to Export SQL Data to CSV. How well does it perform? | cjsommer.com
@gorandalf: Using SSIS Lookup Transformation in ETL Packages | Gorandalf’s SQL Blog
@nicharsh: Words on Words: 5 Books That Will Improve Your Writing

Are You Backing Everything Up?

We hear the common refrain among DBAs all the time. Back up your data! Test your restores! If you can’t restore the backup, it’s worthless. And yes, absolutely, you have to back up your databases – your job, and the company, depend upon it.

But are you backing everything up?

Saturday night was an ordinary night. It was getting late, and I was about to put my computer to sleep so I could do likewise. Suddenly, everything on my screen was replaced with a very nice message telling me that something had gone wrong and my computer needed to be restarted.

Uh oh.

In 7 1/2 years of using OS X, I’ve had something like this happen maybe 4 times.

After waiting what felt like an eternity, the system finished booting and I got back into my applications. I opened PowerPoint, which I’d had open before the crash so I could work on my SQL Saturday Rochester slide deck whenever inspiration struck. I opened my file and was greeted by nothingness. I flipped over to Finder and saw zero bytes displayed as the file size.

Uh oh.

“But Andy,” you say, “you use CrashPlan, right? Can’t you just recover the file from there?” Well, you’re half right. I do use CrashPlan. I even have a local, external hard drive (two, actually) that I back up to in addition to CrashPlan’s cloud service. But I couldn’t recover from any of those.

CrashPlan configuration - oops

Because Dropbox is already “in the cloud”, I had opted not to back it up with CrashPlan when I first set everything up. After all, it’s already a backup, right? It’s not my only copy, it’s offsite, it’s all good.

Not so fast. When my system came back up, Dropbox dutifully synced everything that had changed – including my now-empty file.

Dropbox - 0 bytes

So, now what? Fortunately, Dropbox allows you to revert to older versions, and I was able to select my last good version and restore it.


Lessons Learned

I broke The Computer Backup Rule of Three and very nearly regretted it. For my presentation:

  • I had copies in two different formats – Dropbox & my local (internal) hard drive
  • I had one copy offsite (Dropbox)
  • I only had two copies, not three (local and Dropbox).

Even scarier, if Dropbox didn’t have a version history or it had taken me more than 30 days to realize that this file had been truncated, I’d have lost it completely.

Everything else on my computer was in compliance with the Rule Of Three; I just got lazy with the data in my Dropbox and Google Drive folders. I’ve since updated my CrashPlan settings to include my local Dropbox and Google Drive folders so that my presentation should now be fully protected:

  • Five copies
    • Local drive
    • Two external drives w/ CrashPlan
    • CrashPlan cloud service
    • Dropbox/Google Drive (different content in each)
  • Three formats
    • Spinning platters in my possession
    • Dropbox/Google Drive
    • Crashplan
  • Two copies offsite
    • CrashPlan cloud
    • Dropbox/Google Drive

And don’t forget to test those backups before you need to use them. Dropbox, Google Drive and other online file storage/sync solutions are very useful, but you cannot rely upon them as backups. I don’t think you’ll ever regret having “extra” backups of your data, as long as that process is automatic.

SQL New Blogger Digest – Week 3

Here are the posts collected from week three of the SQL New Blogger Challenge. It’s been compiled the same way previous weeks’ posts were. Everyone’s doing a great job keeping up with the challenge!

@MtnDBA: #SQLNewBlogger Week 3 – PowerShell Aliases | DBA With Altitude
@ceedubvee: A Insider’s View of the Autism Spectrum: Autism and Information Technology: New Efforts for Kids to Code
@arrowdrive: Anders On SQL: Stupid Stuff I have done. 2/?. Sometimes even a dev server is not a good dev environment
@way0utwest: April Blogger Challenge 3 – Filtered Indexes | Voice of the DBA
@eleightondick: Creating a Self-Contained Multi-Subnet Test Environment, Part I – Networking | The Data Files
@ceedubvee: Empower Individuals With Autism Through Coding | Indiegogo
@MattBatalon: EXCEPT and INTERSECT… | Matt Batalon
@cjsommer: Follow the yellow brick what? My road to public speaking. | cjsommer.com
@DBAFromTheCold: In-Memory OLTP: Part 3 – Checkpoints | The DBA Who Came In From The Cold
@MattBatalon: Introduction to Windowing Functions | Matt Batalon
@nocentino: Moving SQL Server data between filegroups – Part 1 – Database Structures – Centino Systems Blog
@Lance_LT: My first year as a speaker | Lance Tidwell the Silent DBA
@MyHumbleSQLTips: My Humble SQL Tips: Tracking Page Splits
@ALevyInROC: Padding Fields for Fixed-Position Data Formats
@tpet1433: Sir-Auto-Completes-A-Lot a.k.a. how to break IntelliSense, SQL Prompt and SQL Complete – Tim Peters
@pmpjr: stats, yeah stats. | I have no idea what I’m doing
@DwainCSQL: Stupid T-SQL Tricks – Part 3: A Zodiacal SQL | dwaincsql
@cathrinew: Table Partitioning in SQL Server – Partition Switching – Cathrine Wilhelmsen
@gorandalf: The MERGE Statement – One Statement for INSERT, UPDATE and DELETE | Gorandalf’s SQL Blog
@SQLJudo: The Road to SQL Server 2014 MCSE | Russ Thomas – SQL Judo
@GGreggB: T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@AalamRangi: What is the RetainSameConnection Property of OLEDB Connection in SSIS? | SQL Erudition
@EdDebug: What Permissions do I need to generate a deploy script with SSDT? | the.agilesql.club
@_KenWilson: Windowing using OFFSET-FETCH | Ken Wilson
@DesertIsleSQL: What Does Analytics Mean?
@DesertIsleSQL: Azure ML, SSIS and the Modern Data Warehouse
@DesertIsleSQL: Musing about Microsoft’s Acquisition of DataZen and Power BI
@GuruArthur: Check for database files not in default location

Padding Fields for Fixed-Position Data Formats

Fixed-position data formats will seemingly be with us forever. Despite the relative ease of parsing CSV (or other delimited formats), or even XML, many data exchanges require a fixed-position input. Characters 1-10 are X, characters 11-15 are Y and if the source data is fewer than 5 characters, we have to left-pad with a filler character, etc. When you’re accustomed to working with data that says what it means and means what it says, having to add “extra fluff” like left-padding your integers with a half-dozen zeroes can be a hassle.

I received a draft of a stored procedure recently which had to do exactly this. The intent is for the procedure to output the data almost entirely formatted as required, one record per line in the output file, and dump the result set to a file on disk. As it was given to me, the procedure was peppered with CASE expressions like this (only more complex) in the SELECT clause:

[code language="sql"]
-- Method 1
select case len(cast(logid as varchar))
when 9 then '0' + cast(logid as varchar)
when 8 then '00' + cast(logid as varchar)
when 7 then '000' + cast(logid as varchar)
when 6 then '0000' + cast(logid as varchar)
when 5 then '00000' + cast(logid as varchar)
when 4 then '000000' + cast(logid as varchar)
when 3 then '0000000' + cast(logid as varchar)
when 2 then '00000000' + cast(logid as varchar)
when 1 then '000000000' + cast(logid as varchar)
when 0 then '0000000000' + cast(logid as varchar)
end as logid
,logtext from cachedb.dbo.logs;
[/code]

It’s perfectly valid, it works, and there’s nothing inherently wrong with it. But I find it a bit tough to read, and it could become trouble if the format changes later, as additional (or fewer) cases will have to be accounted for. Fortunately, the day I received this procedure was right around the day I learned about the REPLICATE() T-SQL function. Maybe we can make this simpler:

[code language="sql"]
select replicate('0',10-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;
[/code]

Not bad. But it leaves us with a magic number and similar to the previous example, if the file format changes we have to seek out these magic numbers and fix them. This is easily remedied by defining these field lengths at the beginning of the procedure, so that they’re all in one place if anything needs to change.

[code language="sql"]
-- Method 2
declare @paddedlength int = 10;
select replicate('0',@paddedlength-len(cast(logid as varchar))) + cast(logid as varchar) as logid,logtext from cachedb.dbo.logs;
[/code]

Yet another approach would be to pad out the value beyond what we need, then trim the resulting string back to the required length. Again, we have to be careful to not leave ourselves with magic numbers; the solution is the same as when using REPLICATE():

[code language="sql"]
-- Method 3
select right('0000000000' + cast(logid as varchar), 10) as logid,logtext from cachedb.dbo.logs;
-- Or, with more flexibility/fewer magic numbers
-- Method 4
declare @paddedlength int = 10;
select right(replicate('0',@paddedlength) + cast(logid as varchar), @paddedlength) as logid,logtext from cachedb.dbo.logs;
[/code]

All four methods yield the same results, as far as the data itself is concerned. But what about performance? For a table with 523,732 records, execution times were:

  1. 2,000ms CPU time, 261,785ms elapsed
  2. 2,265ms CPU time, 294,399ms elapsed
  3. 2,000ms CPU time, 297,593ms elapsed
  4. 2,078ms CPU time, 302,045ms elapsed

Each method had an identical execution plan, so I’m probably going to opt for the code that’s more readable and maintainable – method 2 or 4.
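All of the methods above build the padding through string concatenation. If you’re on SQL Server 2012 or newer, FORMAT() is one more option worth knowing about; this is just a sketch against the same cachedb.dbo.logs table used above, and since FORMAT() goes through the CLR it’s often slower than REPLICATE() and RIGHT(), so benchmark it against your own data before adopting it.

[code language="sql"]
-- Method 5 (SQL Server 2012+): FORMAT() with a custom numeric format string
declare @paddedlength int = 10;
select format(logid, replicate('0', @paddedlength)) as logid
,logtext from cachedb.dbo.logs;
[/code]

The format string is just a string of @paddedlength zero placeholders, so the field width stays defined in one place, as with methods 2 and 4.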

As with any tuning, be sure to test with your own data & queries.

SQL New Blogger Digest – Week 2

I didn’t intend for last week’s digest to also be my post for week two of the challenge, but life got in the way and I wasn’t able to complete the post that I really wanted in time. So, that post will be written much earlier in week three and completed well ahead of the deadline.

Here are the posts collected from week two of the SQL New Blogger Challenge. It’s been compiled the same way last week’s was.

@AaronBertrand: #SQLNewBlogger Roundup – SQL Sentry Team Blog
@MtnDBA: #SQLNewBlogger Week 2 – Teach Something New | DBA With Altitude
@ceedubvee: A Insider’s View of the Autism Spectrum: Autism and Information Technology: Back on the Job Hunt
@DwainCSQL: An Easter SQL | dwaincsql
@DwainCSQL: An Even Faster Method of Calculating the Median on a Partitioned Heap | dwaincsql
@arrowdrive: Anders On SQL: Stupid stuff I have done. 1/? Or, How I learned to stop GUIing and love the script
@MattBatalon: Another TRUNCATE vs. DELETE tidbit… | Matt Batalon
@way0utwest: April Blogging Challenge 2 – Primary Key in CREATE TABLE | Voice of the DBA
@GuruArthur: SQL Server error 17310 – Arthur Baan
@Pittfurg: Blog Series: SQL Server Backup and Restores with PowerShell – Port 1433
@fade2blackuk: Checking SQL Server User Role Membership with PowerShell « SQL DBA with A Beard
@gorandalf: Database Compatibility Level 101 | Gorandalf’s SQL Blog
@nocentino: Designing for offloaded log backups in AlwaysOn Availability Groups – Monitoring – Centino Systems Blog
@SqlrUs: Detaching a Database – File Security Gotcha | John Morehouse | sqlrus.com
@MartynJones76: Devon DBA: Check Database Integrity Task Failed … Oh Dear Master Luke!
@toddkleinhans: How Do You Visualize Abstractions? | toddkleinhans.com
@AalamRangi: How to Have Standard Logging in SSIS and Avoid Traps | SQL Erudition
@gorandalf: How to Test Existing T-SQL Code Before Changing the Compatibility Level | Gorandalf’s SQL Blog
@EdDebug: HOWTO-Get-T-SQL-Into-SSDT | the.agilesql.club
@DBAFromTheCold: In-Memory OLTP: Part 2 – Indexes | The DBA Who Came In From The Cold
@nicharsh: It’s a Harsh Reality – SQL Sentry Team Blog
@SQLBek: Learn Something New – SSMS Tips & Tricks « Every Byte Counts
@cjsommer: Modify SQL Agent Jobs using PowerShell and SMO | cjsommer.com
@MyHumbleSQLTips: My Humble SQL Tips: Full List of SQL Server 2014 DMVs
@MyHumbleSQLTips: My Humble SQL Tips: Running DBCC CHECKDB on TEMPDB
@way0utwest: New Blogger Challenge 1 – Adding a Primary Key | Voice of the DBA
@uMa_Shankar075: Querying Microsoft SQL Server: In Memory Optimized Table in SQL Server 2014
@Jorriss: Random Thoughts of Jorriss
@pmpjr: Sidenote, the 4200 databases are a different story for another week… | I have no idea what I’m doing
@ALevyInROC: SQL New Blogger Challenge Weekly Digest | The Rest is Just Code
@jh_randall: SQL Server Monitoring – Getting it Right – SQL Sentry
@cathrinew: Table Partitioning in SQL Server – The Basics – Cathrine Wilhelmsen
@eleightondick: Teach Something New: PowerShell Providers [T-SQL Tuesday #065] | The Data Files
@rabryst: The Art of Improvisation – Born SQL
@DBAFromTheCold: The DBA Who Came In From The Cold | Advice on working as a SQL Server DBA
@Lance_LT: The estimated query plan and the plan cache (Part 2) | Lance Tidwell the Silent DBA
@SQLJudo: TSQL Tue 65: Memory Optimized Hash Indexes | Russ Thomas – SQL Judo
@sqlsanctum: T-SQL Tuesday #065 – Teach Something New – APPLY | SQL Sanctum
@_KenWilson: T-SQL Tuesday #65: FMT_ONLY Replacements | Ken Wilson
@m82labs: Untangling Dynamic SQL · m82labs
@cathrinew: Using a Numbers Table in SQL Server to insert test data – Cathrine Wilhelmsen
@tpet1433: Why yes I can do table level restores – Tim Peters
@Jorriss: Why You Absolutely Need Alternate Keys: A Unique Constraint Story

SQL New Blogger Challenge Weekly Digest

Watching all of the tweets as people posted their first entries in the SQL New Blogger Challenge earlier this week, I quickly realized that keeping up was going to be a challenge of its own. Fortunately, there are ways to rein it in.

My first stop was IFTTT (If This Then That). IFTTT allows you to create simple “recipes” to watch for specific events/conditions, then perform an action. They have over 175 “channels” to choose from, each of which has one or more triggers (events) and actions. I have IFTTT linked to both my Google and Twitter accounts, which allowed me to create a recipe which watches Twitter for the #sqlnewblogger hashtag, and writes any tweets that match it to a spreadsheet on my Google Drive account (I’ll make the spreadsheet public for now, why not?).

The next step is to export the spreadsheet to CSV. I don’t have this automated, and may not be able to (I may have to find another workaround). Once it’s a CSV, I can go to PowerShell to parse my data. I want the end result to be an HTML table showing each post’s author (with a link to their Twitter stream) and a link to the post (using the title of the post itself).

Once I import the CSV file into an object in my PowerShell script, I need to do some filtering. I don’t want to be collecting all the retweets (posts starting with RT), and I should probably exclude any post that doesn’t contain a URL (looking for the string HTTP).

To extract the post URLs, I ran a regular expression against each tweet. Twitter uses their own URL shortener (of course), which makes this pretty easy – I know the hostname is t.co, and after the slash is an alphanumeric string. The regex to match this is fairly simple: https?://t\.co/[a-zA-Z0-9]+
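Before wiring a pattern like this into the full script, it’s worth a quick sanity check at the console. The sample tweet text below is made up; note that https?:// is the stricter way to spell the protocol match, since square brackets in a regex define a character class rather than an alternation.

[code language="powershell"]
# Hypothetical tweet text, just to confirm the short-URL pattern matches as expected
$sample = 'Loved writing this post! http://t.co/AbCd123 #sqlnewblogger';
if ($sample -match '(https?://t\.co/[a-zA-Z0-9]+)') {
    $Matches[0];   # the captured t.co link
}
[/code]

If the pattern fails here, it will fail the same way against the real tweets from the CSV.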

Then, for each URL found in the tweet, I use Invoke-WebRequest to fetch the page. This cmdlet automatically follows any HTTP redirects (I was afraid I’d have to do this myself), so the object returned is the real destination page. Invoke-WebRequest also returns the parsed HTML of the page (assuming you use the right HTTP method), so I can extract the title easily instead of having to parse the content myself. It’ll also give me the “final” URL (the destination reached after all the redirects). Easy!

My full script:

[code lang="powershell"]
#requires -version 3
[cmdletbinding()]
param ()
set-strictmode -Version latest;
Add-Type -AssemblyName System.Web;
$AllTweets = import-csv -path 'C:\Dropbox\MiscScripts\Sqlnewblogger tweets – Sheet1.csv' | where-object {$_.text -notlike "RT *" -and $_.text -like "*http*"} | select-object -property "Tweet By",Text,Created | Sort-Object -property created -Unique;
$TweetLinks = @();
foreach ($Tweet in $AllTweets) {
    $Tweet.text -match '(https?://t\.co/[a-zA-Z0-9]+)' | out-null;
    foreach ($URL in $Matches) {
        $MyURL = $URL.Item(0);
        # Invoke-WebRequest automatically follows HTTP redirects. We can override this with -MaxRedirection 0 but in this case, we want it!
        $URLCheck = Invoke-WebRequest -Method Get -Uri $MyUrl;
        $OrigUrl = $URLCheck.BaseResponse.ResponseUri;
        write-debug $Tweet.'Tweet By';
        Write-debug $URLCheck.ParsedHtml.title;
        write-debug $URLCheck.BaseResponse.ResponseUri;
        $TweetLinks += new-object -TypeName PSObject -Property @{"Author"=$Tweet.'Tweet By';"Title"=$URLCheck.ParsedHtml.title;"URL"=$URLCheck.BaseResponse.ResponseUri;};
    }
}
Write-debug $TweetLinks;
$TableOutput = "<table><thead><tr><td>Author</td><td>Post</td></tr></thead><tbody>";
foreach ($TweetLink in $TweetLinks) {
    $TableOutput += "<tr><td><a href=""https://twitter.com/$($TweetLink.Author.replace('@',''))"">$($TweetLink.Author)</a></td><td><a href=""$($TweetLink.URL)"">$([System.Web.HttpUtility]::HtmlEncode($TweetLink.Title))</a></td></tr>";
}
$TableOutput += "</tbody></table>";
$TableOutput;
[/code]

And now, my digest of the first week of the SQL New Blogger Challenge. This is not a complete listing because I didn’t think to set up the IFTTT recipe until after things started. I also gave people the benefit of the doubt on the timing (accounting for timezones, etc.) and included a few posted in the early hours of April 8th. For week 2, it will be more complete.

@eleightondick: Kevin Kline on Twitter: “Advice to New Bloggers http://t.co/o1jfLOR4QI”
@BarbiducH: Safe exit from WHILE loop using ##global temp tables | One developer’s SQL blog
@eleightondick: Mike Donnelly on Twitter: “T-SQL Tuesday #065 – Teach Something New http://t.co/LoyFbhVOpw #tsql2sday”
@GuruArthur: Trace flags in SQL Server – Arthur Baan
@cjsommer: Blogging and Intellectual Property Law | legalzoom.com
@ceedubvee: A Insider’s View of the Autism Spectrum: Autism and Information Technology: Answering a Blog Challenge (Plus, Why I Like Data)
@arrowdrive: Anders On SQL: A bit about me continued. Anders meets SQL
@SQLJudo: Experience Is Overated | Russ Thomas – SQL Judo
@SQLBek: T-SQL Tuesday #065 – Teach Something New | Mike Donnelly, SQLMD
@MtnDBA: #SQLNewBlogger Week 1 “Eye of the Tiger” | DBA With Altitude
@Lance_LT: The estimated query plan and the plan cache (Part 1) | Lance Tidwell the Silent DBA
@AalamRangi: How to Use Temp Table in SSIS | SQL Erudition
@DwainCSQL: An Easter SQL

Limitations

There are a couple limitations and gotchas with this process:

  • The IFTTT recipe only runs every 15 minutes (all IFTTT triggers run on 15 minute intervals) and only fetches 15 tweets each time it runs (again, IFTTT’s configuration). So if there’s a flood of tweets, they won’t all be captured.
  • I don’t really check the final destination of a link. For example, one of the first tweets captured contained a link to another tweet, which then linked to a blog post. Could I detect this & resolve the true final destination? Probably. But it’d involve a lot more logic, and it’s getting late.
  • I also don’t pick up every duplicate link/post. Again, I can probably get by this with some extra logic, but I don’t think it’s necessary right now.
  • It doesn’t post automatically to my blog, or anywhere else. I have to manually paste the HTML into my blog post(s).
  • I had to manually remove one link as it didn’t actually point to a post written for the challenge; it was tweeted with the hashtag, so my IFTTT recipe caught it.
  • I started collecting these tweets mid-day April 7th. If you posted before that, I’ve likely missed your post. You will be picked up for Week Two!

Connecting SQLite to SQL Server with PowerShell

This post is part of Ed Leighton-Dick’s SQL New Blogger Challenge. Please follow and support these new (or reborn) bloggers.

I’m working with a number of SQLite databases as extra data sources in addition to the SQL Server database I’m primarily using for a project. Brian Davis (b|t) wrote a blog post a few years ago that covers setting up the connection quite well. In my case, I’ve got nine SQLite databases to connect to, and that gets tedious. PowerShell to the rescue!

Continue reading “Connecting SQLite to SQL Server with PowerShell”

Rochester SQL Server User Group February Meeting – Slides & Demos

On Thursday, February 26th I presented “Easing Into Windows PowerShell” to a packed house at the Rochester SQL Server User Group meeting. Thanks to Matt Slocum (b | t) for being my semi-official photographer.

Me, presenting!
Presenting Easing Into Windows PowerShell at the Rochester SQL Server User Group February 26, 2015

We set a chapter attendance record! I had a lot of fun presenting this (my first time speaking outside my company) and we had some great conversations during and after the meeting.

I’ve posted my slides & demos for your enjoyment.

Rochester PASS Chapter February Meeting – I’m Speaking!

On Thursday, February 26th at 6:00 PM EST I will be speaking at the Rochester PASS chapter meeting. The topic is “Easing Into PowerShell – What’s It All About?”.

You’ve been hearing a lot about Windows PowerShell, but you’re wondering if it’s something you should be looking into. In this introductory session, we’ll talk about what PowerShell is, where it came from, how it works, and what it can do for you. Whether you’re a junior DBA or a seasoned veteran, you’ll find something that PowerShell can help you do more easily.

If you’re planning to attend, please let us know by RSVPing at Nextplex. Slides will be posted here the following day.

T-SQL Tuesday #61 – Giving Back

Wayne Sheffield (b|t) is hosting this month’s T-SQL Tuesday and his topic is Giving Back to the SQL Community. More specifically, he’s asking how each of us is planning on giving something back to the SQL Community in 2015. He offers up a few suggestions, so I’ll start by addressing those and then move on to additional ideas.

  • Are you going to start speaking at your local user group?
    Yes, I expect that by the end of 2015 I will have spoken to our local chapter at least once. I spoke to various groups at work in 2014 and plan to continue doing so in 2015 as well.
  • Perhaps step up and help run your local user group?
    I was named the Vice President of our local chapter a couple months ago, and I will continue in that capacity.
  • Do you want to start becoming an active blogger – or increase your blogging?
    Yes! At the time of this writing I’ve only published 7 posts here, and I have 6 others in various stages of preparation. I have some ideas brewing, I just need to get things written and then actually press that Publish button. Part of it is fear/insecurity, and I need to get out of my comfort zone a little and Just Do It.
  • Do you plan on volunteering your time with larger organizations (such as PASS), so that SQL Training can occur at a larger level?
    If I have the opportunity to attend PASS Summit in 2015, I will volunteer at the event. When the call for pre-event volunteers goes out, I’ll look at what’s needed and try to step a little out of my comfort zone and do something there as well.
  • Other ways of contributing
    • For the 3rd year, I will be helping to organize and run SQL Saturday Rochester in 2015. If you’re reading this, you probably know about SQL Saturday, and have probably even been to one. Next time, bring a friend!
    • I’ve been promoting PASS and our local chapter for a while at work and will be a more vocal in 2015. There are a lot of people with knowledge and experience they can share who aren’t even aware that PASS and the local and virtual user groups exist. I want to help bring those people into the community.