Using PowerShell to Dump an MSSQL Schema

The need

I recently needed to place several Microsoft SQL Server databases under Git version control. My requirements were:

  1. It needed to go inside a subdirectory so manually maintained items could also be added alongside the dumps.
  2. It needed to export tables, stored procedures, triggers, users, and more if possible.
  3. It needed to support multiple users dumping to a local directory, each managing their own commits. Our team sits in a larger environment where other teams make changes, and those teams will not be using version control.
  4. It needed to be convenient.

After a lot of searching and trying out various SSMS plugins, I settled on writing a set of PowerShell scripts that can back up individual databases, entire servers, or everything at once.

The function

I began with the script written by Scripting Guy Ed Wilson, with some reference to this Gist by Cheyne Wallace, and ended up with this function:

#script_db_objects.ps1
function global:Script-DBObjectsIntoFolders([string]$server, [string]$dbname){

    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null

    "Connecting to server: " + $($server) + ", database: " + $($dbname)
    $SMOserver = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -argumentlist $server
    $db = $SMOserver.databases[$dbname]

    $Objects = $db.Tables
    $Objects += $db.Triggers
    $Objects += $db.Views
    $Objects += $db.StoredProcedures
    $Objects += $db.UserDefinedFunctions
    $Objects += $db.Roles
    $Objects += $db.Users
    #$Objects += $db.Schemas # has not worked so far

    #Build this portion of the directory structure out here in case scripting takes more than one minute.
    $SavePath = "..\" + $($server) + "\" + $($dbname)
    if (-not (Test-Path -Path $SavePath)) {
        new-item -type directory -path "$SavePath" | out-null
    }
    $SavePath = Resolve-Path $($SavePath)
    "Saving to: " + $SavePath

    foreach ($ScriptThis in $Objects | where {!($_.IsSystemObject)}) {

        # Each object gets its own Scripter: this one writes the CREATE statements
        $scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
        $scriptr.Options.AppendToFile = $True
        $scriptr.Options.AllowSystemObjects = $False
        $scriptr.Options.ClusteredIndexes = $True
        $scriptr.Options.DriAll = $True
        $scriptr.Options.ScriptDrops = $False
        $scriptr.Options.IncludeHeaders = $False
        $scriptr.Options.ToFileOnly = $True
        $scriptr.Options.Indexes = $True
        $scriptr.Options.Permissions = $True
        $scriptr.Options.WithDependencies = $False
        <#Script the Drop too#>
        $ScriptDrop = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
        $ScriptDrop.Options.AppendToFile = $True
        $ScriptDrop.Options.AllowSystemObjects = $False
        $ScriptDrop.Options.ClusteredIndexes = $True
        $ScriptDrop.Options.DriAll = $True
        $ScriptDrop.Options.ScriptDrops = $True
        $ScriptDrop.Options.IncludeHeaders = $False
        $ScriptDrop.Options.ToFileOnly = $True
        $ScriptDrop.Options.Indexes = $True
        $ScriptDrop.Options.WithDependencies = $False
        <#This section builds the folder structure for each object type#>
        $TypeFolder=$ScriptThis.GetType().Name

        # translate type folder
        switch ($TypeFolder) {
            "DatabaseRole"        { $TypeFolder = "Roles" }
            "StoredProcedure"     { $TypeFolder = "Procedures" }
            "Table"               { $TypeFolder = "Tables" }
            "Trigger"             { $TypeFolder = "Triggers" }
            "User"                { $TypeFolder = "Users" }
            "UserDefinedFunction" { $TypeFolder = "Functions" }
            "View"                { $TypeFolder = "Views" }
        }

        if (-not (Test-Path -Path "$SavePath\$TypeFolder")) {
            new-item -type directory -name "$TypeFolder" -path "$SavePath" | out-null
        }
        "Scripting Out $TypeFolder $ScriptThis"

        $ScriptFile = $ScriptThis -replace “\[|\]” #no brackets
        $ScriptFile = $ScriptFile -replace " ", "_" #no spaces
        $ScriptFile = $ScriptFile -replace "\\", "@" #no backslashes (useful for domain users)

        $folderPath = Split-Path "$SavePath\$TypeFolder\$ScriptFile.sql" -parent
        if (-not (Test-Path -Path $folderPath)) {
            new-item -type directory -path "$folderPath" | out-null
        }

        $filepath = "$SavePath\$TypeFolder\$ScriptFile.sql"

        if (Test-Path -Path "$filepath") { Remove-Item "$filepath" }

        $ScriptDrop.Options.FileName = "$filepath.utf16"
        $scriptr.Options.FileName = "$filepath.utf16"

        #This is where each object actually gets scripted one at a time.
        $ScriptDrop.Script($ScriptThis) #first script a drop statement
        $scriptr.Script($ScriptThis)    #then dump the structure

        #MSSQL outputs UTF16, which git will see as binary. This converts it to UTF8 so diff will work.
        Get-Content -en Unicode "$filepath.utf16" | Out-File -en utf8 "$filepath"

        Remove-Item "$filepath.utf16"

    } #end $Objects loop

} #end Script-DBObjectsIntoFolders function

This scripts out tables, views, stored procedures, user-defined functions, roles, and users, dumping each as a .sql file in a subfolder named for its object type. There is nothing particularly special about the function: it creates directories, does some string substitutions, and dumps the objects. There is one gotcha, though, which I will cover next.
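
For example, for the server and database used later in this post, the output ends up laid out roughly like this (the object names here are invented purely for illustration):

..\elsbia\Elgin-Production\
    Tables\dbo.Customers.sql
    Views\dbo.ActiveCustomers.sql
    Procedures\dbo.GetCustomerOrders.sql
    Functions\dbo.FormatFullName.sql
    Triggers\AuditDDLChanges.sql
    Roles\db_app_readers.sql
    Users\DOMAIN@svc_reporting.sql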

Encoding: UTF-16 and Git

It turns out MSSQL exports .sql files in UTF-16 little endian. The problem is that if these files live in a Git repo, Git will usually treat them as binary because of the null bytes in the encoding, so you get no useful diffs.

Fortunately, we can use the Get-Content and Out-File commands to convert these files to UTF-8, which Git handles as text. We create the initial file with a .utf16 suffix, convert it to the final .sql file, and remove the .utf16 file:

Get-Content -en Unicode "$filepath.utf16" | Out-File -en utf8 "$filepath"

Remove-Item "$filepath.utf16"  
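
If you ever want to confirm which encoding a scripted file actually ended up with, one quick sanity check is to look at its byte-order mark. The snippet below is only an illustration (the file name is made up; it assumes Windows PowerShell, where Out-File -Encoding utf8 writes a UTF-8 BOM):

$bytes = Get-Content -Encoding Byte -TotalCount 3 ".\Tables\dbo.Customers.sql"
"{0:X2} {1:X2} {2:X2}" -f $bytes[0], $bytes[1], $bytes[2]
# FF FE ..  -> still UTF-16 LE
# EF BB BF  -> UTF-8 with BOM, which Git diffs as text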

Calling All Functions

Now we need to call this function from our scripts. For individual databases, we create one small script per database and start its file name with the server name, followed by the database name, for reasons we will soon see:

#file: elsbia_elgin-production.ps1
. ".\script_db_objects.ps1"

Script-DBObjectsIntoFolders "elsbia" "Elgin-Production"  

Since we have started all file names for a given server with that server's name, we can also write a simple script to back up the whole server. We start this script's name with an underscore so it sorts to the top, and so we can use the same trick again later:

#file: _elsbia.ps1
Get-ChildItem elsbia_*.ps1 | foreach {
    "###"
    "Executing: " + $_
    "###"
    ""
    . $_
    "###"
    ""
    ""
}

The last convenience script backs up every server and every database by executing every script whose name begins with an underscore. Its own name begins with a period so it sorts to the very top of the list:

#file: .all_servers.ps1
Get-ChildItem _*.ps1 | foreach {
    ""
    "***"
    "Executing Server: " + $_
    "***"
    ""
    . $_
    ""
    "***"
    ""
}

Conclusion

With a few tricks and conventions we now have some simple scripts that anyone can run inside the repository to update their local copy of the schema from the server. This approach definitely has drawbacks in terms of controlling who does what (there is no checkout!), and it does not discourage editing objects directly on the server. However, when you do not control the server, having version control at all is a step in the right direction for reverting mistakes or viewing past code.
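
In day-to-day use, the workflow I have in mind looks roughly like this (a sketch only; the repository path is hypothetical, and it assumes the scripts live in a subdirectory of the repository, as in requirement 1):

cd C:\repos\db-schemas\scripts   # hypothetical folder holding the .ps1 scripts
.\.all_servers.ps1               # refresh every server and database
cd ..
git status                       # review which objects changed
git add -A
git commit -m "Refresh schema dump from servers"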
