If you are a SQL DBA you will have heard of Ola Hallengren’s Maintenance Solution. If you haven’t, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great guide to getting started here, which will show you how to get Pester and how to get started with TDD.
This isn’t TDD though; this is Environment Validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution, and made a list of the things that I would check, which looked something like this. This is the checklist you would create (or have already created) for yourself or a junior following this install, and this is how easily you can turn that checklist into a Pester test, remove the human element and open your install to automated testing.
SQL Server Agent is running – Otherwise the jobs won’t run 🙂
We should have 4 backup jobs named:
DatabaseBackup – SYSTEM_DATABASES – FULL
DatabaseBackup – USER_DATABASES – FULL
DatabaseBackup – USER_DATABASES – DIFF
DatabaseBackup – USER_DATABASES – LOG
We should have Integrity Check and Index Optimisation Jobs
We should have the clean up jobs
All jobs should be scheduled
All jobs should be enabled
The jobs should have succeeded
I can certainly say that I have run through that check in my head, and also written it down in an installation guide, in the past. If I were being more careful I would also have checked that the correct folders existed in the folder I was backing up to.
Ola’s script uses a default naming convention, which makes this easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder (or, if there is an Availability Group, a CLUSTERNAME$AGNAME folder) and in each of those a FULL, DIFF and LOG folder, which I can add to my checklist.
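As a minimal sketch, and assuming $Share, $ServerName and $InstanceName are already populated, the root folder name could be derived like this (the variable names here are mine, not Ola’s):
# Default instances back up to SERVERNAME, named instances to SERVERNAME$INSTANCENAME
$InstanceFolder = if ($InstanceName -eq 'MSSQLSERVER') { $ServerName } else { "$ServerName`$$InstanceName" }
# Availability Group databases use CLUSTERNAME$AGNAME instead
$Root = Join-Path $Share $InstanceFolder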
So now we have our checklist, we just need to turn it into a Pester Environment Validation script.
It would be useful to be able to pass in a number of instances, so we will start with a foreach loop, then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name.
I also set the Agent service display name so I can get its status. I split the jobs up using a Context block, one each for Backups, Database Maintenance and Solution Clean Up, but they all follow the same pattern: first get the jobs.
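In outline, the scaffolding looks something like this (a sketch with illustrative names rather than the full script):
foreach ($Instance in $Instances)
{
    # Split SERVER\INSTANCE; a default instance has no instance part
    $ServerName, $InstanceName = $Instance -split '\\'
    if (-not $InstanceName) { $InstanceName = 'MSSQLSERVER' }
    $DisplayName = "SQL Server Agent ($InstanceName)"
    $Jobs = Get-SqlAgentJob -ServerInstance $Instance
    Describe "Ola Hallengren Maintenance Solution on $Instance" {
        Context 'Backups' { <# backup job tests #> }
        Context 'Database Maintenance' { <# integrity check and index optimisation tests #> }
        Context 'Solution Clean Up' { <# clean up job tests #> }
    }
}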
Then we can iterate through them and check them, but first let’s test the Agent service. You do this with an It block, and in it you put a single test like this:
actual-value | Should Be expected-value
So to check the Agent service is running we can do this:
(Get-Service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test I check using Get-Member. So, to see what is available for a job, I gathered the Agent Jobs into a variable using the Get-SqlAgentJob command in the new SqlServer module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and their values using Select-Object.
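For example:
$Jobs = Get-SqlAgentJob -ServerInstance $Instance
$Jobs | Get-Member -MemberType Property
$Jobs | Select-Object Name, IsEnabled, HasSchedule, LastRunOutcome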
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this:
# Filter down to the database maintenance jobs
$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach ($Job in $Jobs)
{
    # Test names need double quotes so $JobName expands in the output
    $JobName = $Job.Name
    It "$JobName Job Exists" {
        $Job | Should Not BeNullOrEmpty
    }
    It "$JobName Job is enabled" {
        $Job.IsEnabled | Should Be $true
    }
    It "$JobName Job has schedule" {
        $Job.HasSchedule | Should Be $true
    }
    if ($DontCheckJobOutcome -eq $false)
    {
        It "$JobName Job succeeded" {
            $Job.LastRunOutcome | Should Be 'Succeeded'
        }
    }
} # end foreach job
So I have checked the agent and the jobs, and now I want to check that the folders exist: first for the instance, using Test-Path, so the user running the PowerShell session must have privileges and access to list the files and folders.
Context "$Share Share For $Server" {
    It "Should have the root folder $Root" {
        Test-Path $Root | Should Be $true
    }
Then for every database we need to set some variables for the folder path. We don’t back up tempdb so we ignore that, then check if the server is SQL 2012 or above and, if it is, whether the database is a member of an Availability Group, and set the folder name appropriately.
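Something like this (a sketch only; $Srv is an SMO server object, AvailabilityGroupName is only populated on SQL 2012 and above, and ClusterName stands in for however you obtain the cluster name):
foreach ($Db in ($Srv.Databases | Where-Object Name -ne 'tempdb').Name)
{
    $Folder = $InstanceFolder
    # VersionMajor 11 is SQL Server 2012; AG databases use CLUSTERNAME$AGNAME
    if ($Srv.VersionMajor -ge 11 -and $Srv.Databases[$Db].AvailabilityGroupName)
    {
        $Folder = "$($Srv.ClusterName)`$$($Srv.Databases[$Db].AvailabilityGroupName)"
    }
    $Dbfolder = Join-Path $Share "$Folder\$Db"
    $Full = Join-Path $Dbfolder 'FULL'
    $Diff = Join-Path $Dbfolder 'DIFF'
    $Log  = Join-Path $Dbfolder 'LOG'
}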
But we need some logic for checking for folders, because Ola is smart and checks for Log Shipping databases so as not to break the LSN chain, system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the [System.IO.Directory]::Exists method as I found it slightly quicker for UNC shares.
If ($CheckForDBFolders -eq $true)
{
    Context "Folder Check for $db on $Server on $Share" {
        It "Should have a folder for $db database" {
            Test-Path $Dbfolder | Should Be $true
        }
        # Full recovery, not log shipped: expect FULL, DIFF and LOG folders
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $true
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $true
            }
            It 'Has a Log Folder' {
                [System.IO.Directory]::Exists($Log) | Should Be $true
            }
        }
        # Simple recovery or log shipped: no log backups, so no LOG folder
        elseif ((($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and ($Db -notin ('master','msdb','model'))) -or ($LSDatabases -contains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $true
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $true
            }
        }
        # System databases: FULL backups only
        else
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $true
            }
        }
    } # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the creation time of the folder with its last write time, which was significantly quicker for directories with a number of files but fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad’s PowerShell Conference session Chasing the seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with.
If ($CheckForBackups -eq $true)
{
    Context "File Check For $db on $Server on $Share" {
        $FullCreate = [System.IO.Directory]::GetCreationTime($Full)
        $FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
        # If the folder has never been written to since creation, fall back to Get-ChildItem
        if ($FullCreate -eq $FullWrite)
        {
            It "Has Files in the FULL folder for $db" {
                Get-ChildItem $Full -Filter *.bak | Should Not BeNullOrEmpty
            }
        }
        else
        {
            It "Has Files in the FULL folder for $db" {
                $FullCreate | Should BeLessThan $FullWrite
            }
        }
        It 'Full File Folder was written to within the last 7 days' {
            $FullWrite | Should BeGreaterThan (Get-Date).AddDays(-7)
        }
        if ($Db -notin ('master','msdb','model'))
        {
            $DiffCreate = [System.IO.Directory]::GetCreationTime($Diff)
            $DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
            if ($DiffCreate -eq $DiffWrite)
            {
                It "Has Files in the DIFF folder for $db" {
                    Get-ChildItem $Diff -Filter *.bak | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the DIFF folder for $db" {
                    $DiffCreate | Should BeLessThan $DiffWrite
                }
            }
            It 'Diff File Folder was written to within the last 24 Hours' {
                $DiffWrite | Should BeGreaterThan (Get-Date).AddHours(-24)
            }
        }
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            $LogCreate = [System.IO.Directory]::GetCreationTime($Log)
            $LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
            if ($LogCreate -eq $LogWrite)
            {
                It "Has Files in the LOG folder for $db" {
                    Get-ChildItem $Log -Filter *.trn | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the LOG folder for $db" {
                    $LogCreate | Should BeLessThan $LogWrite
                }
            }
            It 'Log File Folder was written to within the last 30 minutes' {
                $LogWrite | Should BeGreaterThan (Get-Date).AddMinutes(-30)
            }
        } # Simple Recovery
    }
} # Check for backups
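If you want to benchmark the alternatives yourself, Measure-Command makes the comparison straightforward; a quick sketch, where $Log is any folder holding a large number of files:
Measure-Command { Get-ChildItem $Log -Filter *.trn } | Select-Object TotalMilliseconds
Measure-Command { [System.IO.Directory]::GetLastWriteTime($Log) } | Select-Object TotalMilliseconds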
You could just run the script you have created from your checklist as it is; hopefully this blog post helps you see that you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this:
## Pester Test to check Ola
[CmdletBinding()]
Param(
    $Instance,
    $CheckForBackups,
    $CheckForDBFolders,
    $JobSuffix,
    $Share,
    [switch]$NoDatabaseRestoreCheck,
    [switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters, like this.
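Invoke-Pester’s -Script parameter accepts a hashtable of the script path and its parameters, so the call looks something like this (the path and values here are illustrative):
Invoke-Pester -Script @{
    Path       = 'C:\Scripts\Test-Ola.ps1'
    Parameters = @{
        Instance          = 'SQL01'
        CheckForBackups   = $true
        CheckForDBFolders = $true
        Share             = '\\FileServer\Backups'
    }
}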
but that’s a bit messy, hard to remember, and won’t encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it on GitHub: Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on Line 90 of Test-OlaInstance so that the script can find it.
Once you have that you can call it for a single instance or a number of instances, like so. Here I check for folders and backup files.
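Assuming the function exposes the same parameters as the script, a call might look like this (the instance and share names are made up):
. .\Test-OlaInstance.ps1
Test-OlaInstance -Instance 'SQL01','SQL02' -CheckForDBFolders $true -CheckForBackups $true -Share '\\FileServer\Backups'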
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking that the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases.)
This gives me a nice and simple automated method of checking whether Ola’s maintenance script has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.