SSIS – Automating SQL Server Backups, copying files and checking that they were copied
I had an issue where I lost all my backups, so I have now created an automated SQL Server backup process which backs up all the databases, verifies the backups and copies them to the required location. It then finally emails a listing of the copied backup files to confirm that they are all there.
Backing up the databases
1. Below is the script, which is created on the master database.
2. This backs up all the databases except TempDB to the desired location.
3. NOTE: WE ARE ALWAYS USING THE SAME FILENAMES AND OVERWRITING THE CURRENT BAK FILE, BECAUSE WE ONLY NEED THE LATEST COPY.
a. THIS ALSO SAVES SPACE FOR THE BACKUP FILES.
4. Here is the script to run to create it, with some notes afterwards:
USE [master]
GO
/****** Object: StoredProcedure [dbo].[prc_BackupDatabases] Script Date: 2013-06-06 01:58:41 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      Gilbertq
-- Create date: 07 June 2013
-- Description: Backs up all databases except TempDB to the backup path
-- =============================================
CREATE PROCEDURE [dbo].[prc_BackupDatabases]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names (dbid 2 is TempDB, which is excluded).
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE [dbid] <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- Make sure that the below path exists
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Backup SQL statement
        SET @backupQuery = 'BACKUP DATABASE ' + @dbName + ' TO DISK = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'' WITH COMPRESSION, INIT'

        -- Print SQL statement
        PRINT @backupQuery

        -- Execute backup script
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get next database
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END
GO
a. NOTE: You will have to modify the following in the above script for it to run in your environment:
i. @backupPath
1. Change this to where you want to back up your SQL BAK Files.
ii. Within @backupQuery we have also put the SERVERNAME into the backup file name.
1. This was because we were backing up multiple BAK files from multiple servers.
b. This is what it looks like in SSIS
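c. If you want to test the procedure outside of SSIS first, it can be run directly against the master database. This is only a usage sketch; in the package an Execute SQL Task runs an equivalent statement.
USE [master]
GO
-- Run the backup procedure manually; each database is backed up to @backupPath
EXEC [dbo].[prc_BackupDatabases];
GO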
Verify the SQL Server Backups
1. The next step is to verify that the backups are consistent and can be restored.
2. Again, we created this script on the master database.
3. Here is the create script:
USE [master]
GO
/****** Object: StoredProcedure [dbo].[prc_VerifyDatabases] Script Date: 2013-06-06 02:09:05 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      Gilbertq
-- Create date: 07 Jun 2013
-- Description: Verifies the backup file for each database except TempDB
-- =============================================
CREATE PROCEDURE [dbo].[prc_VerifyDatabases]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names (dbid 2 is TempDB, which is excluded).
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE [dbid] <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- Make sure that the below path exists
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Verify SQL statement
        SET @backupQuery = 'RESTORE VERIFYONLY FROM DISK = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'''

        -- Print SQL statement
        PRINT @backupQuery

        -- Execute verify script
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get next database
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END
GO
a. NOTE: You will have to modify the following in the above script for it to run in your environment:
i. @backupPath
1. Change this to where you backed up your SQL BAK Files.
ii. Within @backupQuery we have also put the SERVERNAME into the backup file name.
1. This was because we were backing up multiple BAK files from multiple servers.
b. This is what it looked like in SSIS
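c. As an optional extra check (not part of the original package), you can also query the backup history that SQL Server keeps in msdb to confirm when each database was last backed up. A minimal sketch:
-- Most recent full backup per database, from msdb's backup history
SELECT bs.database_name,
       MAX(bs.backup_finish_date) AS LastFullBackup
FROM msdb.dbo.backupset bs
WHERE bs.type = 'D' -- 'D' = full database backup
GROUP BY bs.database_name
ORDER BY bs.database_name;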
Copying files to backup location on the network
1. Here we copy our backup files to a network location, where they are then backed up.
2. It is a simple File System Task, where we are copying all the files from one folder to another folder.
a. The only thing to NOTE is that we are overwriting the files when we copy them over.
3. This is what it looks like; it is simple to set up and create.
Getting the file list and emailing it to the required users
1. The final part is where I verify that the backup files were created and copied over to the network location.
2. The first step uses a PowerShell script to get the file listing into a CSV file.
a. NOTE: I created a mapped drive to our backup location to make the script easier.
i. This also ensures that there are no authentication issues.
3. All the files were saved in a folder called PowerShell.
4. This is the PowerShell script that was used to get the file listing and export it to CSV:
Get-ChildItem r:\*.* -include SERVERNAME*.* | select name,length,LastWriteTime | Export-Csv C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.csv
a. This was saved with the following filename:
i. Filelisting_SERVERNAME.ps1
5. Next, this is how we configured SSIS to run the PowerShell script:
a. Drag in an Execute Process Task and configure it with the following:
i. We gave it the following name:
1. PowerShell to get Directory Listing for SERVERNAME Files
ii. Then click on Process on the left-hand side.
iii. This is where you configure how the Execute Process Task will actually run.
1. Where it says Executable, you have to put in the location of your executable.
2. In our example this is the location of PowerShell:
C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe
3. Next are the Arguments that you want to pass to your executable.
4. In our example we put in the PowerShell script to call:
-ExecutionPolicy ByPass -command ". 'M:\SQLBackups\Powershell\Filelisting_SERVERNAME.ps1'"
b. You can leave all the other defaults, so that it looks like the following:
6. The next step is to import the data from the CSV file into the table called:
Staging.tb_BackupFileListing
a. NOTE: In this table we store when the file was modified, as well as when the data was imported into the table (a sketch of these tables and the related T-SQL appears after this list).
7. We then import this data into the fact table, so that we have the history stored, in the table called:
a. TF_BackupFileListing_Converted
8. We then export just the current backup file listing data into an Excel spreadsheet, which is saved locally in the PowerShell folder.
9. The final step is where we attach the exported spreadsheet and email it to the required people.
10. This is what this package looks like:
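The table scripts were not included in the post, so below is a rough sketch of what the staging table, fact table and email step could look like, based only on the columns mentioned above (file name, length, last write time and an import date). The column names, data types, Database Mail profile and file names are assumptions, so adjust them to your own environment.
-- Hypothetical staging table for the CSV import (column names are assumptions; assumes the Staging schema exists)
CREATE TABLE Staging.tb_BackupFileListing
(
    FileName      varchar(255),
    FileLength    bigint,
    LastWriteTime datetime,
    DateImported  datetime DEFAULT (GETDATE())
);
GO
-- Hypothetical fact table that keeps the history of every file listing (schema assumed to be dbo)
CREATE TABLE dbo.TF_BackupFileListing_Converted
(
    FileName      varchar(255),
    FileLength    bigint,
    LastWriteTime datetime,
    DateImported  datetime
);
GO
-- Step 7: copy the current listing from staging into the fact table to keep the history
INSERT INTO dbo.TF_BackupFileListing_Converted (FileName, FileLength, LastWriteTime, DateImported)
SELECT FileName, FileLength, LastWriteTime, DateImported
FROM Staging.tb_BackupFileListing;
GO
-- Step 9 alternative: if you prefer Database Mail to the SSIS Send Mail Task, the exported
-- spreadsheet can be attached with sp_send_dbmail (assumes a configured mail profile and a
-- hypothetical file name for the exported spreadsheet)
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'DBA Mail',
     @recipients = 'dba.team@example.com',
     @subject = 'SERVERNAME - Backup file listing',
     @body = 'Current backup files copied to the network location.',
     @file_attachments = 'C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.xlsx';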
Hello myfriendjoobs,
Being an accidental DBA, I find your idea of copying database backup files using standard tools very interesting.
Have a few questions:
#1 You are talking about copying .BAK files only, and .trn files are not part of the solution, right?
#2 Would it be possible to have the entire package, along with the series of table scripts that support your solution, available for download?
At any rate –
Thank you for taking the time to put your solution together and share it with the rest of us.
David
Hi there David,
You are correct that currently it is only the BAK files; if you look at the script above, it should be possible to include the TRN files as well by changing the backup script.
I will look at getting this done soon and upload the package online so that you can download the solution.
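For example, the change would just be an extra statement inside the cursor loop of prc_BackupDatabases along these lines (only a sketch, and it assumes the databases use the FULL recovery model):
-- Hypothetical addition inside the cursor loop: back up the transaction log to its own .trn file
set @backupQuery = 'BACKUP LOG ' + @dbName + ' TO DISK = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.trn'' WITH COMPRESSION, INIT'
EXEC (@backupQuery)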
Thanks
Gilbert
Thank you very much Gilbert – looking forward to the SSIS package being available.