Published on June 19, 2022

Backing Up Amazon Redshift Log Data

As a database administrator, you may have come across the need to retain log data for longer than what is kept by default. In Amazon Redshift, the STL log views only retain approximately two to five days of log history. If you want to keep the log data for a longer period, you need to periodically copy it to other tables or unload it to Amazon S3.
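
To see how far back the retained history actually goes on your cluster, you can query one of the STL views directly; here is a small sketch using stl_query and its starttime column:

SELECT MIN(starttime) AS oldest_logged_query,
       MAX(starttime) AS newest_logged_query
FROM stl_query;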

In this article, we will discuss a simple solution for backing up the log data from the STL log views in Amazon Redshift. We will focus on five of the main system log views and create backup tables to store their data.

To begin, we create a new schema called “audit” using the following script:

CREATE SCHEMA audit;

Once the schema is created, we can proceed to create the backup tables for the log data. We will create five tables, each corresponding to one of the system log views. Here is an example of creating one of the backup tables:

CREATE TABLE "audit"."bck_svl_stored_proc_call"
(
  -- columns here
  modifieddatetime timestamp without time zone NOT NULL DEFAULT getdate()
);

We create the backup tables with the same column names as the original log views, with an additional column called “modifieddatetime”. This column will capture the time when a record was inserted into the backup table, allowing us to monitor the backup process and identify any issues.
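
Because modifieddatetime defaults to getdate(), a quick way to confirm that the backups are up to date is to check the most recent value in a backup table, for example:

SELECT MAX(modifieddatetime) AS last_backup_run
FROM audit.bck_svl_stored_proc_call;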

After creating the backup tables, we can create a stored procedure to periodically update these tables with data from the system log views. Here is an example of the stored procedure:

CREATE OR REPLACE PROCEDURE audit.usp_copy_svl_tables()
AS $$
BEGIN
  -- insert statements here
EXCEPTION WHEN OTHERS THEN
  RAISE INFO 'An exception occurred / Error: % / Message: %', SQLSTATE, SQLERRM;
END;
$$ LANGUAGE plpgsql;

The stored procedure uses INSERT statements to copy data from the system log views into the backup tables, checking for existing records so that data is not duplicated. The procedure can then be scheduled to run periodically using the Amazon Redshift query scheduler.
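
As an illustration of what one of those INSERT statements could look like, the sketch below copies only rows that are not already present in the backup table. The column list and the columns used to detect duplicates (query and recordtime) are assumptions for this example; use whichever columns uniquely identify a row in the view you are backing up:

-- modifieddatetime is omitted so its getdate() default is applied
INSERT INTO audit.bck_svl_stored_proc_call (userid, query, recordtime, call)
SELECT s.userid, s.query, s.recordtime, s.call
FROM svl_stored_proc_call s
WHERE NOT EXISTS (
  -- skip rows that were copied on a previous run
  SELECT 1
  FROM audit.bck_svl_stored_proc_call b
  WHERE b.query = s.query
    AND b.recordtime = s.recordtime
);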

Once the stored procedure is created, we can execute it with the following command:

CALL audit.usp_copy_svl_tables();

This command will execute the stored procedure and copy the log data to the backup tables. By scheduling this command to run on a daily basis, we can ensure that the backup tables are updated regularly with the latest log data.

By following this simple solution, you can retain the log data from the STL log views for a longer period of time and have a backup in case you need to analyze or troubleshoot any issues in the future.

If you have any suggestions or improvements to this process, please feel free to share them with us. We are always looking for ways to enhance our backup and data retention strategies.

