As a developer, you may come across situations where you need to delete duplicate records from a table in SQL Server. However, it is important to approach this task with caution and to ensure that the data being deleted is indeed duplicated and not required for any business purpose.
Before deleting duplicate data, it is recommended to first select and review the duplicate records to confirm that they are genuinely duplicates. This can be done with a simple SQL script. In the script below, we assume that the table has a unique, incrementing ID column and that we want to keep the latest record when duplicates exist:
USE tempdb
GO

-- Set up a sample table with duplicate values in NameCol
CREATE TABLE TestTable (ID INT IDENTITY(1,1), NameCol VARCHAR(100))
INSERT INTO TestTable (NameCol)
VALUES ('First'), ('Second'), ('Second'), ('Third'), ('Third'), ('Third')
GO

-- Selecting data
SELECT * FROM TestTable

-- Detecting duplicates
SELECT NameCol, COUNT(*) AS TotalCount
FROM TestTable
GROUP BY NameCol
HAVING COUNT(*) > 1
ORDER BY COUNT(*) DESC

-- Deleting duplicates: keep only the row with the highest ID per NameCol
DELETE FROM TestTable
WHERE ID NOT IN (
    SELECT MAX(ID)
    FROM TestTable
    GROUP BY NameCol
)

-- Selecting data after cleanup
SELECT * FROM TestTable

-- Clean up
DROP TABLE TestTable
GO
If your requirement is to maintain only unique records in the column, it is advisable to create a unique index on that column. This prevents users from entering duplicate data into the table in the first place, ensuring data integrity. However, if you realize the need for uniqueness only after data has already been entered, you will need to delete the duplicate records before creating the unique index.
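As a sketch, assuming the same TestTable and NameCol from the script above, the unique index could be created like this once the duplicates have been removed (the index name is illustrative):

```sql
-- Prevent future duplicates in NameCol; fails if duplicates still exist
CREATE UNIQUE INDEX UX_TestTable_NameCol
ON TestTable (NameCol)
```

SQL Server also supports the WITH (IGNORE_DUP_KEY = ON) option on a unique index, which silently discards duplicate rows on insert instead of raising an error; whether silent discards are acceptable depends on your application.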
Deleting duplicate records can be achieved with the script provided above. It keeps the row with the maximum ID for each distinct value in the column and deletes every other row with that value, so only the most recently inserted record is retained.
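An equivalent approach, shown here as a sketch against the same TestTable, uses ROW_NUMBER() inside a common table expression. SQL Server allows a DELETE against a CTE that references a single table, which can read more clearly than the NOT IN form:

```sql
-- Alternative: number rows per NameCol, newest first, and delete the rest
;WITH Ranked AS (
    SELECT ID,
           ROW_NUMBER() OVER (PARTITION BY NameCol ORDER BY ID DESC) AS rn
    FROM TestTable
)
DELETE FROM Ranked
WHERE rn > 1
```

This variant also adapts easily to tables without a unique ID, since you can partition by all the columns that define a duplicate.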
It is important to note that deleting duplicate records should be done with caution and after thorough analysis of the data. Always make sure to have a backup of the data before performing any deletion operations.
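As a quick safeguard before running the DELETE, you can copy the table within the same database; the backup table name below is illustrative, and this is not a substitute for a proper database backup:

```sql
-- Snapshot the table before deleting anything
SELECT * INTO TestTable_Backup FROM TestTable
```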
By following these steps, you can effectively delete duplicate records from your SQL Server table and ensure data integrity.
Thank you for reading! If you have any suggestions or topics you would like to see covered in future articles, please let us know.