How I deleted 530 million records from a large MySQL database table hitting nearly 200GB

I came across the need to perform maintenance on a MySQL database holding millions of records: we needed to remove all data that was more than two years old, which meant deleting around 530 million records from a very large table approaching 200GB.


Due to the size of the database, the transaction logging, and table locking, it wasn't possible to do a single bulk delete.
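The original post showed the failing statement here, but it didn't survive extraction. It would have been a single bulk DELETE along these lines (the table and column names below are illustrative, since the originals aren't given):

```sql
-- Naive single-statement purge of all rows older than two years.
-- On a ~200GB table this runs as one huge transaction: it holds locks
-- and writes enormous amounts of undo/redo log before it can commit.
DELETE FROM events
 WHERE created_at < NOW() - INTERVAL 2 YEAR;
```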

This caused the database to become very unresponsive and, after hours of running, it eventually failed with table-locking issues. DELETE operations are very costly because of all the constraint checking and transaction logging involved.

I tried creating a second, duplicate table and inserting only the data we wanted to keep into it. However, this required taking the system offline for the duration, and after some local testing it was going to take roughly 90 hours to complete, so this was not an option.
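For reference, the copy-and-swap approach looks roughly like this (again with illustrative table and column names):

```sql
-- Build an empty copy of the table with the same schema.
CREATE TABLE events_new LIKE events;

-- Copy across only the rows we want to keep.
-- On a table this size, this insert was the ~90-hour step.
INSERT INTO events_new
SELECT * FROM events
 WHERE created_at >= NOW() - INTERVAL 2 YEAR;

-- Atomically swap the tables, then drop the old one.
RENAME TABLE events TO events_old, events_new TO events;
DROP TABLE events_old;
```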


I created a MySQL stored procedure that split the DELETE query into chunks. A stored procedure is not handled as a single transaction, so each chunked delete could commit and move on to the next, slowly removing the data from the active table while it remained in use.

It was costly to query against the table using LIMIT and OFFSET (taking around 10 minutes to return results), so it was not possible to use these as part of the procedure if we wanted the process to complete in a timely manner.

As the IDs were incremental, I was able to query the table until I found the largest ID I was willing to delete up to; in my case it was 530,000,000.
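Finding that boundary is a matter of spot-checking the timestamp at candidate IDs rather than scanning by date; a sketch of the kind of lookup involved (illustrative names, and the primary-key lookup is what keeps it fast):

```sql
-- Probe individual IDs by primary key (instant, unlike a range scan)
-- and narrow in on the highest ID still older than the cutoff.
SELECT id, created_at
  FROM events
 WHERE id = 530000000;
```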

The below stored procedure is what I ended up using to solve the issue.
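The original listing did not survive here, so the following is a reconstruction along the lines described: chunked deletes walking up the ID range to the 530,000,000 boundary. The table name, column name, and 10,000-row chunk size are assumptions, not the author's actual values.

```sql
DELIMITER $$

CREATE PROCEDURE delete_in_chunks()
BEGIN
  DECLARE current_id BIGINT DEFAULT 0;

  -- 530,000,000 was the highest ID eligible for deletion.
  WHILE current_id < 530000000 DO
    -- Delete one chunk by primary-key range. Each statement
    -- autocommits, so locks are released between iterations
    -- and the table stays usable by the live application.
    DELETE FROM events
     WHERE id >= current_id
       AND id <  current_id + 10000;

    SET current_id = current_id + 10000;
  END WHILE;
END$$

DELIMITER ;

CALL delete_in_chunks();
```

A smaller chunk size means shorter lock hold times per statement but more round trips; the right value depends on row size and write load.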

It is recommended that a professional undertakes these actions; please always back up existing data, files, and configs first.
