How Can I Clean My Database? It's Huge (50GB+)

My database has grown to over 50GB, and I suspect a lot of it is logs, old data, or unused records. I want to clean it up without causing issues.

A few questions:

  • What are the best practices for identifying and deleting unnecessary data?
  • Are there common log tables or temporary data I should check?
  • Any specific SQL queries or tools that can help analyze large tables?
  • Should I consider archiving old data instead of deleting it?

The database is running on PostgreSQL.
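For reference, a standard catalog query to find the largest relations in a stock PostgreSQL install (no schema assumptions — this reads only the built-in `pg_class`/`pg_namespace` catalogs):

```sql
-- Top 10 relations by total size (table + indexes + TOAST)
SELECT n.nspname AS schema,
       c.relname AS relation,
       pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE n.nspname NOT IN ('pg_catalog', 'information_schema')
ORDER BY pg_total_relation_size(c.oid) DESC
LIMIT 10;
```

Running this first usually answers the "where is the 50GB actually going?" question before any deleting starts.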

Thanks in advance!

Hi @Romeo111

If you would like to preserve the data, I would suggest connecting to S3; if not, I would reduce the execution retention period.

The execution retention period is configured through environment variables:
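A minimal sketch of such a configuration. This assumes an n8n-style setup (the thread's wording suggests it, but that is an assumption — if you are on a different product, look up its equivalent pruning variables):

```
# Assumed n8n pruning variables — verify against your product's docs.
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=168   # retention window in hours (7 days)
```

Note that pruning settings like these control what gets cleaned going forward; they do not necessarily shrink data already on disk.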

Configure S3

Yes, I did that and reduced the execution retention period, but the data/logs from before that period are still in the database.
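That matches how PostgreSQL behaves: a shorter retention window typically only affects future pruning, and even after rows are deleted, PostgreSQL does not return the disk space to the OS until the table is rewritten. A hedged sketch of the manual cleanup — the table name `execution_entity` and column `"stoppedAt"` are assumptions, so check your actual schema first:

```sql
-- Assumed table/column names — verify against your schema before running.
DELETE FROM execution_entity
WHERE "stoppedAt" < NOW() - INTERVAL '7 days';

-- Plain VACUUM only marks the freed space as reusable inside the file.
-- VACUUM FULL rewrites the table and returns disk space to the OS,
-- but it takes an exclusive lock, so run it in a maintenance window.
VACUUM FULL execution_entity;
```

Take a backup before the `DELETE`, and expect `VACUUM FULL` to need temporary disk space roughly equal to the table being rewritten.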