I'm self-hosting the service, and every day I run the official backup script and upload the result to my network drive. I noticed that the dump.sql generated a few days ago was only about 8 MB, but yesterday's dump.sql was up to 80 MB. A quick look at the SQL showed that it stores a lot of logs. Is it possible for Appwrite to clean up the logs automatically? If this keeps up, the backups will become very large.
Are you referring to these execution logs?
yes
We got 10k users in 20 days, so there are a lot of logs generated by users creating accounts, signing in, and so on. I think keeping them for 30 days or some other fixed period would be sufficient. At the moment it looks like they're kept forever?
Change the value of _APP_MAINTENANCE_RETENTION_EXECUTION in your .env file to a smaller value (the default is 1209600, which is 14 days in seconds).
As of now, every execution older than 14 days gets deleted.
Check here for more details about env variables https://appwrite.io/docs/environment-variables#maintenance
After that, run docker compose down && docker compose up -d
Be aware that if you enter 432000 (5 days' worth of seconds), every execution older than 5 days will get deleted.
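The steps above can be sketched as a couple of shell commands, assuming a self-hosted install whose `.env` file sits in the current directory (the 432000-second value, i.e. 5 days, is just an example retention period):

```shell
# Set execution-log retention to 5 days (432000 seconds).
# The default is 1209600 seconds (14 days).
sed -i 's/^_APP_MAINTENANCE_RETENTION_EXECUTION=.*/_APP_MAINTENANCE_RETENTION_EXECUTION=432000/' .env

# Restart the stack so the new value takes effect:
# docker compose down && docker compose up -d
```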
Thank you very much for your suggestions!
[SOLVED] How to clean Functions logs