Deploying the service myself, I run the official backup script every day and upload the result to my network drive. I noticed that the dump.sql generated a few days ago was only about 8 MB, but the one generated yesterday was nearly 80 MB. A brief look at the SQL content showed that it stores a lot of logs. Is it possible for Appwrite to clean up the logs automatically? If this keeps up, the backups will become very large.
Are you referring to these execution logs?
yes
We got 10k users in 20 days, so there are many logs generated by users creating accounts, signing in, etc. I think keeping them for 30 days, or some other length of time, would be sufficient. At the moment it looks like they are kept forever?
Change the value of _APP_MAINTENANCE_RETENTION_EXECUTION in your .env file to a smaller value (the default is 1209600, which represents 14 days in seconds).
As it stands now, every execution older than 14 days gets deleted.
Check here for more details about env variables https://appwrite.io/docs/environment-variables#maintenance
After that, run docker compose down && docker compose up -d
Be aware that if you enter 432000 (5 days' worth of seconds), every execution older than 5 days will get deleted.
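Putting those steps together, a minimal sketch for a self-hosted setup (the 5-day retention and the assumption that .env sits in the current directory are examples, not requirements):

```shell
# Compute 5 days in seconds for _APP_MAINTENANCE_RETENTION_EXECUTION.
DAYS=5
RETENTION=$((DAYS * 24 * 60 * 60))
echo "$RETENTION"    # 432000

# Update the value in .env in place (keeps a backup as .env.bak),
# then restart the stack so Appwrite picks up the new setting:
#   sed -i.bak "s/^_APP_MAINTENANCE_RETENTION_EXECUTION=.*/_APP_MAINTENANCE_RETENTION_EXECUTION=${RETENTION}/" .env
#   docker compose down && docker compose up -d
```

The maintenance worker runs periodically, so the old executions are not deleted instantly; they are cleaned up on its next pass.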
Thank you very much for your suggestions!
[SOLVED] How to clean Functions logs