
I deploy the service myself and run the official backup script every day, uploading the result to my network drive. I noticed that the dump.sql generated a few days ago was only about 8 MB, but the dump.sql generated yesterday was up to 80 MB. A brief look at the SQL shows it stores a lot of logs. Is it possible for Appwrite to clean up the logs automatically? If it keeps going like this, the backups will become very large.

Are you referring to these execution logs?

yes

We got 10k users in 20 days, so there are many logs generated by users creating accounts, signing in, etc. I think keeping them for 30 days or some other length of time would be sufficient. At the moment it looks like they are kept forever?

Change the value of _APP_MAINTENANCE_RETENTION_EXECUTION in your .env file to a smaller value (the default is 1209600, which represents 14 days in seconds). As it stands, every execution older than 14 days gets deleted. Check here for more details about env variables: https://appwrite.io/docs/environment-variables#maintenance

After that, run docker compose down && docker compose up -d

Be aware that if you enter 432000, which is 5 days' worth of seconds, every execution older than 5 days will get deleted.
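For example, here is a minimal sketch of the change, assuming a standard self-hosted setup where the .env file sits next to docker-compose.yml in the appwrite installation directory:

    # .env
    # 432000 seconds = 5 days; executions older than this are purged by the maintenance worker
    _APP_MAINTENANCE_RETENTION_EXECUTION=432000

    # restart the stack so the new value is picked up
    docker compose down && docker compose up -d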

Thank you very much for your suggestions!

[SOLVED] How to clean Functions logs
