
I self-host the service, run the official backup script every day, and upload the result to my network drive. I found that the dump.sql generated a few days ago was only about 8 MB, but yesterday's dump.sql was up to 80 MB. Taking a brief look at the SQL content, I saw it stores a lot of logs. Is it possible for Appwrite to clean up the logs automatically? If it keeps going like this, the backups will become very big.

Are you referring to the executions logs?

yes

We got 10k users in 20 days, so there are many logs generated by users creating accounts, signing in, etc. I think it would be sufficient to keep them for 30 days or some other length of time. At the moment it looks like they're kept forever?

Change the value of `_APP_MAINTENANCE_RETENTION_EXECUTION` in your `.env` file to a smaller value (the default is 1209600, which represents 14 days in seconds).
As of now, every execution older than 14 days gets deleted.
Check here for more details about env variables: https://appwrite.io/docs/environment-variables#maintenance

After that, run `docker compose down && docker compose up -d`.

Be aware that if you enter 432000 (5 days' worth of seconds), every execution older than 5 days will get deleted.
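For reference, the steps above can be sketched as a small shell snippet. This is a minimal sketch, assuming a `.env` file sits next to your `docker-compose.yml` and already contains the default entry; adjust the path and value to your setup.

```shell
# Sketch only: assumes .env in the current directory with the default value.
# 432000 = 5 days * 86400 seconds/day.
printf '_APP_MAINTENANCE_RETENTION_EXECUTION=1209600\n' > .env

# Lower the execution-log retention from 14 days to 5 days.
sed -i 's/^_APP_MAINTENANCE_RETENTION_EXECUTION=.*/_APP_MAINTENANCE_RETENTION_EXECUTION=432000/' .env

# Show the updated line to confirm the change took effect.
grep '^_APP_MAINTENANCE_RETENTION_EXECUTION=' .env

# Then restart the stack so the new value is picked up:
# docker compose down && docker compose up -d
```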

Thank you very much for your suggestions!

[SOLVED] How to clean Functions logs
