I want to know if there is a way to set a size limit on a bucket so that a user cannot exceed it, and to retrieve that limit later. I want to simplify my checks on file upload to stop users from clogging up the storage. I could write a function that iterates over the files in a bucket and sums their sizes, but I am looking for a simpler way if possible.
Nah, you've gotta use functions. Make a function that gets triggered on the event when someone uploads a file. You'll get the userId who triggered it and the file data. Save the running total in a separate collection and check it on each upload.
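A minimal sketch of that per-user tally logic. The `usage` Map here stands in for the separate collection; in a real Appwrite Function you'd read and write the totals with the Databases API instead, triggered on the bucket's file create/delete events. The 100 MB quota and the function names are hypothetical:

```javascript
// Hypothetical per-user quota; tune to your needs.
const QUOTA_BYTES = 100 * 1024 * 1024;

// Stand-in for the separate collection: userId -> total bytes stored.
// A real function would persist this via the Appwrite Databases API.
const usage = new Map();

// Call this from the function triggered on file-create events.
// Returns true if the upload fits the quota; on false, the function
// would delete the just-uploaded file server-side to enforce the limit.
function recordUpload(userId, fileSizeBytes) {
  const current = usage.get(userId) ?? 0;
  if (current + fileSizeBytes > QUOTA_BYTES) {
    return false; // over quota: reject
  }
  usage.set(userId, current + fileSizeBytes);
  return true;
}

// Call this from the function triggered on file-delete events,
// so deletions free up quota again.
function recordDelete(userId, fileSizeBytes) {
  const current = usage.get(userId) ?? 0;
  usage.set(userId, Math.max(0, current - fileSizeBytes));
}
```

Keeping a running total this way means each upload check is a single document read rather than listing and summing every file in the bucket.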