I would like to zip a large number of files (several gigabytes) from a bucket and place that zip into another bucket via Functions. How can I do this without using a lot of server memory?
Zip Gigabytes of files with functions
This would be very tricky. On Cloud it probably won't work because of the function's storage size limitation; on a self-hosted instance it is possible.
It's not possible to create the zip and stream it directly into the bucket as you write it.