
Currently I'm using local storage to store all the files users upload. I've thought about moving those files to an external storage bucket provider, like S3 or Backblaze. Is it possible to move all the files in the future, and if so, how?

If you run
docker volume inspect appwrite_appwrite-uploads | grep Mountpoint
you'll see something like this:
"Mountpoint": "/var/lib/docker/volumes/appwrite_appwrite-uploads/_data",
Inside you'll have a folder for each app, and inside that a folder for each of your buckets.
When switching to a new driver like S3, for example, copy all the contents of that folder and put them as-is into the S3 bucket.
To validate the right path, upload one file using the new provider, then push the uploads' contents following that same structure.
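The copy step above can be sketched roughly like this. To keep it safe to run, the snippet simulates the volume and the bucket with temp directories; the project/bucket folder names are made up, and the real upload would use a tool like the AWS CLI (shown commented out, with a hypothetical bucket name):

```shell
# Stand-ins for the real paths (assumptions, for illustration only):
SRC=$(mktemp -d)   # would be /var/lib/docker/volumes/appwrite_appwrite-uploads/_data
DST=$(mktemp -d)   # would be the S3 bucket

# Typical layout per the answer above: one folder per app,
# then one folder per storage bucket (IDs here are invented).
mkdir -p "$SRC/default/64a1b2c3d4e5"
echo "file-content" > "$SRC/default/64a1b2c3d4e5/64f0a1b2.png"

# Copy everything as-is, preserving the folder structure.
cp -r "$SRC/." "$DST/"

# Real run against S3 (requires the AWS CLI configured;
# "my-appwrite-uploads" is a hypothetical bucket name):
# aws s3 sync "$SRC" "s3://my-appwrite-uploads/"
```

The key point is that the folder hierarchy is mirrored verbatim; nothing is renamed or flattened on the way into the bucket.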

@Binyamin Thanks as always! :appwritepeepo:

[SOLVED] Move storage buckets to external provider (like S3)

Can I transfer a file when executing a function?

Easiest way is to transfer the file first, then execute the function.
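A minimal sketch of that ordering, with stub shell functions standing in for the real calls (the commented Appwrite CLI command names are assumptions, check `appwrite --help` for the actual syntax):

```shell
# Stub: stands in for the real upload, e.g. roughly
# appwrite storage createFile --bucketId "$1" --fileId unique() --file "$2"
upload_file() {
  echo "uploaded $2 to bucket $1"
}

# Stub: stands in for the real execution, e.g. roughly
# appwrite functions createExecution --functionId "$1"
run_function() {
  echo "executed $1"
}

# 1. Transfer the file first...
upload_file "my-bucket" "invoice.pdf"
# 2. ...then execute the function that reads it.
run_function "process-file"
```

The point is only the sequencing: the function should not run until the file it depends on has finished uploading.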
