
I have two computers I work on: one Windows PC and one Mac. My problem is that I want to sync my Appwrite project between these two computers. I am not using Appwrite Cloud, I am self-hosting. Is there any way to share the Docker volumes, or something else in Docker, so that I don't have to add new collections, attributes, and so on manually every time I switch computers? PS: For the last two months I have been working on my Mac, so having a way to just copy the project and import it on my other PC would save me a ton of time.

You would need to store the data in some shared file location, or make it so that one system can mount the other, but honestly, that's messy and not worth it.

I have actually found a solution. Docker recently launched extensions for Docker Desktop, and there is an extension that can export Docker volumes (where all the data is stored) to .tar files. I did this with every Appwrite volume, copied the files over with Logitech Flow (alternatively you can use GitHub), and imported them into the other Docker Desktop with the same extension. The whole thing only takes about 10 minutes. Worth it.
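For anyone who prefers scripting this over the extension, here is a minimal sketch of the same idea using the Docker SDK for Python (docker-py). The `appwrite` name prefix and the backup folder are assumptions; adjust them to match your setup, and stop the Appwrite stack first so nothing writes to the volumes mid-copy.

```python
from pathlib import Path
import docker

# Export every Appwrite volume to a .tar.gz file by running a throwaway Alpine
# container that mounts the volume read-only and a local backup folder.
client = docker.from_env()
backup_dir = Path("appwrite-volume-backups").resolve()
backup_dir.mkdir(exist_ok=True)

for volume in client.volumes.list():
    # Assumption: the self-hosted Appwrite volumes share an "appwrite" prefix.
    if not volume.name.startswith("appwrite"):
        continue
    client.containers.run(
        "alpine",
        ["tar", "czf", f"/backup/{volume.name}.tar.gz", "-C", "/data", "."],
        volumes={
            volume.name: {"bind": "/data", "mode": "ro"},
            str(backup_dir): {"bind": "/backup", "mode": "rw"},
        },
        remove=True,  # clean up the helper container afterwards
    )
```

On the other machine the import is just the reverse: create the volumes, mount the folder with the tarballs, and extract each archive into `/data` before starting the stack again.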

This scenario is more like migrating data between instances. For that, I recommend creating a backup from one instance and restoring it to the other: https://www.youtube.com/watch?v=lM5yZEPtlvg
The problem with the volume route is that there is a small chance of corrupting your database data.
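If you want to script the database side of that backup instead of copying raw volume files, here is a rough sketch with the Docker SDK for Python. The container name `appwrite-mariadb` and the `_APP_DB_ROOT_PASS` variable are assumptions based on a default self-hosted setup; check your docker-compose.yml and .env for the actual values.

```python
import os
import docker

# Take a logical dump of Appwrite's MariaDB instead of copying InnoDB files,
# which avoids the corruption risk of copying a live database's volume.
client = docker.from_env()
db = client.containers.get("appwrite-mariadb")        # assumed container name
root_pass = os.environ.get("_APP_DB_ROOT_PASS", "")   # assumed env var name

exit_code, output = db.exec_run(
    ["mysqldump", "-u", "root", f"-p{root_pass}", "--all-databases"]
)
if exit_code != 0:
    raise RuntimeError(output.decode(errors="replace"))

with open("appwrite-db-backup.sql", "wb") as f:
    f.write(output)
```

Restoring is the reverse: feed the .sql file to the mysql client inside the target instance's MariaDB container before bringing the rest of the stack up.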

Luckily it was only dev, not prod xD But thank you very much for the link, I will do it like this in the future.

[SOLVED] Share Appwrite Docker across computers