I have a collection of over 400,000 documents that I need to update daily using a CSV file. Most of the time, only about 1% of the data changes, so I don't want to update everything. This collection is essential for our clients, mainly for full-text searches, so I want to keep downtime to a minimum. If you have any ideas on how to do this efficiently, please share them.
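To make the question a bit more concrete, here is roughly the diff-based approach I had in mind: hash every CSV row, compare the hashes against a snapshot saved from the previous run, and only touch the documents whose rows actually changed. This is only a sketch, and the file names, the "id" column, and the `apply_update()` helper are placeholders rather than anything from my real setup.

```python
# Rough sketch of an incremental update (stdlib only).
# Assumptions: the CSV has a unique "id" column, and apply_update() is a
# placeholder for whatever actually writes the change to the collection.
import csv
import hashlib
import json
from pathlib import Path

CSV_PATH = Path("daily_export.csv")        # today's full export (assumed name)
SNAPSHOT_PATH = Path("row_hashes.json")    # hashes saved by the previous run

def row_hash(row: dict) -> str:
    """Stable hash of a CSV row, independent of column order."""
    canonical = json.dumps(row, sort_keys=True, ensure_ascii=False)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def apply_update(row: dict) -> None:
    """Placeholder: push a single changed row to the collection here."""
    print(f"would update document {row['id']}")

def main() -> None:
    # Hashes from the previous day's CSV; empty on the first run.
    previous = json.loads(SNAPSHOT_PATH.read_text()) if SNAPSHOT_PATH.exists() else {}

    current = {}
    changed = []
    with CSV_PATH.open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            h = row_hash(row)
            current[row["id"]] = h
            # Only rows that are new or whose hash differs need an update.
            if previous.get(row["id"]) != h:
                changed.append(row)

    for row in changed:
        apply_update(row)

    # Persist today's hashes so tomorrow's run only sees the diff.
    SNAPSHOT_PATH.write_text(json.dumps(current))
    print(f"{len(changed)} of {len(current)} rows needed an update")

if __name__ == "__main__":
    main()
```

The idea is that since only the ~1% of documents that actually changed get written, the collection stays online and searchable throughout the update. I'm open to better approaches if there's a more idiomatic way to do this.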