I have a collection of over 400,000 documents that I need to update daily using a CSV file. Most of the time, only about 1% of the data changes, so I don't want to update everything. This collection is essential for our clients, mainly for full-text searches, so I want to keep downtime to a minimum. If you have any ideas on how to do this efficiently, please share them.
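For concreteness, the incremental-update idea can be sketched as: store a content hash per document, then on each daily import hash every CSV row and only write the rows whose hash changed. This is a minimal sketch, not Appwrite-specific; all field names (`id`) and the `stored_hashes` lookup are hypothetical placeholders for however you persist the previous state:

```python
import csv
import hashlib
import io

def row_hash(row: dict) -> str:
    # Stable hash of a CSV row's contents (order-independent via sorted keys).
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def diff_rows(csv_text: str, stored_hashes: dict) -> list:
    # Return only the rows whose content hash differs from the stored hash,
    # so roughly 1% of 400,000 rows get written instead of all of them.
    changed = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        h = row_hash(row)
        if stored_hashes.get(row["id"]) != h:
            changed.append((row, h))
    return changed
```

Each changed row would then be applied as an individual document update against the live collection (and its new hash saved), so full-text search stays available throughout instead of requiring a bulk reload.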