
I have a collection of over 400,000 documents that I need to update daily from a CSV file. Typically only about 1% of the data changes, so I'd rather not rewrite everything. This collection is critical for our clients, mainly for full-text search, so I want to keep downtime to a minimum. If you have any ideas on how to do this efficiently, please share them.
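One possible approach (a minimal sketch, not a definitive solution): keep yesterday's CSV, diff it against today's by hashing each row's content, and update only the rows whose hash changed. This assumes each row has a stable ID column (called `id` here, which is an assumption) that maps to a document in the collection.

```python
import csv
import hashlib
import io

def row_hash(row: dict) -> str:
    # Hash the row's values in a stable key order so the digest
    # does not depend on column ordering in the CSV.
    payload = "\x1f".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def changed_rows(old_csv: str, new_csv: str, id_col: str = "id"):
    # Map id -> content hash from yesterday's export.
    old = {r[id_col]: row_hash(r) for r in csv.DictReader(io.StringIO(old_csv))}
    # Yield only new or modified rows: the ~1% actually worth updating.
    for r in csv.DictReader(io.StringIO(new_csv)):
        if old.get(r[id_col]) != row_hash(r):
            yield r

# Example: row 2 was modified and row 3 is new, so only those are returned.
old = "id,title\n1,alpha\n2,beta\n"
new = "id,title\n1,alpha\n2,BETA\n3,gamma\n"
updates = list(changed_rows(old, new))
```

Since only the changed documents are touched, the collection stays online and searchable throughout; you would then push each yielded row to the database as an individual update rather than re-importing the full 400,000 documents.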
