I have a collection of over 400,000 documents that I need to update daily from a CSV file. Most days only about 1% of the data actually changes, so I don't want to rewrite everything. Our clients depend on this collection, mainly for full-text searches, so I also want to keep downtime to a minimum. If you have ideas on how to do this efficiently, please share them.
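To make the "only update what changed" idea concrete, here is a minimal sketch of the kind of delta import I mean. It assumes a MongoDB collection accessed via pymongo; the connection string, database/collection names, the `external_id` field, the `id` CSV column, and the file name are all placeholders, not our real setup. The idea is to store a content hash on each document, hash each incoming CSV row, and bulk-upsert only the rows whose hash differs.

```python
# Delta-update sketch: hash each CSV row, compare with the stored hash,
# and bulk-upsert only changed or new rows (assumes MongoDB via pymongo;
# all names below are placeholders).
import csv
import hashlib

from pymongo import MongoClient, UpdateOne

client = MongoClient("mongodb://localhost:27017")   # placeholder connection string
coll = client["mydb"]["documents"]                  # placeholder db/collection names

# Load only ids and stored hashes; a small projection is fine for ~400k docs.
existing = {
    doc.get("external_id"): doc.get("row_hash")
    for doc in coll.find({}, {"external_id": 1, "row_hash": 1})
}

ops = []
with open("daily_export.csv", newline="", encoding="utf-8") as f:  # placeholder file
    for row in csv.DictReader(f):
        # Deterministic hash of the row contents (sorted keys for stability).
        row_hash = hashlib.sha256(
            "|".join(f"{k}={row[k]}" for k in sorted(row)).encode("utf-8")
        ).hexdigest()
        if existing.get(row["id"]) == row_hash:
            continue  # unchanged row: no write, no index churn
        ops.append(
            UpdateOne(
                {"external_id": row["id"]},
                {"$set": {**row, "row_hash": row_hash}},
                upsert=True,
            )
        )
        if len(ops) >= 1_000:                # flush in batches to bound memory
            coll.bulk_write(ops, ordered=False)
            ops.clear()

if ops:
    coll.bulk_write(ops, ordered=False)
```

With only ~1% of rows changing, this touches a few thousand documents instead of 400k, which should also limit full-text index churn, but I'd welcome better patterns (staging collections, swap-on-import, etc.).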
TL;DR
Title: Efficiently importing and updating 400k+ documents
Message: Only about 1% of the data changes each day, and the collection backs our clients' full-text searches, so I'm looking for an import/update approach that avoids rewriting everything and keeps downtime to a minimum. Any suggestions for an efficient solution?