Hi guys, I need to import 20 million documents into the database. I'm currently running scripts that insert the documents one by one; after a week I've only managed to import 1 million because I keep hitting rate limits.
What are my options for populating my collections faster? I've seen mentions that the rate limit only applies per connection, and that I could open multiple connections to raise the effective limit. In practice, how do you implement that?
Bulk inserting is on the roadmap for 1.7; until then, I think that's all you can do. Depending on how you're inserting the documents, you could look at doing concurrent insertions in your code.
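A minimal sketch of the concurrent-insertion idea, assuming a synchronous insert call. `insert_document` here is a hypothetical stand-in for your SDK's create-document call (for Appwrite, that would be the Databases service's create-document method); the rate-limit simulation and retry delays are illustrative, not real server behavior:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed


class RateLimitError(Exception):
    """Simulated rate-limit response (an HTTP 429 from a real API)."""


def insert_document(doc):
    # Hypothetical stand-in: replace with your SDK's create-document call.
    # Randomly fail ~10% of the time to simulate hitting a rate limit.
    if random.random() < 0.1:
        raise RateLimitError()
    return doc["id"]


def insert_with_retry(doc, max_retries=5, base_delay=0.05):
    # Retry rate-limited inserts with exponential backoff.
    delay = base_delay
    for _ in range(max_retries):
        try:
            return insert_document(doc)
        except RateLimitError:
            time.sleep(delay)
            delay *= 2
    raise RuntimeError(f"gave up on {doc['id']} after {max_retries} retries")


def bulk_insert(docs, workers=8):
    # Run many inserts concurrently instead of one by one.
    inserted = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(insert_with_retry, d) for d in docs]
        for fut in as_completed(futures):
            inserted.append(fut.result())
    return inserted


if __name__ == "__main__":
    random.seed(0)
    docs = [{"id": f"doc-{i}"} for i in range(100)]
    print(f"inserted {len(bulk_insert(docs))} documents")
```

The worker count is a tuning knob: raise it until you start seeing rate-limit errors dominate, then back off. The same pattern works with multiple client connections if the limit really is per connection.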