
Hi guys, I need to import 20 million documents into the database. I'm currently running scripts that add the documents one by one. It's been a week and I've only managed to import 1 million because I keep hitting rate limits.
What are my options for populating my collections faster? I've seen mentions that the rate limit only applies per connection and that I could use multiple connections to increase it. In practice, how do you implement that?

Bulk inserting is on the roadmap for 1.7; other than that, I think this is all you can do. Depending on how you're inserting the documents, you could look at doing concurrent insertions in your code.
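A minimal sketch of what concurrent insertions could look like in Python, using a thread pool. The `insert_document` function here is a placeholder that simulates network latency; in a real script it would wrap a single SDK call (e.g. Appwrite's createDocument), and you'd want retry-with-backoff logic where the bare `except` is:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

# Hypothetical insert function: stands in for one createDocument call.
def insert_document(doc):
    time.sleep(0.001)  # simulate network round-trip
    return doc["id"]

def bulk_insert(docs, workers=8):
    """Insert documents concurrently; return the ids that succeeded."""
    inserted = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(insert_document, d): d for d in docs}
        for fut in as_completed(futures):
            try:
                inserted.append(fut.result())
            except Exception:
                # In a real script: log the failure and retry with
                # exponential backoff on rate-limit errors.
                pass
    return inserted

docs = [{"id": str(i), "title": f"doc {i}"} for i in range(100)]
print(len(bulk_insert(docs)))  # 100
```

Note this only helps with per-request latency; if the server enforces a per-connection or per-IP rate limit, you'll still hit it and need backoff, a higher limit, or the upcoming bulk API.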
