Hi guys, I need to import 20 million documents into the database. I'm currently running scripts that add the documents one by one. It has been a week and I've only managed to import 1 million because I keep hitting rate limits.
What are my options for populating my collections faster? I've seen mentions that the rate limit only applies per connection and that I could use multiple connections to raise it. In practice, how do you implement that?
Bulk inserting is on the roadmap for 1.7; other than that, I think this is all you can do. Depending on how you're inserting the documents, you can look at doing concurrent insertions in your code.
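As a rough sketch of what "concurrent insertions" could look like: a bounded worker pool keeps several inserts in flight, and each worker backs off when it hits a rate limit. Note that `insert_document` here is a hypothetical stand-in for whatever SDK call you're actually using, and the exception type you catch should be your SDK's rate-limit error, not the generic one used below.

```python
import concurrent.futures
import time

MAX_RETRIES = 5

def insert_document(doc):
    # Stand-in for your real SDK call (e.g. a create-document request).
    # Here it just simulates a successful insert and returns the id.
    return doc["id"]

def insert_with_retry(doc):
    # Retry with exponential backoff when the server signals a rate limit.
    for attempt in range(MAX_RETRIES):
        try:
            return insert_document(doc)
        except RuntimeError:  # replace with your SDK's rate-limit exception
            time.sleep(2 ** attempt)
    raise RuntimeError("gave up after repeated rate-limit errors")

def bulk_insert(docs, workers=8):
    # A bounded thread pool keeps many inserts in flight without
    # opening an unbounded number of connections at once.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(insert_with_retry, docs))

if __name__ == "__main__":
    docs = [{"id": str(i)} for i in range(100)]
    inserted = bulk_insert(docs)
    print(len(inserted))
```

Tune `workers` against the rate limit you're seeing; past a point, more concurrency just means more backoff.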