Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1.) Convert the CSV to JSON using pandas
2.) Loop over each JSON object and upload it with create_document()
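In case it helps, here's roughly what my loop looks like. I've simplified it so it runs standalone: the `create_document()` here is a stub standing in for `databases.create_document(...)` from the Appwrite Python Server SDK, and the inline CSV replaces the file I actually read with pandas (`pd.read_csv(...).to_dict(orient="records")`).

```python
import csv
import io

# Stub standing in for the real SDK call:
# databases.create_document(database_id, collection_id, document_id, data)
uploaded = []

def create_document(data):
    uploaded.append(data)

# Inline CSV for brevity; the real script loads a 50k-row file via pandas.
csv_text = "name,score\nalice,10\nbob,20\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Step 2: one create_document() call per row -- one HTTP round trip each,
# which I suspect is why 50k rows is taking so long.
for row in rows:
    create_document(row)

print(len(uploaded))  # → 2
```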
I have over 50k rows of data, and the upload is taking far longer than I expected (two hours in and only 15k records have been uploaded).
I’ve read that batch upload is a common feature request, but all of those requests were made over a year ago.
For context: I’m using the Python Server SDK and API key.
Any help will go a really long way!
Found this. Still, I'd like to see if anyone has found, or can walk me through, a more efficient solution.
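One thing I've been considering is fanning the calls out over a thread pool, since each `create_document()` is network-bound and the round trips should overlap. This is only a sketch under assumptions: the `create_document()` below is a stub for the real Appwrite SDK call, and the worker count (and any server-side rate limits) would need tuning against a real project.

```python
import csv
import io
import threading
from concurrent.futures import ThreadPoolExecutor, as_completed

# Stub standing in for databases.create_document(...) from the Appwrite
# Python Server SDK; the real call is network-bound, so threads overlap
# the per-request latency even under the GIL.
uploaded = []
lock = threading.Lock()

def create_document(data):
    with lock:  # the stub shares a list, so guard it across threads
        uploaded.append(data)

# 100 synthetic rows in place of the real 50k-row CSV.
csv_text = "name,score\n" + "\n".join(f"user{i},{i}" for i in range(100))
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Submit one upload per row to a small worker pool instead of a serial loop.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(create_document, row) for row in rows]
    for f in as_completed(futures):
        f.result()  # re-raise any per-row exception instead of losing it

print(len(uploaded))  # → 100
```

No idea yet whether the bottleneck is on my end or the server's, so whether this actually helps is an open question.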