
Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1.) Convert the CSV to JSON using Pandas
2.) Loop over each JSON object and upload it with create_document()
I have over 50k rows of data, and this is taking far longer than I expected (two hours in, only 15k records have been uploaded).
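For reference, here's a minimal sketch of what my loop looks like (the endpoint, IDs, and file name are placeholders, not my real values, and I'm assuming a recent Python SDK where ID.unique() is available):

```python
import pandas as pd
from appwrite.client import Client
from appwrite.id import ID
from appwrite.services.databases import Databases

client = (
    Client()
    .set_endpoint("https://cloud.appwrite.io/v1")  # placeholder endpoint
    .set_project("<PROJECT_ID>")                   # placeholder project ID
    .set_key("<API_KEY>")                          # server API key
)
databases = Databases(client)

# Step 1: CSV -> list of JSON-like dicts via Pandas
rows = pd.read_csv("data.csv").to_dict(orient="records")

# Step 2: one create_document() call (one HTTP round trip) per row
for row in rows:
    databases.create_document(
        database_id="<DATABASE_ID>",
        collection_id="<COLLECTION_ID>",
        document_id=ID.unique(),
        data=row,
    )
```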
I’ve read that batch upload is a common feature request, but all of those requests were made over a year ago.
For context: I’m using the Python Server SDK with an API key.
Any help will go a really long way!

Found this. Still hoping someone has found, or can walk me through, a more efficient solution.
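In the meantime, the only speedup I can think of is issuing the requests in parallel with a thread pool, since each create_document() call is a blocking HTTP round trip. A rough sketch of that idea (the worker count is a guess, the IDs are placeholders, and Appwrite's rate limits may still throttle this):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

import pandas as pd
from appwrite.client import Client
from appwrite.id import ID
from appwrite.services.databases import Databases

def make_databases() -> Databases:
    # Build a fresh client per call: cheap relative to the network
    # round trip, and avoids sharing one client across threads.
    client = (
        Client()
        .set_endpoint("https://cloud.appwrite.io/v1")  # placeholder
        .set_project("<PROJECT_ID>")
        .set_key("<API_KEY>")
    )
    return Databases(client)

def upload(row: dict) -> None:
    make_databases().create_document(
        database_id="<DATABASE_ID>",
        collection_id="<COLLECTION_ID>",
        document_id=ID.unique(),
        data=row,
    )

rows = pd.read_csv("data.csv").to_dict(orient="records")

# 8 workers is a guess; tune against your plan's rate limits.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(upload, row) for row in rows]
    for future in as_completed(futures):
        future.result()  # re-raise any upload error here
```

Chunking the rows and pausing between chunks might also help avoid tripping rate limits, but I haven't tested any of this against 50k rows yet.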
