
Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1.) Convert the CSV to JSON using pandas
2.) Loop over each JSON object and upload it using create_document() (rough sketch below)
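For reference, the loop is essentially the following (a minimal sketch of the workflow above, assuming a recent Python Server SDK where the service is Databases; the endpoint, IDs, and filename are placeholders):

```python
import pandas as pd
from appwrite.client import Client
from appwrite.id import ID
from appwrite.services.databases import Databases

# Server-side client; endpoint, project, and key are placeholders.
client = Client()
client.set_endpoint("https://cloud.appwrite.io/v1")
client.set_project("<PROJECT_ID>")
client.set_key("<API_KEY>")
databases = Databases(client)

# Step 1: CSV -> list of dicts (one dict per row) via pandas.
records = pd.read_csv("data.csv").to_dict(orient="records")

# Step 2: one create_document() call per row -- a full HTTP round trip
# every time, which is why 50k rows takes so long.
for row in records:
    databases.create_document(
        database_id="<DATABASE_ID>",
        collection_id="<COLLECTION_ID>",
        document_id=ID.unique(),
        data=row,
    )
```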
I have over 50k rows of data, and this is taking way longer than I expected: after two hours only ~15k records have been uploaded, which works out to roughly two documents per second.
I’ve read that batch upload is a common feature request, but all of those requests were made over a year ago.
For context: I’m using the Python Server SDK with an API key.
Any help will go a really long way!

Found this. Still hoping someone has found, or can walk me through, a more efficient solution.
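In the meantime, the best workaround I can think of is overlapping the per-row calls with a thread pool; rough sketch below (plain client-side concurrency, not true batching; it continues from the snippet above, MAX_WORKERS is a guess, and it assumes the SDK client tolerates concurrent requests):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

MAX_WORKERS = 10  # arbitrary; too high and you may hit rate limits

def upload_row(row):
    # Same per-row call as in the loop above.
    return databases.create_document(
        database_id="<DATABASE_ID>",
        collection_id="<COLLECTION_ID>",
        document_id=ID.unique(),
        data=row,
    )

# Overlap the HTTP round trips instead of running them strictly one by one.
with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
    futures = [pool.submit(upload_row, row) for row in records]
    for future in as_completed(futures):
        future.result()  # re-raise any per-row errors here
```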
