
Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1. Convert the CSV to JSON using Pandas
2. Loop over each JSON object and upload it with create_document()
I have over 50k rows of data, and the upload is taking far longer than I expected (two hours in, and only 15k records have been uploaded).
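For reference, my workflow is roughly the sketch below. I've used the stdlib `csv` module and an inline sample so the snippet is self-contained (my real script builds the records with pandas' `df.to_dict("records")`), and `upload_one` is a hypothetical stand-in for the SDK's `create_document()` call:

```python
import csv
import io

# Inline sample standing in for the real 50k-row CSV file.
CSV_TEXT = """name,age
alice,30
bob,25
"""

def upload_one(record):
    """Hypothetical stand-in for databases.create_document(...).
    In the real script this is one network round trip per row."""
    return record

def upload_serial(csv_text):
    # Equivalent to pandas: pd.read_csv(...).to_dict("records")
    records = list(csv.DictReader(io.StringIO(csv_text)))
    # One blocking call per row: with 50k rows, per-request
    # latency dominates the total runtime.
    return [upload_one(r) for r in records]

print(len(upload_serial(CSV_TEXT)))  # → 2
```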
I’ve read that batch upload is a common feature request, but all of those requests were made over a year ago.
For context: I’m using the Python Server SDK and API key.
Any help will go a really long way!
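One workaround I'm considering, in case it helps others: since each upload is dominated by network wait rather than CPU, fanning the per-document calls out over a thread pool should overlap the round trips. A minimal sketch, assuming the SDK call is synchronous and safe to invoke from worker threads (`upload_one` is again a hypothetical stand-in for `create_document()`; error handling kept minimal):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def upload_one(record):
    """Hypothetical stand-in for databases.create_document(...)."""
    return record["id"]

def upload_concurrent(records, max_workers=10):
    """Fan the per-row uploads out over a thread pool.

    Each call is mostly network latency, so N workers should cut
    wall-clock time by roughly N-fold versus the serial loop.
    Rows that raise are collected so they can be retried."""
    done, failed = [], []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(upload_one, r): r for r in records}
        for fut in as_completed(futures):
            try:
                done.append(fut.result())
            except Exception:
                failed.append(futures[fut])
    return done, failed

records = [{"id": i} for i in range(100)]
done, failed = upload_concurrent(records)
print(len(done), len(failed))  # → 100 0
```

Worth checking the platform's rate limits and connection handling before raising `max_workers` much higher.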

Found this. Still hoping to hear whether anyone has found, or can walk me through, a more efficient solution.
