Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1.) Convert the CSV to JSON using Pandas
2.) Loop over each JSON object and upload it using create_document() (rough sketch below)
I have over 50k rows of data, and the upload is taking way longer than I expected (two hours in, and only 15k records have been uploaded so far).
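Roughly, the loop looks like this (the file name is a placeholder, and create_document() is stubbed out here in place of the actual SDK client setup):

```python
import pandas as pd

def create_document(data: dict) -> None:
    """Stand-in for the Server SDK's per-document upload call."""
    ...

df = pd.read_csv("data.csv")              # placeholder file name
records = df.to_dict(orient="records")    # one dict per CSV row

for record in records:
    # One HTTP round trip per row: at 50k+ rows this serial loop is the bottleneck.
    create_document(record)
```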
I’ve read that batch upload is a common feature request, but all of the requests I found were made over a year ago.
For context: I’m using the Python Server SDK with an API key.
Any help will go a really long way!
Found this. Still hoping to see if anyone has found, or can walk me through, a more efficient solution.
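(For illustration, one commonly suggested interim workaround when no bulk endpoint is available is to keep the per-row calls but issue them concurrently. A minimal sketch, reusing the placeholder create_document() stub from above; the worker count is a guess and would need tuning against the server's rate limits.)

```python
import pandas as pd
from concurrent.futures import ThreadPoolExecutor, as_completed

def create_document(data: dict) -> None:
    """Stand-in for the same per-document SDK upload call as above."""
    ...

df = pd.read_csv("data.csv")
records = df.to_dict(orient="records")

# Keep several requests in flight at once instead of waiting on each
# round trip serially. max_workers=16 is only a starting point.
with ThreadPoolExecutor(max_workers=16) as pool:
    futures = [pool.submit(create_document, record) for record in records]
    for future in as_completed(futures):
        future.result()  # surface any failed uploads instead of swallowing them
```

This doesn't replace a true batch endpoint, but overlapping the network round trips usually cuts the wall-clock time substantially.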