Hello - I’m currently uploading data from a CSV to a collection in my database.
My current workflow is as follows:
1.) Convert CSV to JSON using Pandas
2.) Loop over each JSON object and upload using create_document() (a sketch of this loop is below)
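In case it helps, here's roughly what that looks like right now. This is a minimal sketch: the endpoint, project, database, and collection IDs are placeholders, and I'm assuming a recent SDK version where create_document() takes database_id/collection_id and ID.unique() is available.

```python
import pandas as pd
from appwrite.client import Client
from appwrite.id import ID
from appwrite.services.databases import Databases

# Placeholder credentials/IDs -- swap in real values.
client = (
    Client()
    .set_endpoint("https://cloud.appwrite.io/v1")
    .set_project("PROJECT_ID")
    .set_key("API_KEY")
)
databases = Databases(client)

# Step 1: CSV -> list of dicts (one per row) via Pandas.
df = pd.read_csv("data.csv")
rows = df.to_dict(orient="records")

# Step 2: one create_document() call per row.
# Each iteration is a full HTTP round trip, which I suspect is the bottleneck.
for row in rows:
    databases.create_document(
        database_id="DATABASE_ID",
        collection_id="COLLECTION_ID",
        document_id=ID.unique(),
        data=row,
    )
```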
I have over 50k rows of data, and it's taking way longer than I expected: two hours in, only 15k records have been uploaded, which works out to roughly 2 documents per second.
I’ve read that batch upload is a common feature request, but all of those requests were made over a year ago.
For context: I’m using the Python Server SDK and API key.
Any help will go a really long way!
Found this. Still hoping someone has found, or can walk me through, a more efficient solution.
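One idea I've been toying with in the meantime is parallelising the requests client-side with a thread pool, along these lines. This is an untested sketch: MAX_WORKERS is a guess, and I don't actually know whether the SDK client is thread-safe (if it isn't, creating one client per worker would be the safer variant).

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

MAX_WORKERS = 10  # assumption: conservative concurrency to stay under rate limits

def upload_row(row):
    # Reuses `databases`, `ID`, and the placeholder IDs from the loop above.
    return databases.create_document(
        database_id="DATABASE_ID",
        collection_id="COLLECTION_ID",
        document_id=ID.unique(),
        data=row,
    )

with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
    futures = [pool.submit(upload_row, row) for row in rows]
    for future in as_completed(futures):
        future.result()  # surface any failed upload instead of silently dropping it
```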