
1M+ CSV row data to DB

  • Databases
  • Cloud
DeFacedFace
24 Dec, 2024, 11:20

I have a CSV that I intend to add to a collection in my database, but that CSV has millions of rows.

Adding each column and its rows under an attribute using the API would get me rate limited, as it adds them one by one. What path do I take?

TL;DR
Developers discuss how to efficiently add 1M+ rows from a CSV file to a database using Appwrite. Suggestions include using async operations rather than GraphQL, adding rows in batches instead of individually to avoid rate limits, and using the Appwrite server SDK with an API key to prevent rate limiting.
Joshi
24 Dec, 2024, 11:25

Use the server SDK and an API key; this way you won't be rate limited
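
For reference, a minimal sketch of Joshi's suggestion, assuming the Node.js server SDK (node-appwrite); the endpoint, project ID, and API key below are placeholders:

```ts
import { Client, Databases } from 'node-appwrite';

// A server-side client authenticated with an API key is not subject
// to the per-user rate limits that client SDK requests hit.
const client = new Client()
  .setEndpoint('https://cloud.appwrite.io/v1') // or your self-hosted endpoint
  .setProject('<PROJECT_ID>')                  // placeholder project ID
  .setKey('<API_KEY>');                        // placeholder key with write access

const databases = new Databases(client);
```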

DeFacedFace
24 Dec, 2024, 11:30

Hmm, didn't know that... so the only thing I would have to deal with is the time it takes to add them all

Joshi
24 Dec, 2024, 11:32

Add them in batches of like 50 or so instead of awaiting all of them individually

DeFacedFace
24 Dec, 2024, 11:38

Appwrite doesn't support batches... I think this is a requested feature. Or do you mean multiple threads?

Joshi
24 Dec, 2024, 11:39

No, Appwrite does not support bulk updates. It does in the next release, though. What I meant was to not wait for the result of each createDocument individually, but to run them concurrently
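
A rough sketch of the concurrent batching Joshi describes, again assuming node-appwrite; the database and collection IDs are placeholders, and `rows` stands in for the parsed CSV rows, however you produce them:

```ts
import { Client, Databases, ID } from 'node-appwrite';

const client = new Client()
  .setEndpoint('https://cloud.appwrite.io/v1') // placeholder endpoint
  .setProject('<PROJECT_ID>')                  // placeholder
  .setKey('<API_KEY>');                        // placeholder

const databases = new Databases(client);

const BATCH_SIZE = 50;

// Insert rows in batches: each batch's createDocument calls run
// concurrently, and we await the whole batch before starting the next,
// rather than awaiting every document one by one.
async function importRows(rows: Record<string, any>[]): Promise<void> {
  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    await Promise.all(
      batch.map((row) =>
        databases.createDocument('<DATABASE_ID>', '<COLLECTION_ID>', ID.unique(), row)
      )
    );
    console.log(`Imported ${Math.min(i + BATCH_SIZE, rows.length)}/${rows.length} rows`);
  }
}
```

For millions of rows you would likely stream the CSV (e.g. with a parser like csv-parse) rather than load it all into memory, feeding each chunk through the same batching loop.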

DeFacedFace
24 Dec, 2024, 11:44

Gotcha! Async it is. I think I saw a mention somewhere that it's possible with GraphQL, but I've never touched it

Joshi
24 Dec, 2024, 11:45

You do not need to use GraphQL for that.
