
I have an application set up with Appwrite Cloud, and the same database and storage structure set up on Gitpod. I am seeing a 500 error when I upload a file to storage on Cloud, but the Gitpod instance uploads just fine. (I can also manually upload the file to Cloud in the Console, with no problems.) I haven't had any issues with other file uploads on Cloud. The error response on Cloud is very generic:
{
  "message": "Server Error",
  "code": 500,
  "type": "general_unknown",
  "version": "0.11.12"
}
Is there any way to see a more detailed error message? My project ID is 6423a55a17c8acba9dd6.

Does this issue still persist?

Yes

Where are you getting the error message?

Are you sure this is your project ID on Cloud?

6423a55a17c8acba9dd6, I just double checked that I was in Appwrite Cloud and not Gitpod

Weird... I don't see any errors related to this project

Maybe it's a cache/permissions issue 🤔

I'm pretty sure it's not a permissions issue, as I'm able to upload other files. How would I diagnose a cache issue?

By deleting browser cache

My file upload isn't directly from a file input. I'm creating a JS File object from a JSON object and passing that to the createFile() function. Could that have anything to do with it? It's still only 65 KB, well under my bucket's file size limit.

65 kB?

Yes, thereabouts. 65,237 bytes

What's the code?

storage.createFile(reportStorageId, ID.unique(), new File([JSON.stringify(days)], 'upload.json', { type: 'application/json' }))
days is an array of objects

Hmm this is the web sdk?

Yes, it's a React project

I'm not so sure this will work...it's supposed to be a selected file from the filesystem.
How big is the JSON?

This file is 65,237 bytes. Most of the files are between 30 and 60 KB. This is particularly strange, because larger files (one is 69 KB) have uploaded successfully in the past. And as I mentioned, this same file, which results in a 500 error, uploads fine in my Gitpod instance.

For the sake of clarity, my use-case is: the user uploads a .csv file. My app parses that file and generates a JSON object from its data. That JSON is uploaded to Appwrite Storage.
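A rough sketch of that pipeline, assuming a simple comma-separated file with a header row and no quoted fields (the helper name is hypothetical; a real parser would handle quoting and escaping):

```javascript
// Turn CSV text into an array of objects keyed by the header row.
function csvToObjects(csvText) {
  const [headerLine, ...rows] = csvText.trim().split("\n");
  const headers = headerLine.split(",");
  return rows.map((row) => {
    const cells = row.split(",");
    return Object.fromEntries(headers.map((h, i) => [h, cells[i]]));
  });
}

const csv = "date,total\n2023-01-01,12\n2023-01-02,7";
const days = csvToObjects(csv);
const payload = JSON.stringify(days); // this string becomes the uploaded JSON
```

This only shows the shape of the transformation; the real app parses the user's uploaded .csv before uploading the resulting JSON to Storage.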

Ummm, so is this upload happening server-side?

No, the JS runs on the client. Appwrite Cloud is the only server. Edit: okay, the React code is hosted on Netlify, but there's no server-side processing there.

Since I'm uploading files in a manner that isn't officially supported (or expected), I have switched to storing the JSON directly in my database collection. Through trial and error, I found that the max size of a string attribute is 1,073,741,824, so I check the length of the JSON string before attempting the write. I don't expect any JSON string to exceed that limit, however.
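The guard described there, as a minimal sketch. The 1,073,741,824 figure is the limit found by trial and error above; the function name is hypothetical:

```javascript
// Maximum string attribute size observed by trial and error.
const MAX_STRING_SIZE = 1_073_741_824;

// Check the serialized length before attempting the document write.
function canStoreAsString(data) {
  return JSON.stringify(data).length <= MAX_STRING_SIZE;
}

// Example: a small report easily fits under the limit.
const days = [{ date: "2023-01-01", total: 12 }];
const ok = canStoreAsString(days);
```

If the check passes, the JSON string goes into the document field instead of Storage.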

if you want to debug further...it would help if you shared the HTTP request (from the network logs of the browser dev tools) for both cloud and gitpod

apologies for the delay in getting back to you 🙈
