
Large File Upload Issue with S3/RustFS

  • Self Hosted
  • Storage
_alnes_
7 May, 2026, 20:36

Hi, we are seeing a reproducible failure when uploading large files with Appwrite 1.8.0 using S3-compatible storage through RustFS.

An upload of a roughly 10.7 GB file consistently fails around the 1651st 5 MB chunk with a 500 error. The logs also show: Invalid document structure: Attribute "metadata" has invalid type. Value must be a valid string and no longer than 75000 chars.

Our current understanding is that Appwrite stores multipart upload state in the internal files.metadata field, and that this field becomes too large when many chunks are uploaded. We also checked the current state and found:

  • Appwrite 1.9.0 is out, but we could not find a clear fix for this in the release notes.
  • In the 1.9.0 source, files.metadata still seems to be limited to 75000 chars.
  • The 1.9.0 migration even appears to set bucket metadata to 65534.

So our question is: is this understanding actually correct, or are we missing something? If the upload logic is still the same, it looks like this issue would happen earlier in 1.9.0 rather than later.
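As a sanity check of this theory, the numbers in the thread are consistent with each other. The following sketch assumes Appwrite appends a roughly fixed-size bookkeeping entry to files.metadata per uploaded chunk; the per-chunk size is inferred from the reported numbers, not taken from Appwrite's source.

```python
# Back-of-envelope check: if each uploaded chunk adds a fixed-size entry to
# files.metadata, the 75000-char limit predicts failure near chunk 1651.
LIMIT_CHARS = 75_000     # "no longer than 75000 chars" from the error message
FAILING_CHUNK = 1651     # chunk at which the 500 error appears
CHUNK_MB = 5             # Appwrite uploads in 5 MB chunks

# Implied metadata growth per chunk (an inference, not a confirmed value)
chars_per_chunk = LIMIT_CHARS / FAILING_CHUNK
print(f"implied metadata per chunk: ~{chars_per_chunk:.1f} chars")  # ~45.4

# Largest upload that completes before the limit is hit
max_upload_gib = (FAILING_CHUNK - 1) * CHUNK_MB / 1024
print(f"implied max upload size: ~{max_upload_gib:.2f} GiB")  # ~8.06
```

If the per-chunk entry really is ~45 characters, a 65534-char limit in the 1.9.0 migration would indeed fail earlier, at around 1440 chunks (~7 GiB), matching the concern above.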

Is there an official fix or recommended workaround for this?

Thanks and best regards,
Matthias

TL;DR
Developers are experiencing large file upload failures with Appwrite 1.8.0 using S3-compatible storage through RustFS. The issue occurs around the 1651st 5 MB chunk due to the files.metadata field becoming too large. In Appwrite 1.9.0, the field limit still stands at 75000 characters. The recommended workaround is to split the large file into smaller parts to avoid hitting this limit.
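A minimal sketch of the splitting workaround, assuming the ~1650-chunk ceiling implied above (1650 chunks x 5 MB = 8.25 GB per object). The demo uses a tiny stand-in file so it runs quickly; for a real 10.7 GB file you would use something like `split -b 8000m` to keep a safe margin under the ceiling. File names here are illustrative.

```shell
# Create a 1 MiB stand-in for the large file (real case: your 10.7 GB file)
head -c 1048576 /dev/zero > demo.bin

# Split into parts small enough to stay under the chunk-count ceiling
# (demo: 256 KiB parts -> 4 parts; real case: split -b 8000m)
split -b 262144 demo.bin demo.part-

ls demo.part-*    # demo.part-aa demo.part-ab demo.part-ac demo.part-ad

# After downloading all parts, re-assemble and verify
cat demo.part-* > demo-reassembled.bin
cmp demo.bin demo-reassembled.bin && echo "parts re-assemble cleanly"
```

Each part is then uploaded as its own object, so no single object accumulates enough chunk metadata to exceed the 75000-char limit.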