
Mmm, this one is 20K long, meaning it's more than the 8192 chars limit.

yes, ChatGPT says 10K


With this amount of data you'll need to choose a different approach.

You'll need to use the Appwrite Storage module.
- Upload the file to storage using createFile: https://appwrite.io/docs/server/storage?sdk=web-default#storageCreateFile
- In the function, fetch the file using getFile: https://appwrite.io/docs/server/storage?sdk=web-default#storageGetFile
- Then, just delete it: https://appwrite.io/docs/server/storage?sdk=web-default#storageDeleteFile (rough sketch of the whole flow below)
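A rough sketch of how that flow could look, assuming the Appwrite 1.x Web SDK on the client and node-appwrite inside a Node function; the bucket, project and function IDs are placeholders, and bigBase64String stands for your payload. On the client:

import { Client, Storage, Functions, ID } from 'appwrite';

const client = new Client().setEndpoint('https://<your-host>/v1').setProject('<project-id>');
const storage = new Storage(client);
const functions = new Functions(client);

// 1. Upload the big payload as a file instead of passing it as execution data.
const blob = new File([bigBase64String], 'payload.txt', { type: 'text/plain' });
const uploaded = await storage.createFile('<bucket-id>', ID.unique(), blob);

// 2. Only pass the (small) file ID to the function.
await functions.createExecution('<function-id>', JSON.stringify({ fileId: uploaded.$id }));

And inside the function (pre-1.4 Node runtime, where the data arrives as req.payload), using getFileDownload to pull the actual contents, since getFile only returns the metadata:

const sdk = require('node-appwrite');

module.exports = async (req, res) => {
  const { fileId } = JSON.parse(req.payload || '{}');

  const client = new sdk.Client()
    .setEndpoint(req.variables['APPWRITE_ENDPOINT'])           // assumed to be set by you as function variables
    .setProject(req.variables['APPWRITE_FUNCTION_PROJECT_ID'])
    .setKey(req.variables['APPWRITE_API_KEY']);
  const storage = new sdk.Storage(client);

  // 3. Fetch the file contents, then delete the file once it's no longer needed.
  const contents = await storage.getFileDownload('<bucket-id>', fileId); // Buffer/ArrayBuffer depending on SDK version
  await storage.deleteFile('<bucket-id>', fileId);

  res.json({ length: contents.byteLength ?? contents.length });
};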

Or

Okay, I feared that. I tried to find a solution without storage.

You can divide the Base64 string into 8000-char chunks, then in the function you can try to do maybe something like this:
for (let i = 0; i < chunks.length; i++) {
  // Flag the last chunk so the function knows when it has everything; send synchronously to keep the order.
  const payload = JSON.stringify({ lastChunk: i === chunks.length - 1, chunk: chunks[i] });
  await this.appwrite.functions.createExecution(functionId, payload, false);
}
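For the splitting itself a plain slice loop is enough; 8000 leaves a bit of headroom under the 8192-char limit for the JSON wrapper around each chunk (base64String is a placeholder for your data):

// Split the Base64 string into pieces that stay under the execution data limit.
const chunkSize = 8000;
const chunks = [];
for (let i = 0; i < base64String.length; i += chunkSize) {
  chunks.push(base64String.slice(i, i + chunkSize));
}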

It's possible but will require some extra work from your side.

And it's actually a good option if you want to avoid storage.
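On the receiving side, the function has to collect the chunks until lastChunk arrives. A minimal sketch, assuming the pre-1.4 Node runtime where the data shows up in req.payload; note the module-level buffer only survives while the executions land on the same warm runtime container, so for anything reliable you'd persist the chunks in a collection instead:

let buffer = [];

module.exports = async (req, res) => {
  const { lastChunk, chunk } = JSON.parse(req.payload || '{}');
  buffer.push(chunk);

  if (!lastChunk) {
    // Not done yet, just acknowledge this piece.
    return res.json({ received: buffer.length });
  }

  // Last chunk arrived: stitch the Base64 string back together and process it.
  const base64 = buffer.join('');
  buffer = [];
  // ... decode base64 and do the actual work here ...
  return res.json({ done: true, length: base64.length });
};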

But thanks for your help. I will try to split the array into chunks of less than 8000 chars. If I find something, I can share it with you.


But why is there a limitation from Appwrite? Do you know why?

Probably to protect against abuse.

Hmm, okay. But for self-hosted there should be an option to change the limits. But it's okay, we can find something.

You're on self-hosted?

yes

I'm not using the cloud.

If you want to change it in a self-hosted instance, you'll need to edit these two lines: https://github.com/appwrite/appwrite/blob/master/app/executor.php#L462 https://github.com/appwrite/appwrite/blob/master/app/controllers/api/functions.php#L1025
To make the edit and still keep your Appwrite up to date, see here: https://discord.com/channels/564160730845151244/1120778518360498276/1120802097277964358
Also check these: https://discord.com/channels/564160730845151244/1115191629733703730/1115262952405225552 https://discord.com/channels/564160730845151244/1115191629733703730/1115267558636007445

Change it to, let's say, 100000. You can still use the zip option to get shorter upload times.

Nice, I will definitely try that. Big thanks, that would help. I will keep you up to date.

The response payload is also limited. It gets trimmed at the end, and that affects the large object. That means the response payload is cut off after some limit. Do you know where in the code to change that?

What error do you get?

Any time you see this line
\mb_strcut($res, 0, 1000000), // Limit to 1MB
Like here: https://github.com/appwrite/appwrite/blob/master/app/executor.php#L578 That's where the response limit is set to 1MB; just change it to whatever number you like.

But do notice this line, which saves the response into the database: https://github.com/appwrite/appwrite/blob/master/app/controllers/api/functions.php#L1175 It only has a 1MB limit: https://github.com/appwrite/appwrite/blob/master/app/config/collections.php#L2695
So you'll also need to find a way around it, either by changing the length, or by simply commenting out the insertion line.