[Solved] Functions - Bad request

  • 0
  • Self Hosted
  • Functions
  • REST API
fafa
26 Jan, 2024, 12:44

Currently I barely use functions, only when users update themselves or, for now, for the product importing

TL;DR
The developer was experiencing issues with their functions, specifically timeouts and bad requests. With help from others in the thread they made changes such as increasing chunk sizes, changing concurrency, and adjusting CPU cores, and ultimately solved the issue by slowing down their code and reducing concurrency. The cause was the API freezing due to the number of CPU cores on the server; future improvements are planned to prevent these freezes. The developer also encountered a protocol error and received assistance in debugging it. Overall, the issue was resolved and the functions are now working properly.
fafa
26 Jan, 2024, 12:45

I mostly use Next.js, so the API routes do it for me instead of functions :)
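
For context, a minimal sketch of that API-route approach, handling an import in Next.js instead of in an Appwrite Function; the route name, payload shape, and the importProducts helper are assumptions, not from the thread:

```ts
// pages/api/import-products.ts — hypothetical route name.
import type { NextApiRequest, NextApiResponse } from "next";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "POST") {
    return res.status(405).json({ error: "Method not allowed" });
  }

  // Assumed payload shape: { products: [...] }
  const products: unknown[] = Array.isArray(req.body?.products) ? req.body.products : [];

  // Do the heavy lifting here (e.g. write to the database via a server SDK)
  // instead of triggering a serverless function execution per request.
  // await importProducts(products); // hypothetical helper

  return res.status(200).json({ imported: products.length });
}
```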

fafa
26 Jan, 2024, 12:45

Upping the chunk limit did not speed it up at all haha

D5
26 Jan, 2024, 12:47

What do you mean by chunk limit?

fafa
26 Jan, 2024, 12:47

I just got a 500 error randomly haha, so random. Here are the Docker logs if they help: https://paste.techscode.com/uguzejikoveluju.apache

fafa
26 Jan, 2024, 12:47

chunkSize

D5
26 Jan, 2024, 12:47

Oh, I see

D5
26 Jan, 2024, 12:48

Uh, looks like the only error is a Docker EOF error?

fafa
26 Jan, 2024, 12:49

yeah

fafa
26 Jan, 2024, 12:49

Appwrite says it timed out (which it did), but it's probably the same issue as before

Meldiron
26 Jan, 2024, 12:49

This specific EOF is fine; it happens when we try to calculate stats while a cold start is ongoing. It's a safe failure, and the stats will be re-calculated later. Also, those stats are only used by the OpenRuntimes Proxy, so if you don't have that in the stack, it's not affecting anything

fafa
26 Jan, 2024, 12:49

alrighty!

fafa
26 Jan, 2024, 12:50

I'll keep chunkSize at 1; that seems to work with about a 5-second delay
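
For reference, a rough sketch of what "chunkSize 1 with about a 5-second delay" could look like when pushing work to a self-hosted Functions executor; the item type and the execute callback are hypothetical, only the chunk size and delay come from the thread:

```ts
// Throttled, chunked processing: split items into chunks of `chunkSize`
// and pause between chunks so the executor isn't overwhelmed.
const chunkSize = 1;
const delayMs = 5_000;

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function importInChunks<T>(items: T[], execute: (chunk: T[]) => Promise<void>) {
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    await execute(chunk);   // e.g. trigger one function execution per chunk
    if (i + chunkSize < items.length) {
      await sleep(delayMs); // back off before the next chunk
    }
  }
}
```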

fafa
26 Jan, 2024, 12:50

Would definitely love to see some sort of queuing system for this, instead of having to scale with cores; not everyone can pay for 60 cores in the end haha
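
As an aside, until something like that exists server-side, a caller can approximate a queue client-side. A minimal sketch of a fixed-concurrency queue, with the limit and the callFunction call being assumptions:

```ts
// Run at most `limit` tasks at once and queue the rest, instead of firing
// everything in parallel and relying on raw CPU cores.
class TaskQueue {
  private running = 0;
  private pending: Array<() => void> = [];

  constructor(private readonly limit: number) {}

  async run<T>(task: () => Promise<T>): Promise<T> {
    if (this.running >= this.limit) {
      // Wait until a running task finishes and wakes us up.
      await new Promise<void>((resolve) => this.pending.push(resolve));
    }
    this.running++;
    try {
      return await task();
    } finally {
      this.running--;
      this.pending.shift()?.(); // wake the next queued task, if any
    }
  }
}

// Usage sketch: const queue = new TaskQueue(2); await queue.run(() => callFunction(payload));
```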

Meldiron
26 Jan, 2024, 12:51

#ad: I would recommend Appwrite Cloud; we monitor those 24/7 and ensure we have enough resources to serve all requests 🙈

Meldiron
26 Jan, 2024, 12:51

Coming this year, likely: a coroutine-style HTTP server

D5
26 Jan, 2024, 12:52

Isn't that possible with async?

fafa
26 Jan, 2024, 12:52

I know, I know... but the limits are not high enough :2HAhaa:

fafa
26 Jan, 2024, 12:53

and storage pricing is cheaper for me using Cloudflare R2 (silently waiting for a generic S3 adapter)

fafa
26 Jan, 2024, 12:54

Although, honest question: if you do have an insane number of users and are self-hosting, how would one do this? Just get one gigantic server with 2 CPUs?

Meldiron
26 Jan, 2024, 12:55

You can rent multiple servers and use an orchestration manager such as Docker Swarm

fafa
26 Jan, 2024, 12:55

Ah, that would work

fafa
26 Jan, 2024, 12:56

Well, I got my issue fixed at least haha, thanks!
