I had a function that made 3 requests to the OpenAI API with 3 different prompts. The function took more than 30 seconds to respond and timed out. So I split it into 3 functions, each making one request to the OpenAI API in parallel. However, they still took more than 30 seconds to respond.
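If the goal is to overlap the three API calls inside a single function, one option is a thread pool rather than three separate function executions. This is a minimal sketch, not Appwrite- or OpenAI-specific: `call_model` is a hypothetical stand-in for the real API call, simulated here with a 1-second delay.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_model(prompt: str) -> str:
    # Hypothetical placeholder for a real OpenAI API request;
    # the network round-trip is simulated with a 1 s sleep.
    time.sleep(1)
    return f"response to: {prompt}"

prompts = ["prompt A", "prompt B", "prompt C"]

start = time.monotonic()
# Run all three calls concurrently in worker threads.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(call_model, prompts))
elapsed = time.monotonic() - start
```

Because the three calls overlap, the wall-clock time is roughly that of the slowest single call (about 1 s here) instead of the sum of all three (about 3 s).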
@Steven suggested that they be run asynchronously, but this creates a queue of executions: while one is processed, the other two wait.
Is it possible to execute a function asynchronously but without a queue being created?
No. Perhaps you need to horizontally scale out your function workers
You are saying that if I execute them synchronously, making 3 requests to 3 different functions, they run in parallel. But making the same 3 requests with the async option creates a queue, and the only way to avoid that is scaling out?
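For a self-hosted setup, scaling out usually means running more copies of the functions worker so that queued async executions are consumed concurrently. A rough sketch, assuming a Docker Compose deployment where the worker service is named `appwrite-worker-functions` (check your own compose file for the actual service name):

```shell
# Scale the functions worker to 3 replicas so up to 3 queued
# executions can be processed at the same time.
docker compose up -d --scale appwrite-worker-functions=3
```

With a single worker replica, async executions are dequeued one at a time, which matches the queueing behaviour described above.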