I had a function that made 3 requests to the OpenAI API with 3 different prompts. The function took more than 30 seconds to respond and timed out, so I split it into 3 functions, each making one request to the OpenAI API in parallel. However, they still take more than 30 seconds to respond.
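For context, the kind of change I tried first was issuing the three requests concurrently from inside a single function, something like the sketch below (`call_openai` here is a hypothetical stand-in for the real client call, e.g. the chat completions endpoint, and the sleep just simulates network latency):

```python
import concurrent.futures
import time

# Hypothetical stand-in for the real OpenAI API call; in practice this
# would wrap the actual client call for one prompt.
def call_openai(prompt: str) -> str:
    time.sleep(0.1)  # simulate request latency
    return f"response to: {prompt}"

def run_prompts_concurrently(prompts):
    # Issue all requests at once; total wall time is roughly the
    # slowest single request, not the sum of all of them.
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(prompts)) as pool:
        return list(pool.map(call_openai, prompts))

if __name__ == "__main__":
    results = run_prompts_concurrently(["prompt A", "prompt B", "prompt C"])
    print(results)
```

Even with the requests overlapped like this, the function as a whole still has to finish within the platform's execution timeout.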
@Steven suggested running them asynchronously, but that creates an execution queue: while one is processed, the other two wait.
Is it possible to execute a function asynchronously without a queue being created?
No. Perhaps you need to horizontally scale out your function workers.
So you're saying that if I execute them synchronously, making 3 requests to 3 different functions, they run in parallel, but making the same 3 requests with the async option creates a queue, and the only way to avoid that is scaling out?