
OpenAI - Response not received in 25 seconds.


Pabblymember11

Guest
I understand your concern that the OpenAI integration on Pabbly Connect does not appear to be working for you. I apologize for any inconvenience this has caused.

However, I want to assure you that the majority of our users are successfully using the OpenAI integration without any issues. We continuously work to improve our integrations to ensure that they work seamlessly for all users, and we are committed to resolving any problems that you may be experiencing.

I would be more than happy to assist you in troubleshooting any issues you are experiencing with the OpenAI integration.
 

bandelero

Member
Dear Pabbly folks,

Realistically, you'd need to increase your OpenAI timeout to 300 seconds. Their models routinely get overloaded, and they are slow. It often takes up to 5 minutes to output a long text. This doesn't mean there's an error on their side; they are just slow. OpenAI modules will likely be some of the most requested within Pabbly. And I know the timeout limit is much lower for most other APIs. But this is a new reality. Having OpenAI error out because you're timing the module out after 25 seconds will cause A LOT of user frustration. Hence, for OpenAI, I'd jack up the timeout to 300 seconds (at least).
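To make the point concrete, here is a rough sketch (plain Python with the requests library; the model name and parameters are only examples, not how Pabbly calls OpenAI internally) of a completion call that is allowed up to 300 seconds:

```python
import os
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"  # standard chat completions endpoint

def ask_openai(prompt: str) -> str:
    """Send one chat completion request, allowing up to 300 seconds for a reply."""
    response = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # example model
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=300,  # generous timeout instead of ~25 seconds
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```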

Also, we desperately need auto-retry conditions. This could be useful for other modules, but for OpenAI it's essential. Their endpoints error out routinely when their models are over-run. And having to stop and re-enable the entire scenario because their modules error out is a huge waste of operations. Hence, we need to build robust automation scenarios with OpenAI where we can set something like "auto-retry three times in case of errors".
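And for the "auto-retry three times" idea, something like the wrapper below is what I have in mind (again, just a sketch; the backoff values and the list of retryable status codes are my own assumptions):

```python
import time
import requests

def ask_with_retries(prompt: str, attempts: int = 3) -> str:
    """Retry the call on timeouts and transient HTTP errors before giving up."""
    for attempt in range(1, attempts + 1):
        try:
            return ask_openai(prompt)  # the 300-second call sketched above
        except (requests.Timeout, requests.ConnectionError):
            pass  # overloaded model or slow network: worth retrying
        except requests.HTTPError as err:
            # Retry only rate limits and server-side errors, not bad requests.
            if err.response.status_code not in (429, 500, 502, 503):
                raise
        if attempt < attempts:
            time.sleep(5 * attempt)  # simple linear backoff between attempts
    raise RuntimeError(f"OpenAI request failed after {attempts} attempts")
```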

Eugene
 

Pabblymember11

Guest
Hey @bandelero

Thank you for your feedback regarding the OpenAI timeout. We understand the difficulties caused by longer response times from OpenAI's models. Unfortunately, we are unable to increase the timeout limit for such cases at this time. We apologize for any inconvenience this may cause, and we appreciate your understanding as we continue to optimize our platform within the existing constraints.

Further, we have forwarded your request for the ability to automatically retry the execution of failed tasks within the system.
 

Pabblymember11

Guest
The timeout limit is influenced by technical considerations and system performance. Changing the timeout duration would require significant architectural modifications and could impact the overall resource usage of our platform.

While we understand the potential benefits of a longer timeout, our current configuration carefully balances responsiveness and system reliability, and the existing 40-second timeout already allows a sizeable processing window. Factors such as network conditions and external API performance can also affect the execution time of API requests.

We appreciate your understanding as we continue to improve our platform within the existing limitations.
 