
OpenAI Automation doesn't work


Pabblymember11

Guest
Hey @Dante8484

The timeout window for the OpenAI Generate Content action has temporarily been increased to 40 seconds, although this is not the preferred solution. Note that Zapier has a 30-second timeout limit for OpenAI, and other platforms are encountering similar issues.

We have informed the OpenAI team about the ideal solution for this, but have not yet received a response.
 

agenciaecom

Member
Dear Pabbly Support Team,

I am writing to express my frustration with the ongoing issue of OpenAI Automation not working properly. Despite my best efforts to optimize the prompt and try different workflows, I keep receiving the error message "Response not received in 40 seconds" every time I attempt to use the platform.

I have seen that you have increased the timeout window for OpenAI Generate Content to 40 seconds, but this is only a temporary measure. I understand that Zapier imposes a timeout limit of only 30 seconds for OpenAI and that other platforms are encountering similar issues, but I believe it is your responsibility to ensure that your customers have access to reliable and functional automation tools, and this ongoing problem with OpenAI Automation is preventing me from achieving my goals.

I urge you to take immediate action and find a permanent solution to this issue. I expect a prompt response and resolution to this problem as soon as possible.

Thank you for your attention to this matter.

Sincerely,
 

Pabblymember11

Guest
We sincerely apologize for any inconvenience this may have caused. Our team is actively investigating the problem and working on a fix. We will keep you informed as we learn more and take the necessary steps to resolve the issue as quickly as possible. We appreciate your understanding and patience.
 

bandelero

Member
Folks, you need to increase the timeout to at least 300 seconds. This is the reality of working with OpenAI modules, especially with the likes of GPT-4: they are super slow. We need to be able to build robust automations that don't error out just because OpenAI is slow...

Evgeny
 
Agreed, I'm having the same issue. I was hoping to come here and find an answer.

Does anyone know if using the regular API connection solves this issue (sending a webhook and waiting for the response that way)? Or is this something on OpenAI's end? (That's what it sounds like.)
 

bandelero

Member
Pabbly IS using the real API connection. You can do the same via an HTTP request module, but you'll get the same results: OpenAI repeatedly runs out of capacity, so it returns an error to the HTTP request.
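For anyone who wants to try the direct HTTP route anyway, here is a minimal sketch. This is not Pabbly's implementation; it simply calls OpenAI's public chat completions endpoint with a long client-side timeout and retries with exponential backoff on capacity errors (429/5xx). It won't make the model respond faster, but it avoids failing the whole workflow on a single slow or rate-limited request.

```python
import json
import time
import urllib.error
import urllib.request

def backoff_delays(retries, base=2.0, cap=60.0):
    """Exponential backoff schedule: 2s, 4s, 8s, ... capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(retries)]

def call_openai(api_key, payload, retries=3, timeout=300):
    """POST to OpenAI's chat completions endpoint with a long timeout,
    retrying on capacity-related HTTP errors (429, 500, 502, 503)."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    # One extra None entry marks the final attempt: no sleep, just re-raise.
    for delay in backoff_delays(retries) + [None]:
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as e:
            if e.code not in (429, 500, 502, 503) or delay is None:
                raise
            time.sleep(delay)
```

Usage would look like `call_openai(key, {"model": "gpt-4", "messages": [...]})`. The 300-second timeout and the retry counts are assumptions to illustrate the pattern, not values Pabbly uses.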
 
Is there any update on this yet?
 

Pabblymember11

Guest
Currently, this is on our roadmap; once we publish such a feature, we will let you know.
 