
OpenAI Automation doesn't work


Pabblymember11

Guest
Hey @Dante8484

As a temporary measure, the timeout window for OpenAI Generate Content has been increased to 40 seconds, although this is not the preferred solution. Note that Zapier has a 30-second timeout limit for OpenAI, and other platforms are encountering similar issues.

We have informed the OpenAI team about the ideal solution for this, but have not received a response.
 

agenciaecom

Member
Dear Pabbly Support Team,

I am writing to express my frustration with the ongoing issue of OpenAI automation not working properly. Despite my best efforts to optimize the prompt and try different workflows, I keep receiving the error message "Response not received in 40 seconds" every time I attempt to use the platform.

I have seen that you have increased the timeout window for OpenAI Generate Content to 40 seconds, but this is only a temporary fix. I understand that Zapier imposes a 30-second timeout and that other platforms are encountering similar issues, but it is your responsibility to ensure that your customers have access to reliable and functional automation tools, and this ongoing problem with the OpenAI automation is preventing me from achieving my goals.

I urge you to take immediate action and find a permanent solution to this issue. I expect a prompt response and resolution to this problem as soon as possible.

Thank you for your attention to this matter.

Sincerely,
 

Pabblymember11

Guest
We sincerely regret any trouble this may have brought about. Our team is actively looking into the problem and trying to come up with a fix. As we learn more, we'll keep you informed and take the necessary steps to get the problem fixed as quickly as possible. I appreciate your understanding and patience.
 

bandelero

Member
Folks, you need to increase the timeout to at least 300 seconds. This is the reality of working with the OpenAI modules, especially the likes of GPT-4: they are super slow. We need to be able to build robust automations that don't error out just because OpenAI is slow...
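A longer platform timeout only goes so far; slow GPT-4 responses also call for retries on the caller's side. Below is a minimal Python sketch of the retry-with-backoff pattern, using a simulated call in place of a real OpenAI request (all names here are illustrative, not Pabbly's implementation):

```python
import time

def call_with_retries(call, max_attempts=3, base_delay=1.0):
    """Retry a flaky zero-argument call with exponential backoff.

    TimeoutError and ConnectionError are treated as retryable;
    the final attempt re-raises the error to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except (TimeoutError, ConnectionError):
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...

# Stand-in for an OpenAI request: times out twice, then succeeds.
attempts = {"n": 0}

def flaky_openai_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise TimeoutError("Response not received in 40 seconds")
    return "generated content"

result = call_with_retries(flaky_openai_call, base_delay=0.1)
print(result)  # generated content
```

With a generous per-attempt timeout (say 300 seconds) plus a couple of retries, a transient slowdown or capacity error no longer kills the whole workflow.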

Evgeny
 
Agreed, I'm having the same issue with this. I was hoping to come here and find an answer.

Does anyone know if using the regular API connection solves this issue (sending a webhook and waiting for the response that way)? Or is this something on OpenAI's end? (That's what it sounds like.)
 

bandelero

Member
Pabbly IS using the real API connection. You can do the same via an HTTP request module, but you'll get the same results. OpenAI repeatedly runs out of capacity, so they send an error to the HTTP request.
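For anyone wanting to test the direct route anyway, here is a hedged Python sketch of a raw request to OpenAI's chat completions endpoint. The URL, headers, and payload shape follow OpenAI's public API documentation; the key, model name, and prompt are placeholders, not values from this thread:

```python
import json

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = "sk-..."  # placeholder: your own OpenAI API key

# Minimal request body for the chat completions endpoint.
payload = {
    "model": "gpt-4",  # placeholder model name
    "messages": [{"role": "user", "content": "Write a product blurb."}],
}
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# With the `requests` library the timeout is set client-side,
# so nothing stops you from waiting 300 seconds:
#   import requests
#   resp = requests.post(API_URL, headers=headers,
#                        data=json.dumps(payload), timeout=300)
# A capacity error from OpenAI still arrives as an HTTP error
# status, so a longer timeout alone does not remove the need
# for error handling.
print(payload["model"])
```

In other words, the HTTP route gives you control over how long you wait, but not over whether OpenAI has capacity to answer.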
 
Is there any update on this yet?
 

Pabblymember11

Guest
Currently, this is on our roadmap, and once we publish such a feature we will let you know.
 