
VIDEO GENERATION DELAY RETRIEVAL

Your Task History ID
IjU3NjUwNTZiMDYzNjA0Mzc1MjZlNTUzNjUxM2E1MTYwNTQzOTBmMzYi_pc
Hey,

I am struggling with the Delay action I use after sending a custom API request to HeyGen to generate a video and then retrieve it. I want to minimize the chance of errors and reliably retrieve the video after the delay. Right now it is very difficult for me to predict when a video from HeyGen will be ready, so sometimes my delay is too short, the video cannot be retrieved yet, and my Airtable automation breaks because the field ends up empty.

Can you guide me on how to make this smooth? I was thinking about looping the action somehow so the workflow only continues once the video link has actually been retrieved...

Can you guide me on the best way to optimize the workflow?
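
A minimal sketch of that "loop until the link exists" idea, assuming HeyGen's v1 video_status.get endpoint and X-Api-Key header; the API key, video ID, and retry settings below are placeholders, not values from this thread:

```python
import time

import requests

HEYGEN_API_KEY = "YOUR_HEYGEN_API_KEY"        # placeholder
STATUS_URL = "https://api.heygen.com/v1/video_status.get"


def wait_for_video(video_id: str, max_attempts: int = 20, delay_seconds: int = 30) -> str:
    """Poll HeyGen until the video is ready (or fails) instead of relying on one fixed delay."""
    headers = {"X-Api-Key": HEYGEN_API_KEY}
    for _ in range(max_attempts):
        resp = requests.get(STATUS_URL, params={"video_id": video_id}, headers=headers, timeout=30)
        resp.raise_for_status()
        data = resp.json().get("data", {})
        status = data.get("status")
        if status == "completed":
            return data["video_url"]           # only now is it safe to write the link to Airtable
        if status == "failed":
            raise RuntimeError(f"HeyGen reported a failure: {data}")
        time.sleep(delay_seconds)              # still processing: wait and check again
    raise TimeoutError("Video was not ready within the polling window")


# Hypothetical usage with the ID returned by the generate request:
# video_url = wait_for_video("VIDEO_ID_FROM_GENERATE_CALL")
```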
 

Fagun Shah

Well-known member
Why not use webhooks? When the video is ready, HeyGen can trigger a webhook in Pabbly Connect, so the workflow continues the moment the video exists instead of waiting on a fixed delay. It will save you time and remove the guesswork.
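
A minimal sketch of that webhook approach, assuming HeyGen exposes a webhook-registration endpoint; the endpoint path, event names, and the Pabbly Connect webhook URL below are assumptions/placeholders to be confirmed against the HeyGen docs:

```python
import requests

HEYGEN_API_KEY = "YOUR_HEYGEN_API_KEY"  # placeholder
# Placeholder: the Webhook trigger URL copied from your Pabbly Connect workflow.
PABBLY_WEBHOOK_URL = "https://connect.pabbly.com/workflow/sendwebhookdata/XXXX"

# Ask HeyGen to call the Pabbly Connect webhook when a video finishes,
# so the workflow resumes immediately instead of guessing with a delay.
resp = requests.post(
    "https://api.heygen.com/v1/webhook/endpoint.add",   # assumed endpoint path; confirm in the HeyGen docs
    headers={"X-Api-Key": HEYGEN_API_KEY, "Content-Type": "application/json"},
    json={
        "url": PABBLY_WEBHOOK_URL,
        "events": ["avatar_video.success", "avatar_video.fail"],  # assumed event names
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

With the callback registered, the Pabbly Connect workflow starts from a Webhook trigger, reads the video URL from the callback payload, and only then updates Airtable, so the field is never empty.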
 
Thanks @Fagun Shah, I just finished setting up the webhook.

Maybe you can help me with another issue I’m having with the OpenAI custom request used to extract data from my web scrape.

The problem is pretty straightforward:
I’m running into a length limitation with the OpenAI API — is there a maximum input size per call?

I’ve been thinking of some workarounds. For example:

  • Formatting the text more efficiently
  • Splitting the content into smaller blocks
  • Stripping out unnecessary sections before sending the request
Do you think this approach makes sense? Or is there a better way to handle large scraped datasets with OpenAI?

HERE IS THE PICTURE FOR REFERENCE:
Screenshot 2025-08-21 at 3.38.30 PM.png
 

ArshilAhmad

Moderator
Staff member
The issue is that you have exceeded the maximum character limit allowed in a field in Pabbly Connect. This cannot be resolved using a Text Formatter, because passing the same data through it will result in the same issue. You need to ensure that the source application sends data within the character limit.
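
A minimal sketch of the "split and strip before sending" idea from the post above, assuming a plain per-block character budget; the 10,000-character figure below is a placeholder, not Pabbly's or OpenAI's documented limit:

```python
import re

MAX_CHARS = 10_000  # placeholder budget per block; check the actual field limit you are hitting


def clean(text: str) -> str:
    """Strip the parts of a scraped page that inflate the payload without adding signal."""
    text = re.sub(r"<[^>]+>", " ", text)   # drop leftover HTML tags
    text = re.sub(r"\s+", " ", text)       # collapse runs of whitespace
    return text.strip()


def split_into_blocks(text: str, max_chars: int = MAX_CHARS) -> list[str]:
    """Split cleaned text into blocks under the budget, breaking on sentence boundaries.
    A single sentence longer than the budget is passed through as its own block."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    blocks: list[str] = []
    current = ""
    for sentence in sentences:
        candidate = f"{current} {sentence}".strip()
        if current and len(candidate) > max_chars:
            blocks.append(current)
            current = sentence
        else:
            current = candidate
    if current:
        blocks.append(current)
    return blocks


# Hypothetical usage: "scrape.txt" stands in for whatever your scraper produced.
blocks = split_into_blocks(clean(open("scrape.txt", encoding="utf-8").read()))
print(len(blocks), "blocks, largest:", max(len(b) for b in blocks))
```

Each block can then be sent as its own OpenAI request (or its own workflow run) and the extracted results merged afterwards, which keeps every field under the character limit.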
 

Fagun Shah

Well-known member
I need to check the request in detail. You can hire me by messaging me on FB messenger. The link is below.
 