Hey team,
I have data in Google Sheets that I'm pushing to a Pabbly webhook using the 'Send All Data' option in the Pabbly Connect extension.
Once it's in Pabbly, I need it to search an Airtable table to locate existing records and update them accordingly. That part works fine, except I keep getting failed workflows because the workflow hits Airtable's API limit (5 API calls per second). I added a 1-minute delay within the workflow, thinking this would solve the issue, but I suspect each row in the Google Sheet is being sent to Pabbly at the same time and processed across multiple workflow executions (e.g. 20 workflows running simultaneously, rather than one workflow iterating through the 20 rows). That would mean the delay just slows each row's overall execution time while they all still run at once.
What's the best way to slow the overall processing of the data, or to have all the data processed in a single workflow (maybe using an iterator)? Keen to hear your thoughts!
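For reference, the behaviour I'm after would look something like this if I were scripting it myself. This is just a minimal Python sketch, not anything Pabbly-specific: `update_record` is a hypothetical stand-in for the Airtable search-and-update step, and the loop paces calls so they stay under 5 per second rather than firing all rows concurrently.

```python
import time

AIRTABLE_RATE_LIMIT = 5  # Airtable allows 5 API calls per second per base


def process_rows(rows, update_record, rate_limit=AIRTABLE_RATE_LIMIT):
    """Process all rows in one sequential pass, pacing calls below the rate limit.

    `update_record` is a placeholder for whatever does the Airtable
    search-and-update for a single row.
    """
    min_interval = 1.0 / rate_limit  # minimum seconds between calls
    results = []
    for row in rows:
        start = time.monotonic()
        results.append(update_record(row))
        # Sleep only for the remainder of the interval, if the call was fast
        elapsed = time.monotonic() - start
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
    return results
```

So instead of 20 parallel executions each delayed by a minute, it's one pass over the 20 rows with a small pause between calls, which I'm hoping an iterator step in Pabbly could replicate.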