Webhook with a lot of data to multiple Google Sheets rows

Jannick

Member
Hi!

I'm using Hexomatic in combination with Google Sheets. In Hexomatic I've set up a daily scraper that has to export its data to a Google Sheet.

I noticed Hexomatic isn't very reliable in terms of exporting to Google Sheets, and their team is working on it. In the meantime it's possible to export the data using a webhook. Pabbly successfully retrieves all of the data, but I don't know how to create multiple Google Sheets rows from all of the data in the webhook.

The webhook contains data for 50 rows, with 5 columns of data per row. So there are a minimum of 250 data points from the webhook.

Does anyone know how to simply create 50 rows with 5 columns per row from the data in the webhook?

Thanks!
 

Fagun Shah

Well-known member
You need to create a Google Sheet with all 5 column headers, and then capture the webhook data with the Simple Response toggle turned off under the webhook URL in the Pabbly workflow.

That will get you the data in array format.

Then use an Iterator step to add each row to the Google Sheet.

More on the Iterator here -
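The array-to-rows flow can be sketched in plain Python. This is only an illustration of what the Iterator step does, not Pabbly's actual implementation; the payload shape and field names below are hypothetical and depend on what your Hexomatic scraper sends:

```python
import json

# Hypothetical webhook payload: a JSON array with one object per
# scraped row (your real field names will differ).
payload = json.loads("""
[
  {"title": "Item A", "price": "10", "url": "https://example.com/a",
   "date": "2021-10-24", "source": "hexomatic"},
  {"title": "Item B", "price": "12", "url": "https://example.com/b",
   "date": "2021-10-25", "source": "hexomatic"}
]
""")

# The in-memory stand-in for the sheet: the first row matches the
# 5 column headers you created in Google Sheets.
sheet = [["title", "price", "url", "date", "source"]]

# The Iterator runs once per array element; each run feeds one
# element's fields into one "Add New Row" action.
for row in payload:
    sheet.append([row["title"], row["price"], row["url"],
                  row["date"], row["source"]])

print(len(sheet) - 1)  # number of data rows appended
```

With a 50-element array, the loop above would run 50 times and produce 50 rows of 5 columns each, which matches the 250 data points described in the question.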
 

Jannick

Member
Great! I mostly got it to work; sadly, every row takes up one task.

Is it possible to filter the array data from the webhook, so that it only iterates rows that have a certain date or later? For example, only add rows to Google Sheets where the date is 24 October 2021 or later.

I also get this error when adding rows to Google Sheets:
(screenshot attachment: Schermafbeelding 2021-10-25 om 16.24.55.png)


Thanks!
 

Fagun Shah

Well-known member
For the error, tagging Pabbly Team members - @Supreme Verma @Pabbly

For the filter, just use a "greater than" filter step after the Iterator.
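As a sketch of what that filter condition checks, assuming the date arrives as an ISO "YYYY-MM-DD" string in a hypothetical "date" field (your real field name and date format may differ):

```python
from datetime import date

# Cutoff from the example in the question: 24 October 2021.
cutoff = date(2021, 10, 24)

# Hypothetical iterated rows; only "date" matters for the filter.
rows = [
    {"title": "Old item", "date": "2021-10-20"},
    {"title": "New item", "date": "2021-10-25"},
]

# The filter step lets a row continue to the Google Sheets action
# only when its date is on or after the cutoff.
kept = [r for r in rows
        if date.fromisoformat(r["date"]) >= cutoff]

print([r["title"] for r in kept])
```

Note that because the filter sits after the Iterator, every row is still iterated (and still consumes a task); the filter only stops the non-matching rows from reaching the Google Sheets step.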
 
No... Pabbly doesn't successfully retrieve all of the data. I have upgraded my account for 1 month, and the system still asks me to upgrade. It can't even identify my recent payment. And the main thing is, it's not retrieving all the data from Google Sheets. There is an issue with this system... @fagun shah
 
It fetches only the 1st row of the Google Sheet, and only when I click the test button or press "Re-capture Webhook Response". Why is this operation not happening automatically? Your prompt response is highly appreciated!
 