Codeblock Tutorial on how to get Content from URL

I found a few posts around this topic like this one:

And this one:

But I feel like a lot of others, like me, would get a ton of value from a short (ideally video) step-by-step tutorial on how to filter the usable content out of an HTTP request.

This would allow anyone to build all kinds of research-focused automations. Automatically rewriting content and posting it to social media becomes possible, whether from an RSS feed or from a list of links where each one gets summarized and turned into a blog post with 5 sources.
It all becomes possible if we just get that one piece sorted.
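To make that "filter the usable content out of an HTTP response" step concrete, here is a minimal stdlib-only Python sketch. It assumes the page keeps its readable text in `<p>` tags, which holds for most blogs but not every site; the class and function names are my own illustration, not from any of the linked posts.

```python
from html.parser import HTMLParser

class ParagraphExtractor(HTMLParser):
    """Collects the text inside <p> tags and ignores <script>/<style> blocks."""

    def __init__(self):
        super().__init__()
        self._in_p = False
        self._skip = 0       # depth inside <script>/<style>
        self._buf = []
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "p":
            self._in_p = True
            self._buf = []

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
        elif tag == "p" and self._in_p:
            self._in_p = False
            text = "".join(self._buf).strip()
            if text:
                self.paragraphs.append(text)

    def handle_data(self, data):
        if self._in_p and not self._skip:
            self._buf.append(data)

def extract_paragraphs(html: str) -> list:
    """Return the plain text of every non-empty <p> element in the page."""
    parser = ParagraphExtractor()
    parser.feed(html)
    return parser.paragraphs
```

In a flow you would fetch the page first (e.g. with `urllib.request.urlopen(url)`), decode the response body, and pass it to `extract_paragraphs`; the joined paragraphs are then ready for summarization or rewriting.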

Hi Momo,

Have you tried www.apify.com? It has a very good free tier which allows you to scrape a lot of content from many sites, and it is also very easy to integrate into AP.

Also, you are able to scrape leads from www.apollo.io, which works quite well, and there is a video on YT about it. They also have scrapers that work with ChatGPT.

I am also in touch with freelancers on Apify for custom builds; it's not expensive.

KR

Bram


Hi Bram,
Thank you. So for Apify I only need a webhook to get the content from a blog, for instance?
That is great. I will aim to figure this out somehow. If you have the link to the video, I would be grateful.

All the best,
M

Do you have any tutorial on how to use Apify with AP? @Bram, BTW, do you build flows for people?

Hi Momo,

Yes, that is correct, and with an HTTP Request you can start the Actor. Let me try to create a static example with a custom Actor; they all work the same way.

Receiving Data via a webhook and how to get the datasets

Go to your actor and go to Integrations:

Hit the blue button

Select HTTPS Webhook

Press “Configure”

Now create a new flow in Active Pieces starting with a webhook and Copy the webhook URL

Set the event type to Run succeeded and paste the webhook URL into the URL field
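If you prefer to register that same webhook programmatically instead of clicking through the UI, here is a hedged Python sketch against Apify's `POST /v2/webhooks` endpoint. The payload field names (`eventTypes`, `condition`, `requestUrl`) follow Apify's webhook API as I understand it, and `APIFY_TOKEN` is a placeholder, so double-check the current API reference before relying on it.

```python
import json
import urllib.request

APIFY_TOKEN = "YOUR_APIFY_TOKEN"  # placeholder: paste your own API token

def build_webhook_payload(actor_id: str, flow_webhook_url: str) -> dict:
    """Payload for Apify's POST /v2/webhooks: fire on a successful run of
    the given Actor and deliver the event to the Active Pieces flow URL."""
    return {
        "eventTypes": ["ACTOR.RUN.SUCCEEDED"],  # the "Run succeeded" event
        "condition": {"actorId": actor_id},
        "requestUrl": flow_webhook_url,
    }

def create_webhook(actor_id: str, flow_webhook_url: str) -> dict:
    """Register the webhook and return Apify's JSON response."""
    req = urllib.request.Request(
        f"https://api.apify.com/v2/webhooks?token={APIFY_TOKEN}",
        data=json.dumps(build_webhook_payload(actor_id, flow_webhook_url)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The `flow_webhook_url` argument is the webhook URL you copy from your Active Pieces flow.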

Now go back to Active Pieces and your webhook and press "Send data" (for me, in Dutch, it is "Gegevens verzenden")

Head back to Apify, scroll down in your Actor, and press "Save & test". Your webhook should receive something like this:

Next, add a new HTTP step to your flow and set the method to GET
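For that GET step, the URL can be derived from the webhook payload. A small sketch, assuming (as Apify's webhook docs describe) that the delivered body carries the run object under `resource` and that its `defaultDatasetId` points at the dataset holding the scraped items:

```python
def dataset_items_url(webhook_payload: dict, token: str) -> str:
    """Build the GET URL for the items scraped by the run that just
    succeeded, using Apify's /v2/datasets/{id}/items endpoint."""
    dataset_id = webhook_payload["resource"]["defaultDatasetId"]
    return (
        f"https://api.apify.com/v2/datasets/{dataset_id}/items"
        f"?format=json&token={token}"
    )
```

In Active Pieces you would build the same URL by dropping the dataset ID from the webhook step's output into the HTTP step's URL field.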

Triggering your Actor via Active Pieces

Select: API → API Clients

Then click on cURL and SCROLL DOWN!!!

Use the URL, the POST method, and the -H (header) values, and set them in Active Pieces.

The set-up in this case should look like this in Active Pieces

Please remove "$API_TOKEN" from the URL and replace it with your token, which you find when you SCROLL UP in the Apify API Client:

Then copy the body from your API Client; in this case, the green text starting from { and ending with }:

```json
{
  "searchStringsArray": ["restaurant"],
  "locationQuery": "New York, USA",
  "maxCrawledPlacesPerSearch": 50,
  "language": "en",
  "onlyDataFromSearchPage": false,
  "includeWebResults": false,
  "scrapeDirectories": false,
  "deeperCityScrape": false,
  "reviewsSort": "newest",
  "reviewsFilterString": "",
  "scrapeReviewerName": true,
  "scrapeReviewerId": true,
  "scrapeReviewerUrl": true,
  "scrapeReviewId": true,
  "scrapeReviewUrl": true,
  "scrapeResponseFromOwnerText": true,
  "searchMatching": "all",
  "placeMinimumStars": "",
  "skipClosedPlaces": false,
  "allPlacesNoSearchAction": ""
}
```

Should look like this in AP:

Then you can try it by testing the step.
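If you want to debug the same call outside Active Pieces, the cURL snippet from the API Client tab translates to a short Python sketch. The actor ID and token are placeholders you copy from that tab yourself:

```python
import json
import urllib.request

def build_run_request(actor_id: str, token: str, run_input: dict):
    """URL and JSON body for Apify's "run Actor" endpoint
    (POST https://api.apify.com/v2/acts/{actor_id}/runs)."""
    url = f"https://api.apify.com/v2/acts/{actor_id}/runs?token={token}"
    body = json.dumps(run_input).encode("utf-8")
    return url, body

def run_actor(actor_id: str, token: str, run_input: dict) -> dict:
    """Fire the run and return Apify's JSON response (the run object)."""
    url, body = build_run_request(actor_id, token, run_input)
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Here `run_input` is the same JSON body you pasted into the HTTP step, as a Python dict.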

This should be the result:

Then in Apify you can see the Actor running:

To collect the scraped data via a webhook, there are a few options available. I assume you will run this Actor more often (and if not, it doesn't matter), but this is the way to save more results.


Hi @Momo ,

Sorry for the late reply; I had the above post sitting as a draft, ready to be posted…

Hope this helps, @Preben

I currently don't build flows for people, and I also think my hourly rate would be a bit too expensive. You can try the marketplace option here, or DM me if you are interested.