What's your scraping use case?

Wow…this is pretty impressive Bram!

Do you have an estimate on how long it took you to set this all up?

Tim C Martin

Thanks Tim!

Hmm, I did this as a side project in between other jobs; it took me 3 days to complete it to this level.

Want to build things like this too? If it isn't directly related to scraping, it's probably best to DM me.

KR Bram

1 Like

I own a company, and each month my accountant asks me for Amazon's invoices.
Sometimes the accountant asks me about charges from Amazon that I do not recognize. This happens when Amazon splits an order into several shipments and sends the items at different times, e.g.:

  • one in June
  • one in July

so the charges do not match the order amounts.

So I have to go through each order with several items manually, check whether or not it has been split, and grab the new prices.
Then I retrieve the invoices and send them to the accountant.

So I will have to scrape the info:

  • I gather all order info, with the order number, in a Google Sheet
  • Then I concatenate the URL + order number
  • Then I pass the order URL in order to scrape the 2 payment methods (based on shipments) at the bottom of the document
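The concatenation step above can be sketched in a few lines of Python. The base URL pattern and the order numbers here are assumptions for illustration; the real invoice URL will depend on the Amazon marketplace:

```python
# Hypothetical sketch: build per-order invoice URLs from order numbers,
# mirroring the Google Sheet concatenation (URL + order number).
BASE_URL = "https://www.amazon.com/gp/css/summary/print.html?orderID="  # assumed pattern

# In the real flow these come from the Google Sheet column of order numbers.
order_numbers = ["123-4567890-1234567", "123-7654321-7654321"]

order_urls = [BASE_URL + number for number in order_numbers]

for url in order_urls:
    print(url)
```

Each resulting URL can then be fed to the scraping step that reads the payment methods at the bottom of the page.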

@ashrafsam,

Is there any update on the scraping/monitoring website features?

3 Likes

Hi @Bram,

Thank you so much! I have recreated this flow and it works amazingly!

But when I want to scrape a lot of leads (20 in this example run) I get an ERROR. Have you found a way around it?

After 15 of the 20 records it broke:

Run exceeded 600 seconds, try to optimize your steps.

I am struggling with this too; I have to run 50-75 leads a day and also run into this issue. What I'm planning to do is create a sheet in Google where I can see which leads have already been run, with a 0/1 flag.

Then I'm going to run the script 3-4 times a day and only let it process the items that are still on 0; those that have run will be set to 1. This way you always make sure every item gets run.
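The 0/1 flag idea above can be sketched like this. The column names and batch size are hypothetical, and an in-memory list stands in for the Google Sheet; a real flow would read and write the sheet rows instead:

```python
# Minimal sketch of the 0/1 batching workaround: process only rows
# still flagged 0, flip them to 1, and stop before the run-time limit.
BATCH_SIZE = 15  # assumed safe batch size under the 600-second limit

# Stand-in for the Google Sheet rows (field names are assumptions).
rows = [
    {"lead": "lead-a", "processed": 0},
    {"lead": "lead-b", "processed": 1},  # already run in an earlier batch
    {"lead": "lead-c", "processed": 0},
]

def run_batch(rows, batch_size=BATCH_SIZE):
    """Process up to batch_size rows that are still flagged 0."""
    done = 0
    for row in rows:
        if done >= batch_size:
            break
        if row["processed"] == 0:
            # ... scrape this lead here ...
            row["processed"] = 1
            done += 1
    return done

run_batch(rows)
```

Running this 3-4 times a day means each invocation picks up only the unfinished rows, so no lead is scraped twice and none is skipped.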

Hope this helps.

I would like to be able to scrape data from a car dealership's site inventory (make, model, image URL, price, details, year, etc.) and then have it create an XML file on a regular basis, maybe twice a day…
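As a rough sketch of the XML half of that request: once the inventory records are scraped, serializing them is straightforward with the standard library. The field names and sample values below are assumptions, and the twice-a-day schedule would be handled by the flow's trigger, not by this code:

```python
# Hedged sketch: turn scraped inventory records into an XML document.
import xml.etree.ElementTree as ET

# Stand-in for records coming out of the scraping step.
inventory = [
    {"make": "Toyota", "model": "Corolla", "year": "2021",
     "price": "18999", "image_url": "https://example.com/corolla.jpg"},
]

def to_xml(records):
    """Wrap each record in a <vehicle> element under a single <inventory> root."""
    root = ET.Element("inventory")
    for rec in records:
        vehicle = ET.SubElement(root, "vehicle")
        for field, value in rec.items():
            ET.SubElement(vehicle, field).text = value
    return ET.tostring(root, encoding="unicode")

xml_doc = to_xml(inventory)
print(xml_doc)
```

The resulting string can be written to a file or pushed to wherever the XML feed needs to live.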

1 Like

@ashrafsam @abuaboud Any updates? We are all waiting for our scraping use cases.

@Bram Apify seems like a beautiful piece of software, thank you for sharing. The sad thing about it is that it is quite expensive: for the better scraping tools you easily pay $30/month, and the bigger scraping cases would cost a lot of money. That's the beautiful thing about ActivePieces: it can be run locally, so you can keep the cost close to $0 and create beautiful automations. If AP expanded into this, we could build scrapers on top of the platform, the possibilities would really be endless, and it would be far more capable than Zapier.

These 3 practical use cases are great examples:

If I wanted to do these 3 automations, it would cost me $125 per month plus the cost of Apify. This is not a scalable solution at all for most people on here.

2 Likes

Hi @Rabbit9,

Thanks for sharing! I understand your concerns about the cost of Apify. The cost can go up fast, but I still think the pricing is cheap.

I'm not saying that AP should do nothing about scrapers. But the three examples you showed plus the first paid tier on Apify would cost you around 175 dollars a month. That sounds like a lot of money, and it is if you don't get much in return. I do think your use case matters here too: what do you expect to make back against these costs? 1,000 dollars? 10,000 dollars? It all comes down to your own business case.

Besides that, I would love to see a solution here, keeping in mind the current prices on Apify and other platforms. AP would need to create a USP: a better price, higher quality, more speed? I think it's up to them to see if they can make that happen. I do think this could help them stand out among all the platforms around them.

Just saying there is much to think about before making this step, and there are plenty of options available at the moment. So far I am happy with Apify, but I am open to switching when a better solution comes along, which could take many forms.

Kr Bram

@Rabbit9,

I actually agree with you. Apify is indeed not the most expensive tool out there. However, if AP went into scraping, the possibilities for automation would expand significantly without very high monthly fees.

@ashrafsam, do you have an update on what you are currently working on?

1 Like

RTILA Studio web scraper.

Scrape the website with RTILA and get a notification in ActivePieces.