I’d like to see a trigger for LinkedIn: when a post is published by a particular account, scrape the text, images, and videos of the post and use them in the rest of the flow.
We’d want the post published on LinkedIn to be sent to an AI model that rewrites it and publishes it on a separate platform.
LinkedIn doesn’t offer a direct trigger for this, but your goal is absolutely achievable with an alternative approach; it just takes a few extra steps.
Remember: since LinkedIn doesn’t support this directly, we have to rely on a polling mechanism. You cannot do this in real time.
Here’s a flow that might work.

Scraper setup:
- Go to https://www.linkedin.com/search/results/content, add all your filters (account, company page, etc.), and sort by latest. Once done, copy the URL
- Use a LinkedIn scraper to scrape LinkedIn posts (e.g. https://apify.com/curious_coder/linkedin-post-search-scraper) and enter the copied URL as the source for scraping
- Configure the scraper to run whenever called via webhook (copy the webhook URL, we will need it later)
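The webhook call in the last step is just a plain HTTP request, so it's easy to test outside the flow. Here's a minimal Python sketch; the payload field `searchUrl` and the webhook URL are illustrative assumptions, not the scraper's actual API, so check your scraper's docs for the exact input shape it expects:

```python
import json
import urllib.request

def build_scraper_request(webhook_url: str, search_url: str) -> urllib.request.Request:
    """Build a POST request asking the scraper to crawl the given search URL.

    The payload shape here is hypothetical; adjust the field names to match
    whatever input your scraper actually accepts.
    """
    payload = json.dumps({"searchUrl": search_url}).encode("utf-8")
    return urllib.request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Usage: the webhook URL is a placeholder for the one copied from the scraper's settings
req = build_scraper_request(
    "https://example.com/scraper-webhook",
    "https://www.linkedin.com/search/results/content?sortBy=%22date_posted%22",
)
# urllib.request.urlopen(req)  # fire the request from your flow's HTTP step
```

In the flow itself this becomes a single HTTP-request step pointed at the copied webhook URL.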
Processing and Posting in ActivePress:
Note: “Database” here refers to any platform that lets us store/retrieve/filter records (e.g. Airtable, AITable, etc.)
- Create a new flow and set Schedule as the trigger (hourly, daily, etc.)
- Make an HTTP request to the crawler’s webhook URL
- Add a custom code block to parse the response from the webhook call
- Create a loop and check whether each post already exists in the Database
- If a post doesn’t exist, create a new record with the post details and store the post data in Storage. End the loop
- Get all the posts from Storage
- Add a condition to check whether any posts exist in Storage
- Loop through all the posts again
- Generate a post summary using OpenAI
- Add a custom code block to parse the response from OpenAI
- Post to Twitter or wherever you want => this is the step you’re interested in
- Mark the post as published in the Database. End the loop
- Delete all temporary data from Storage
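Strung together, the logic of those steps looks roughly like the sketch below. Everything external is stubbed out as an assumption: an in-memory dict stands in for the Database, a list for Storage, and `summarize`/`publish` are placeholders for the OpenAI and Twitter steps:

```python
# Sketch of one scheduled run. The names and stubs here are illustrative,
# not a real platform API: swap db/storage/summarize/publish for your
# Airtable, OpenAI, and Twitter integrations.

db: dict[str, dict] = {}      # stands in for Database records, keyed by post URL
storage: list[dict] = []      # stands in for the flow's temporary Storage

def summarize(text: str) -> str:
    """Placeholder for the OpenAI summary step."""
    return text[:100]

def publish(summary: str) -> None:
    """Placeholder for the Twitter (or other platform) posting step."""
    print(f"posting: {summary}")

def run_flow(scraped_posts: list[dict]) -> int:
    """One scheduled run: dedupe, summarize, publish, mark as published."""
    # Loop 1: keep only posts we haven't seen before
    for post in scraped_posts:
        if post["url"] not in db:
            db[post["url"]] = {"published": False, **post}
            storage.append(post)

    # Loop 2: summarize and publish everything staged in Storage
    published = 0
    if storage:
        for post in storage:
            summary = summarize(post["text"])
            publish(summary)
            db[post["url"]]["published"] = True
            published += 1
        storage.clear()  # delete all temp data from Storage
    return published
```

Running the flow twice with the same scraped posts publishes them only once, which is exactly the dedupe behaviour the schedule trigger needs.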
I kept everything very generic; depending on the tools and integrations you use, this can be simplified a lot.
If the crawler you use supports third-party integrations and scheduling, you can process the data in the crawler itself and save it to the Database.
Then you can create a new flow triggered by a new record in the Database, and everything becomes much simpler.
By the way, if you already have a flow for posting on LinkedIn and you want the same post summary published on other platforms, that’s very easy.
Here are a few screenshots of what the flow might look like.