There are three complementary but independent projects (project 3 will only be carried out if projects 1 and 2 are successful).
The work revolves around a job-offer feed: either creating a new feed from it, or integrating the feed's data into an Excel spreadsheet.
PROJECT 1:
- Create a new TARGET RSS feed based on one (or two) existing SOURCE RSS feeds, filtered on the exact criteria requested (criteria to be communicated). A SOURCE RSS feed is a series of job postings updated continuously in real time.
- Format the HTML description part of the TARGET RSS feed (reusing the formatting of a SOURCE RSS feed) so that the customer can send a formatted e-mail every day via Microsoft's Power Automate tool (the customer's environment is Microsoft); this e-mail part is mentioned only to explain the IT need.
- The development must expose a URL such as: https://domaine.fr/fa/rss
- Correction/validation iterations with the customer
- Start-up and hosting on a server matching the customer's technical environment (questions to ask at the start of the project: which server do I need to host the development, and can it run in the customer's environment?)
See the specifications in the attached file: Specifications_Feed 1_En_vFinale.pdf
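The filtering step above could be sketched as follows. This is a minimal illustration only: it assumes simple keyword matching on title/description, whereas the real criteria are still to be communicated by the customer, and it works on feed XML already downloaded as text.

```python
import xml.etree.ElementTree as ET

def filter_rss(source_xml: str, keywords: list[str]) -> str:
    """Build the TARGET feed: keep only <item> elements whose title or
    description contains one of the keywords (case-insensitive)."""
    root = ET.fromstring(source_xml)
    channel = root.find("channel")
    for item in list(channel.findall("item")):
        text = " ".join(
            (item.findtext(tag) or "") for tag in ("title", "description")
        ).lower()
        if not any(k.lower() in text for k in keywords):
            channel.remove(item)
    return ET.tostring(root, encoding="unicode")
```

In production, a small web app (e.g. behind https://domaine.fr/fa/rss) would fetch the SOURCE feed, run this filter, and return the result with a Content-Type of application/rss+xml so Power Automate can consume it.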
PROJECT 2:
- Scraping of the e-mail addresses (plus the employer field) contained in the following SOURCE RSS feed: https://www.emploi-territorial.fr/rss?
(this SOURCE RSS feed is a series of job advertisement publications updated continuously in real time).
The scraped e-mail addresses are inserted into a Microsoft Excel spreadsheet stored on the customer's SharePoint.
- Technical constraints: automatically update a Microsoft Excel spreadsheet online in the Microsoft environment from a URL pointing to the file, using the customer's SharePoint API keys to write to the target file.
- Apply the following processing in Excel after scraping:
- Structure the file in Tab 1 with 4 columns: e-mail (column A), the e-mail identifier before the @ (column B), the e-mail domain with a leading @ (column C), and the employer field from the feed (column D).
- Remove duplicates in column A, checking against the list of e-mails already present each time the Excel file is updated.
- Update frequency: every 6 hours (first cycle at 8 a.m. Paris time, then every 6 hours). Merge the information already in the file with the new information (cumulative update).
- Tab 2 contains in column A a manually maintained dynamic list of e-mail domains (with a leading @): delete from Tab 1 every row whose column C matches an e-mail domain listed in column A of Tab 2.
- Correction/validation iterations with the customer
- Start-up and hosting on a server matching the customer's technical environment (questions to ask at the start of the project: which server do I need to host the development, and can it run in the customer's environment?)
See the specifications in the attached file: Specifications_Feed 2_En_vFinale.pdf
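The column structure, duplicate removal, and Tab-2 domain exclusion described above can be sketched as a pure function. This is only an assumption-laden illustration: it takes (description text, employer) pairs already extracted from the feed, filters excluded domains at insert time (the specification also requires re-checking existing rows whenever Tab 2 changes), and leaves the actual SharePoint write-back aside.

```python
import re

# Simple e-mail pattern; the real feed may need a stricter one.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def rows_from_items(items, existing_emails, excluded_domains):
    """items: iterable of (description_text, employer) pairs.
    Returns Tab 1 rows [e-mail, part before @, @domain, employer],
    skipping e-mails already in existing_emails (column A of the file)
    and domains listed in Tab 2 (excluded_domains, each with leading @)."""
    rows = []
    seen = set(existing_emails)
    for text, employer in items:
        for email in EMAIL_RE.findall(text):
            local, _, domain = email.partition("@")
            at_domain = "@" + domain
            if email in seen or at_domain in excluded_domains:
                continue
            seen.add(email)
            rows.append([email, local, at_domain, employer])
    return rows
```

For the SharePoint write-back, the Microsoft Graph workbook API (adding rows to an Excel table in a file addressed by URL/drive item, authenticated with the customer's credentials) is the natural fit for the stated constraint; the scheduler (e.g. cron at 08:00 Europe/Paris, then every 6 hours) lives on the hosting server.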
PROJECT 3:
- Scraping of 4 data items contained in one (or two) existing SOURCE RSS feeds, filtered according to the exact criteria requested (exclusion criteria to be communicated by the customer; the criteria are identical to the "Project 1" requirements described above).
(This SOURCE RSS feed is a series of job postings updated continuously in real time.) The scraped data is inserted into a Microsoft Excel spreadsheet stored on the customer's SharePoint.
- Technical constraints: automatically update a Microsoft Excel spreadsheet online in the Microsoft environment from a URL pointing to the file, using the customer's SharePoint API keys to write to the target file.
- Apply the following processing in Excel after scraping:
- Structure the file with 13 columns: publication date (column A), employer name (column E), position name (column F), e-mail (column G), position URL (column H); the other columns remain empty (the headers in row 1 of the file must not be modified).
- Update frequency: automatic, every 24 hours (starting at 8 a.m. Paris time). Merge the information already in the spreadsheet with the new information (cumulative update).
- Correction/validation iterations with the customer
- Start-up and hosting on a server matching the customer's technical environment (questions to ask at the start of the project: which server is needed to host the development, and can it run in the customer's environment?)
See the specifications in the attached file: Specifications_Feed 3_En_vFinale.pdf
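The 13-column mapping above can be sketched as follows. Which RSS element carries each field is an assumption (e.g. employer in <author>, e-mail inside <description>): the real mapping depends on the SOURCE feed and the customer's criteria.

```python
import re
import xml.etree.ElementTree as ET

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def item_to_row(item: ET.Element) -> list[str]:
    """Map one RSS <item> to a 13-column row: publication date in
    column A (index 0), employer in E (4), position in F (5),
    e-mail in G (6), position URL in H (7); other columns stay empty
    so the row-1 headers of the target file are never touched."""
    row = [""] * 13
    row[0] = item.findtext("pubDate") or ""
    row[4] = item.findtext("author") or ""
    row[5] = item.findtext("title") or ""
    m = EMAIL_RE.search(item.findtext("description") or "")
    row[6] = m.group(0) if m else ""
    row[7] = item.findtext("link") or ""
    return row
```

For the cumulative daily update, the position URL (column H) is a plausible deduplication key when appending new rows below the existing ones, with the same Graph-API-based write to SharePoint as in Project 2.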
We'll take the time to explain and answer your questions!
Thank you
Hourly Range: $10.00-$40.00
Posted On: August 12, 2024 18:12 UTC
Category: Data Extraction
Skills: Data Scraping, Data Extraction, API Integration, Microsoft Excel, RSS, Developmental Editing, Automation
Country: FRA
Project ID: 3432682