We need to get all vehicle listings on eBay Motors via the eBay Motors API. The following fields need to be collected for each posted vehicle:
Year, make, model, trim, mileage, VIN, title type, body type, exterior color, transmission, number of cylinders, fuel type, number of doors, drive type, number of views, number of watchers, ad number, price, location, date posted, date updated, listing title, URL, and description (text only).
We already have an API key.
The script must be written in Python 3 and run inside a Docker container. It must be safe to run repeatedly in that container.
The script needs to be able to run continuously by itself, either back-to-back or waiting a configurable interval (in minutes) after each run finishes before starting again.
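That run-forever-or-wait behavior could be sketched as below (function and parameter names are illustrative, not part of the spec; `max_runs` exists only so a test can bound the loop):

```python
import time

def main_loop(run_once, interval_minutes=None, max_runs=None):
    """Call run_once() repeatedly, sleeping between passes.

    With interval_minutes=None the script does a single pass; with 0 it
    runs continuously. max_runs is a test hook only; use None in production.
    """
    runs = 0
    while True:
        run_once()
        runs += 1
        if interval_minutes is None:
            break  # no interval configured: run once and exit
        if max_runs is not None and runs >= max_runs:
            break
        time.sleep(interval_minutes * 60)
    return runs
```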
The script needs to support every search filter the eBay API offers. All filters must be optional (the script must run with no filters at all).
The script needs to be able to accept all of the following as optional arguments:
- filters from eBay
- debug mode
- verbose mode
- interval (in minutes) to wait before restarting
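Those optional arguments could be wired up with `argparse` along these lines (the flag names and the subset of eBay filters shown are illustrative, not mandated):

```python
import argparse

def build_arg_parser():
    """Build the CLI parser; every argument is optional per the spec."""
    parser = argparse.ArgumentParser(description="eBay Motors listing scraper")
    parser.add_argument("--debug", action="store_true", help="enable debug mode")
    parser.add_argument("--verbose", action="store_true", help="enable verbose output")
    parser.add_argument("--interval", type=int, default=None,
                        help="minutes to wait before restarting; omit to run once")
    # eBay search filters, all optional (illustrative subset of the full set):
    parser.add_argument("--make", help="vehicle make filter")
    parser.add_argument("--model", help="vehicle model filter")
    parser.add_argument("--min-price", type=float, help="minimum price filter")
    parser.add_argument("--max-price", type=float, help="maximum price filter")
    return parser
```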
It needs to feed the data into our existing MySQL 5.7 second-generation InnoDB database on Google Cloud Platform, connecting through the Google Cloud SQL Proxy, using the following table structure:
Please see the attached file for the database table structure / DDL and some details about the fields.
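Connecting through the Cloud SQL Proxy usually means talking to a local TCP port that the proxy forwards to the Cloud SQL instance. A sketch using the third-party PyMySQL package (host, port, and credential handling here are assumptions, not from the attached DDL):

```python
def cloud_sql_proxy_params(user, password, database):
    """Connection parameters for a Cloud SQL Proxy listening on localhost.

    Assumes the proxy runs alongside the script in the Docker container,
    forwarding 127.0.0.1:3306 to the Cloud SQL instance.
    """
    return {
        "host": "127.0.0.1",  # the proxy listens locally
        "port": 3306,
        "user": user,
        "password": password,
        "database": database,
        "charset": "utf8mb4",
    }

def connect(user, password, database):
    """Open a connection through the proxy (requires the PyMySQL package)."""
    import pymysql  # imported lazily so the module loads without the dependency
    return pymysql.connect(**cloud_sql_proxy_params(user, password, database))
```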
The API method returns at most 10,000 results per query, in pages of 100 results each, and limits us to 5,000 calls per day.
The script will need to work with these constraints to get all new listings and all updated listings since it was last run every time.
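Those limits reduce to simple arithmetic: 10,000 results at 100 per page is at most 100 calls per query, and the 5,000-call daily budget caps how many such queries fit in a day. A sketch of that bookkeeping (function names are illustrative):

```python
import math

PAGE_SIZE = 100               # results per page (API maximum)
MAX_RESULTS_PER_QUERY = 10_000
DAILY_CALL_LIMIT = 5_000

def pages_needed(total_results):
    """Number of API calls needed to page through one query's results."""
    capped = min(total_results, MAX_RESULTS_PER_QUERY)
    return math.ceil(capped / PAGE_SIZE)

def queries_fitting_budget(total_results, calls_used_today=0):
    """How many queries of this result size still fit in today's call budget."""
    per_query = pages_needed(total_results)
    remaining = DAILY_CALL_LIMIT - calls_used_today
    return remaining // per_query if per_query else remaining
```

When a search matches more than 10,000 listings, the usual workaround is to split it into narrower queries (for example by price band or date range) so each stays under the cap.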
The call to the API should filter the results to:
- location = US, Canada
- ListingType = AuctionWithBIN, Classified, and FixedPrice
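With eBay's Finding API (`findItemsAdvanced`), constraints like these map to `itemFilter` entries; a sketch of how the two filters above might be built (filter names follow the Finding API's conventions, but verify them against the API version actually in use):

```python
def build_item_filters():
    """itemFilter entries restricting location and listing type.

    Names follow eBay's Finding API docs; confirm against the live API.
    """
    return [
        {"name": "LocatedIn", "value": ["US", "CA"]},
        {"name": "ListingType",
         "value": ["AuctionWithBIN", "Classified", "FixedPrice"]},
    ]
```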
The script should store all times in UTC.
Each time the script is run, it needs to get the data from all the listings that have been created or updated since the last time it ran (even if the last time it ran was days or weeks ago).
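Persisting the last successful run time (in UTC) and filtering on it gives that "everything since last run" behavior regardless of how long ago the previous run was. A sketch, assuming the API accepts an ISO-8601 timestamp in a `ModTimeFrom`-style filter (as the Finding API does):

```python
from datetime import datetime, timezone

def utc_now_iso():
    """Current UTC time in the ISO-8601 form eBay's APIs expect."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

def mod_time_filter(last_run_utc):
    """Filter entry selecting listings created/updated since the last run.

    last_run_utc is the timestamp persisted at the end of the previous run;
    if the script has never run, return None and fetch everything.
    """
    if last_run_utc is None:
        return None
    return {"name": "ModTimeFrom", "value": [last_run_utc]}
```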
Each time the script finds a listing that already exists in the database, it needs to update all fields in the row in the database. It should also update the date_updated field each time.
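The create-or-update requirement maps naturally onto MySQL 5.7's `INSERT ... ON DUPLICATE KEY UPDATE`, keyed on the listing's ad number. A sketch that only builds the statement text (table and column names are placeholders, not the real DDL):

```python
def build_upsert_sql(table, columns, key_column="ad_number"):
    """INSERT ... ON DUPLICATE KEY UPDATE statement with %s placeholders.

    On a duplicate key, every non-key field is overwritten and date_updated
    is refreshed, per the spec. Column names here are placeholders.
    """
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    updates = ", ".join(
        f"{c} = VALUES({c})" for c in columns if c != key_column
    )
    return (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON DUPLICATE KEY UPDATE {updates}, date_updated = UTC_TIMESTAMP()"
    )
```

Using `UTC_TIMESTAMP()` rather than `NOW()` keeps the stored times in UTC as required.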
All fields need to have commas removed from them before they enter the database.
Any time a field does not have a value or has 'not specified' as the value, it should be stored as NULL.
The location field should be stored as "city state zip".
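The three cleanup rules above (strip commas, NULL out empty or "not specified" values, and format the location) can live in a pair of helpers. A sketch; treating any case variant of "not specified" as missing is an assumption about how eBay reports absent fields:

```python
def clean_field(value):
    """Strip commas and map empty or 'not specified' values to None (SQL NULL)."""
    if value is None:
        return None
    cleaned = str(value).replace(",", "").strip()
    if not cleaned or cleaned.lower() == "not specified":
        return None
    return cleaned

def format_location(city, state, zip_code):
    """Render the location as 'city state zip', skipping missing parts."""
    parts = [clean_field(p) for p in (city, state, zip_code)]
    joined = " ".join(p for p in parts if p)
    return joined or None
```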
The script needs to output everything to a log file for debugging.
The script should exit gracefully on Ctrl+C.
The script needs to set source = 'ebay' in its SQL statements, since the table also contains records from other sources.
Each time the script finishes running, it should output some status elements (such as how many items were created vs updated).
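The end-of-run status report can be driven by simple counters incremented as listings are inserted or updated (sketch; the counter keys are illustrative):

```python
from collections import Counter

def summarize_run(stats):
    """One log-friendly summary line; Counter keys default to zero."""
    return (
        f"run complete: {stats['created']} created, {stats['updated']} updated, "
        f"{stats['api_calls']} API calls used"
    )
```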
Please let me know which days you will be working on my project, and how many hours / day.
The code should be self documenting (good method names, variable names, structure, etc.).
All of your code needs to be well commented.
Your code needs to have good error handling.
Your code needs to follow good programming practices and standards.
We will manage all code and issues via GitHub. You will use the Projects feature of GitHub to show work you have not started, work in progress, and work completed.
I expect that you will complete it in the timeframe you quote. Otherwise, we will cancel the contract.
Please feel free to ask any questions you need!
Hi Paul, I am interested in your project to build a scraper for the eBay Motors API. Please send me a message so that we can discuss all the details. Thanks, Ramzi
Hi there! I have read your description carefully. I have strong Python skills and can scrape the data you need. I am happy to manage the work via GitHub as you suggest. Please contact me to discuss further.
Hi, I am experienced in scraping and have done many similar jobs, which you can see in my reviews. I have already developed some relevant code, so I can work quickly on your project. Would you please share the details?
Hello. Do you want the scraping results as a CSV file? I have extensive experience with web scraping in Python. Let's discuss the details over chat. Best regards.