We want to store the prices of our competitors in our database to monitor them. The prices need to be crawled from websites (three pages in total, one of them Amazon) and are partly available in CSV feeds. A SQL database should be used for storage, including a timestamp. Each update needs to be its own function that we can reuse in other Python web services.
Eventually, we want to deploy it on GCP and run the update a few times a day (out of scope for now). To start, running it locally is enough.
As we need the project quite urgently, we are willing to pay a higher price for a rather simple task.
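The requirements above (one function per update, SQL storage with a timestamp, local first) could be sketched roughly as follows. This is a minimal illustration, assuming SQLite for local storage and a hypothetical CSV feed with `sku` and `price` columns; all names (`init_db`, `store_price`, `update_from_csv`, the table layout) are placeholders, not part of the brief.

```python
# Sketch of the requested price updaters. Assumptions: SQLite locally,
# CSV feeds with "sku" and "price" columns. All names are illustrative.
import csv
import io
import sqlite3
from datetime import datetime, timezone

def init_db(conn: sqlite3.Connection) -> None:
    # One row per observed price, with a UTC timestamp for monitoring.
    conn.execute(
        """CREATE TABLE IF NOT EXISTS competitor_prices (
               competitor TEXT NOT NULL,
               sku        TEXT NOT NULL,
               price      REAL NOT NULL,
               fetched_at TEXT NOT NULL
           )"""
    )

def store_price(conn: sqlite3.Connection, competitor: str,
                sku: str, price: float) -> None:
    conn.execute(
        "INSERT INTO competitor_prices (competitor, sku, price, fetched_at) "
        "VALUES (?, ?, ?, ?)",
        (competitor, sku, price, datetime.now(timezone.utc).isoformat()),
    )

def update_from_csv(conn: sqlite3.Connection, competitor: str,
                    feed) -> int:
    # Each update is its own function, so other Python services can
    # import and call it directly (and, later, a GCP scheduler can too).
    rows = 0
    for row in csv.DictReader(feed):
        store_price(conn, competitor, row["sku"], float(row["price"]))
        rows += 1
    conn.commit()
    return rows

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    feed = io.StringIO("sku,price\nA123,19.99\nB456,5.49\n")
    print(update_from_csv(conn, "example-competitor", feed))  # 2
```

The website crawlers (including the Amazon page) would follow the same shape: one `update_from_<site>` function each that fetches and parses the page, then calls `store_price`, so all updaters share the storage path and can be scheduled independently.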
25 freelancers are bidding on average £28/hour for this job
Hi, I am an experienced web crawler developer, especially for Amazon products. I can also scale crawling instances on demand. I am available for further technical discussion. Sincerely, Yuri Ren
Hi, we have done a similar job before for a client based in the US. We can create a crawler that will get data from the websites according to your requirements. We can discuss this further; awaiting your reply. Thank you
I am a professional software developer based in Kiev, Ukraine, with about 20 years of experience. My website: [login to view URL] (you can contact me there). Good written English.