I need a web scraper written for the following url:
[login to view URL]
Login is required. Login credentials for the Fox Lumber available-loads page:
All information needed is available on the main page. The number of rows will vary.
Each record to scrape is separated by a text box; a new record starts at the green box containing the text "View & Bid".
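Under the assumption that plain-text splitting on the "View & Bid" marker is enough (the live page may need an HTML parser instead), the record separation described above could be sketched like this:

```perl
use strict;
use warnings;

# Split the fetched page text into one chunk per load. This is a sketch:
# it assumes each record begins at the "View & Bid" marker.
sub split_records {
    my ($page_text) = @_;
    my @chunks = split /View\s*&\s*Bid/, $page_text;
    shift @chunks;    # drop everything before the first marker
    return @chunks;
}

# Stand-in page text, since the real URL is withheld above:
my @records = split_records("menu View & Bid Origin: A, AA View & Bid Origin: B, BB");
print scalar(@records), " records\n";    # prints "2 records"
```

Because the number of rows varies, the scraper should loop over whatever `split_records` returns rather than assume a fixed count.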
The output should be a pipe (|) delimited file with the following column mappings:
origin_city --> data located after the text "Origin:", before the first comma
origin_state --> data located after the text "Origin:", after the first comma
ship_date --> data located after the text "Pickup Date", changed to the YYYY-MM-DD format
destination_city --> data located after the text "Destination:", before the first comma
destination_state --> data located after the text "Destination:", after the first comma
receive_date --> data located after the text "Delivery Date:", changed to the YYYY-MM-DD format
trailer_type --> data located after the text "Equipment:"
load_size --> if the text "partial" appears after "Equipment:", use the text "Partial" as the load_size;
if not partial, use the text "Full"
weight --> data located after the text "Weight:"
length --> Leave blank
width --> leave blank
height --> leave blank
trip_miles --> data located after the text "Miles:"
pay_rate --> data located after the text "Rate:"
contact_phone --> leave blank
contact_name --> leave blank
tarp_required --> leave blank
comment --> data located after the text "Comments:", plus "Quote:" with the numbers that
follow it, plus "PO Number:" with the numbers that follow it
load_number --> leave blank
commodity --> leave blank
The first line of the output should contain all of the column headers.
Any field that contains no data should be left blank.
Please do not use words like "null" or "blank" in blank columns.
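A minimal sketch of the field mappings above, assuming the labels appear one per line and dates arrive as MM/DD/YYYY (both assumptions must be checked against the live page; the regexes and sample values here are illustrative only). The header line of the output would be `join '|', @columns`:

```perl
use strict;
use warnings;

# Convert "MM/DD/YYYY" (assumed source format) to "YYYY-MM-DD".
sub to_iso_date {
    my ($d) = @_;
    return '' unless defined $d && $d =~ m{^(\d{1,2})/(\d{1,2})/(\d{4})$};
    return sprintf '%04d-%02d-%02d', $3, $1, $2;
}

# Extract the mapped fields from one record chunk. Field names follow
# the spec; the label regexes are assumptions until the page is inspected.
sub parse_record {
    my ($chunk) = @_;
    my %f;
    ($f{origin_city}, $f{origin_state}) =
        $chunk =~ /Origin:\s*([^,]+),\s*(\S+)/ ? ($1, $2) : ('', '');
    ($f{destination_city}, $f{destination_state}) =
        $chunk =~ /Destination:\s*([^,]+),\s*(\S+)/ ? ($1, $2) : ('', '');
    $f{ship_date}    = to_iso_date($chunk =~ /Pickup Date:?\s*(\S+)/  ? $1 : undef);
    $f{receive_date} = to_iso_date($chunk =~ /Delivery Date:\s*(\S+)/ ? $1 : undef);
    $f{trailer_type} = $chunk =~ /Equipment:\s*([^\n]+?)\s*$/m ? $1 : '';
    $f{load_size}    = $f{trailer_type} =~ /partial/i ? 'Partial' : 'Full';
    $f{weight}       = $chunk =~ /Weight:\s*(\S+)/   ? $1 : '';
    $f{trip_miles}   = $chunk =~ /Miles:\s*(\S+)/    ? $1 : '';
    $f{pay_rate}     = $chunk =~ /Rate:\s*(\S+)/     ? $1 : '';
    $f{comment}      = $chunk =~ /Comments:\s*([^\n]+)/ ? $1 : '';
    return \%f;
}

# Column order from the spec; unmapped columns stay empty, never "null".
my @columns = qw(origin_city origin_state ship_date destination_city
    destination_state receive_date trailer_type load_size weight length
    width height trip_miles pay_rate contact_phone contact_name
    tarp_required comment load_number commodity);

sub record_to_line {
    my ($f) = @_;
    return join '|', map { defined $f->{$_} ? $f->{$_} : '' } @columns;
}
```

The "Quote:" and "PO Number:" additions to the comment column are omitted here for brevity but would be appended inside `parse_record` the same way.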
Below is a sample output of the first 5 columns using sample data:
The deliverable will be a Perl .pl file that must run on
Ubuntu Linux and must use Modern::Perl. The Perl .pl file
should be called '[login to view URL]' and the output file should be
called '[login to view URL]'
It will be scheduled in cron to run unattended every 15 minutes.
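The 15-minute schedule could be expressed as a crontab entry like the following; the paths are placeholders, since the real script and output filenames are withheld above:

```cron
# Run the scraper every 15 minutes, appending any diagnostics to a log.
*/15 * * * * /usr/bin/perl /path/to/scraper.pl >> /path/to/scraper.log 2>&1
```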
Please specify what language/OS/modules you plan to use.
Also, please include the word "raccoon" in your bid so I know that
you read this description.