Web scraping is the process of extracting data or information from an online source such as a website, database, or application. Web Scraping Specialists have the skills to help people collect valuable digital data and quickly find the useful information they need from websites, mobile apps, and APIs. These experts typically use web scraping tools and advanced technologies to collect large amounts of targeted data without any manual work for the client.
With web scraping, tasks that would otherwise take a lot of time can be automated and completed faster. Our experienced Web Scraping Specialists use their expertise to develop scripts that continuously target structured and unstructured data sources.
Here are some projects that our expert Web Scraping Specialists made real:
Web Scraping Specialists are skilled professionals who know how to help businesses optimize processes while collecting the rich, structured data they need for their specific purposes. Our experts speed up the process and return accurate results in less time, so that customers can make better decisions more quickly without any manual labour. If you are looking for a talented professional to take on a web scraping project for you, you have come to the right place. Here on Freelancer.com you can find talented professionals who will get the job done with top-quality results! Post your project now and see what our Web Scraping professionals can do for you!
From 359,214 reviews, clients rate our Web Scraping Specialists 4.9 out of 5 stars.
I have a batch of consistently-formatted PDFs containing alphanumeric information that needs to live in Google Sheets. Every PDF follows the same layout, so once the first few rows are mapped the rest should flow smoothly. You will receive the full set of digital files plus a Sheet template with the exact column order I expect. Your job is to extract each field from every PDF and enter it into the corresponding cell, preserving all characters exactly as they appear (including spaces, dashes, or any other symbols that are part of the data). Deliverable: • A Google Sheet, fully populated and spot-checked for accuracy, ready for me to share with my team. If you prefer using automated tools—Python, App Script, or specialized PDF extraction software—that’s fine as long...
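For a project like this, a contractor might start from something like the minimal Python sketch below, assuming the PDFs are text-based (not scanned) and each field appears on its own labelled line. The field names, input folder, and CSV layout are hypothetical; the resulting CSV can then be imported into the Google Sheet template.

```python
# Minimal sketch: pull labelled fields from consistently formatted, text-based PDFs
# and write them to a CSV that can be imported into Google Sheets.
# Assumes each field appears as "Label: value" on its own line; labels are hypothetical.
import csv
from pathlib import Path

import pdfplumber

FIELDS = ["Invoice No", "Date", "Customer", "Amount"]  # hypothetical column order

def extract_fields(pdf_path):
    with pdfplumber.open(pdf_path) as pdf:
        text = "\n".join(page.extract_text() or "" for page in pdf.pages)
    row = {}
    for line in text.splitlines():
        for field in FIELDS:
            if line.startswith(field + ":"):
                # keep the value as printed, including internal spaces and dashes
                row[field] = line.split(":", 1)[1].strip()
    return [row.get(f, "") for f in FIELDS]

with open("output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS)
    for pdf_file in sorted(Path("pdfs").glob("*.pdf")):
        writer.writerow(extract_fields(pdf_file))
```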
I have a set of websites from which I need all relevant textual content pulled and organised neatly in an Excel workbook. The task is straightforward: visit each URL I provide, extract the required text sections, and enter them into clearly labelled columns so the file is ready for analysis the moment I open it. Accuracy and consistency matter more to me than speed. I’ll supply the list of sites, any field-by-field instructions, and a template sheet so you know exactly where each piece of text belongs. When you finish, I expect: • A clean, fully populated Excel file that matches my template • No missing entries or formatting issues If you automate the process with scripts or scraping tools, that’s fine—as long as the final spreadsheet is complete and human-...
I’m building an internal “SmartDoc” assistant that can digest practically any content I hand it—full PDFs, single HTML files, live web pages, images, even video transcripts—and then answer questions with proof in sight. Here’s what I need you to put together: • Ingestion & storage – Extract text from all supported formats. – Create embeddings and store them in more than one vector database; design it so I can toggle between, say, Chroma, Pinecone, Milvus, or any other store with minimal code changes. • Query workflow 1. Retrieve the most relevant chunks from every connected vector DB. 2. Hand those chunks to whichever LLM the user chooses. By default the call should hit my self-hosted model on a GPU server, ...
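One way to keep the vector store swappable, as this brief asks, is to code the query workflow against a tiny shared interface. The sketch below is a minimal illustration with an in-memory stand-in; Chroma, Pinecone, or Milvus would each get a thin wrapper implementing the same two methods. The interface and class names are hypothetical, and embedding and chunking are out of scope here.

```python
# Minimal sketch of a store-agnostic retrieval layer: every backend implements the
# same two methods, so swapping stores is a one-line change at the bottom.
from typing import Protocol

import numpy as np

class VectorStore(Protocol):
    def add(self, ids: list[str], embeddings: np.ndarray, texts: list[str]) -> None: ...
    def query(self, embedding: np.ndarray, k: int) -> list[str]: ...

class InMemoryStore:
    """Reference backend to prove the interface; real deployments would wrap Chroma etc."""
    def __init__(self):
        self._ids, self._texts, self._vecs = [], [], []

    def add(self, ids, embeddings, texts):
        self._ids += list(ids)
        self._texts += list(texts)
        self._vecs += [np.asarray(v, dtype=float) for v in embeddings]

    def query(self, embedding, k):
        q = np.asarray(embedding, dtype=float)
        # cosine similarity against every stored chunk, highest first
        sims = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9)) for v in self._vecs]
        top = np.argsort(sims)[::-1][:k]
        return [self._texts[i] for i in top]

store: VectorStore = InMemoryStore()  # swap in a Chroma/Pinecone/Milvus wrapper here
```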
Please gather verified contact information from vendors in open-air markets through direct, door-to-door outreach rather than scraping online directories, business websites, or social media. The data set must include for each business: • Phone number • Email address • Physical (stall or shop) address Record everything in a clean Excel or Google Sheet, one row per vendor, with separate columns for Business Name (if available), Phone, Email, and Address. Accuracy is essential, so double-check spellings and digits on site before you log each entry. Once complete, share the spreadsheet and any field notes you kept while collecting the information so I can cross-reference if needed.
Objective: a web chatbot powered by GPT (OpenAI) to handle leads, follow an editable Master Prompt, and capture data. Must-have: • Web widget (chat bubble) + optional embedded mode • Admin panel: • “Master Prompt” field • tone slider (0–100) • messages: greeting, re-greeting after inactivity, closing • Data capture with validation: • name, city/state, WhatsApp (10 digits), project type, start timeframe • Anti-repetition logic (do not ask for name/city again if already captured) • Re-greeting if the user does not respond (e.g. after 2 min and 10 min, maximum 2 attempts) • History + export to Google Sheets/CRM • Security: • do not expose the API key on the front end • rate limiting + logs Nice-to-have: ...
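The validation and anti-repetition rules above are straightforward to express in code. A minimal Python sketch, with hypothetical field names and session structure:

```python
# Minimal sketch of the lead-capture rules: validate a 10-digit WhatsApp number and
# only ask for fields that are still missing or invalid (anti-repetition).
import re

REQUIRED_FIELDS = ["name", "city_state", "whatsapp", "project_type", "start_time"]

def is_valid_whatsapp(value: str) -> bool:
    digits = re.sub(r"\D", "", value)   # keep digits only
    return len(digits) == 10            # spec: exactly 10 digits

def next_missing_field(session: dict) -> str | None:
    """Return the next field the bot should ask for, or None if capture is complete."""
    for field in REQUIRED_FIELDS:
        value = session.get(field, "").strip()
        if not value:
            return field
        if field == "whatsapp" and not is_valid_whatsapp(value):
            return field
    return None

session = {"name": "Ana", "city_state": "CDMX", "whatsapp": "55-1234-5678"}
print(next_missing_field(session))      # -> "project_type"
```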
I have a straightforward online research job that centres on gathering market-level information about product demand and supply. I will specify the product category, geographic focus, and any keywords you should track; from there you’ll search reliable sources—industry portals, news sites, marketplace listings, and government or trade reports—to capture the latest numbers, observations, and any notable gaps between demand and availability. What I need from you: • A concise write-up (1–2 pages) that explains current demand, available supply, and any clear opportunities or saturation points • The key data points or quotes that back up each finding, cited with live links so I can verify quickly • A short spreadsheet of raw figures or headlines you c...
Contest: Find Discounted Roblox Credits Gift Card (IDR) – Minimum 15% Off Hello Freelancers, I am looking for legitimate websites, marketplaces, or proven methods to purchase Roblox Credits Gift Cards in Indonesian Rupiah (IDR) with a minimum discount of 15% from the face value. Example Denomination: IDR 100,000 Purchase price: around IDR 85,000 Scope of the Task Participants are expected to find and explain one or more of the following: Trusted websites (local or international) that sell Roblox Gift Cards IDR at a discounted price Marketplaces or digital platforms that regularly offer promotions or price cuts Legal reseller programs, bulk purchase options, loyalty programs, or other safe and legitimate methods Ways to obtain consistent discounts, not one-time or expired pro...
I’m building a highly-targeted prospect list and need verified contact details for CEOs of companies that (1) appear on DesignRush under the Software Development category and (2) publicly market AI development services. Please confine the search to firms headquartered in North America. You’re free to use any combination of DesignRush filters, LinkedIn, Hunter, Apollo, or similar enrichment tools—as long as every email is deliverable and each company clearly positions itself as an AI developer. Deliverables • Spreadsheet (Excel or Google Sheets) containing: company name, website link, city / state, CEO full name, CEO email, LinkedIn profile URL, and a quick one-line note showing the company’s AI focus. • Minimum number of leads: propose what’s r...
I will supply about 50 simple search strings built from two variables—sector and location (for example, “builders London”). For each term, I need a clean, deduplicated list of all matching website addresses you can identify, aiming for at least 1,000 unique URLs in total. Feel free to combine your own AI-powered tools, data-scraping software, or manual research; I care only that the final list is accurate and complete. Please separate the sector, location, and discovered URL into their own columns, and ensure there are no duplicates across batches. I am flexible on delivery format—Excel, Google Sheets, or CSV all work—so pick whatever lets you move fastest. Before you begin the full run, send a short sample so we can confirm you are on the right track. Bid ...
Contact selection criteria: HoReCa sector (Hotels, Restaurants, Catering) + food industry companies. Area: Japan. Scope of data in the database: Company name (100%), Email (100%), Category/Industry (100%), Address (60%), Telephone number (50%). Additional requirements: - Deduplication required against provided mailing list - Delivery format: CSV or Excel
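The deduplication step could look roughly like the sketch below, assuming both files are CSVs keyed on an "Email" column; file names and headers are hypothetical.

```python
# Minimal sketch: drop any collected company whose email already appears in the
# client's provided mailing list, then write the remainder to a new CSV.
import csv

def load_emails(path, column="Email"):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f) if row.get(column)}

existing = load_emails("provided_mailing_list.csv")

with open("collected_horeca.csv", newline="", encoding="utf-8") as src, \
     open("horeca_deduplicated.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row["Email"].strip().lower() not in existing:
            writer.writerow(row)
```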
I have a stack of completed forms and survey sheets that now need to be keyed in by hand. Every response on each page must be transferred exactly as written into the template I’ll supply, preserving spelling, capitalization, and any skipped answers. Here’s what the job looks like from my side: I send you the forms in batches, you enter the data in the matching fields of the spreadsheet, double-check for typing mistakes, then return the file so I can run a quick spot-check before the next batch goes out. No automated scraping or OCR—this is strictly manual entry. Deliverable • A clean, fully populated Excel or Google Sheet that mirrors the original order of the forms, ready for immediate analysis. Acceptance criteria • 100 % of fields transcribed. •...
I Need an Advanced Automation Developer – Redirect-Based Booking Bot (Goethe, Wicket Apache, COE Session Handling) For Goethe Booking like Chennai and Bangalore I am looking for a high-level automation/bot developer who has experience with: Wicket/Apache-based web applications Multi-step redirect chains COE session initialization & dynamic token handling ColdFusion (CFID/CFTOKEN) & JSESSIONID flows Cookie extraction & accurate replay of Set-Cookie across redirects Chrome CDP automation / Playwright / Puppeteer High-speed DOM watching (DOMWatcher / MutationObserver) Proxy rotation & session isolation --- Project Goal Build a bot that can open the “Select Modules / Book” page reliably during Goethe exam seat drops, even when: normal browser ...
I need a browser extension that can inspect specific pages, pull out the text, images and any data the user enters into forms, then immediately push that payload to the user’s account on our existing app. The same codebase should work in Chrome (Manifest V3) and Safari. Core flow • When a user visits one of the target domains, the extension detects the page, scoops up the requested elements (text nodes, image URLs/blob data, form field values) and displays a quick “Save” action in the toolbar or within the page. • On click, the data is serialized, authenticated with the user’s token (already available in local storage) and posted to the app. • The extension also prompts the user to take certain actions on that webpage. Deliverables 1. Produc...
I am looking for a freelancer who can help me with scraping a real estate site into my database.
I want a clear picture of how retail prices for cooking oils compare across the market right now. Your task is to gather current shelf or online prices for olive oil, canola oil, and sunflower oil sold in 500 ml, 1 L, and 5 L packs. Please cover a mix of national chains, local stores, and major online retailers so we capture the full spread of channels. What I need from you: • A well-structured spreadsheet (Excel or Google Sheets) listing supermarket name, channel type, date, brand, oil type, pack size, regular price, promo price if visible, unit price per litre, and any notes on discounts or availability. • A concise analysis (about 500-700 words plus clear charts) that highlights average prices, lowest and highest price points, and patterns you notice—especially diffe...
The development of a Robotic Process Automation (RPA) bot is required to optimize and automate the process of uploading insurance policies to a third-party registration website. The main objective is to reduce manual workload and improve efficiency in policy registration. The bot must perform the following functions: 1. Data Extraction: Capture key information from our internal policy issuance software (initially by reading an Excel file, but with the capability to soon read data via an API call to our system). The data to be extracted includes: * Neighborhoods * Policy number * Insurance company * Policy start date * Policy end date * Insured party data (name, ID number, last name, occupation). 2. Conditional Insured Party Registration: If an insured party's ID number does not alr...
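A minimal Python sketch of steps 1 and 2, assuming the initial Excel export; the column names are hypothetical and the registration call is a placeholder for the real RPA/portal step (or the future API call).

```python
# Minimal sketch: read policy rows from the internal Excel export and only register
# an insured party whose ID number has not been seen before.
from openpyxl import load_workbook

def register_insured(party: dict) -> None:
    # Placeholder for the RPA step that fills the third-party registration form.
    print("registering", party["id_number"])

wb = load_workbook("policies.xlsx", read_only=True)
ws = wb.active
headers = [c.value for c in next(ws.iter_rows(min_row=1, max_row=1))]

already_registered: set[str] = set()  # in practice, checked against the portal
for row in ws.iter_rows(min_row=2, values_only=True):
    record = dict(zip(headers, row))
    insured_id = str(record.get("ID number", "")).strip()
    if insured_id and insured_id not in already_registered:
        register_insured({"id_number": insured_id, "name": record.get("Name")})
        already_registered.add(insured_id)
```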
I am trying to grow my list of emails for my business. I have a women's clothing brand, and I need to grow my email list of women customers in North America. Each day, I need at least 100 fresh, valid email addresses gathered from the web, verified, and added to our customer email list; you will also send 5 emails a week. You need to be comfortable finding the relevant emails yourself, adding them to our email list, and creating one email a day, Monday to Friday. This is an ongoing role.
Project Title: High-Volume Data Collection for AI-Powered Sports Card & Coin Grading **Budget:** $700–$1,200 (negotiable based on volume) **Duration:** 2–3 weeks **Project Overview:** We are building a cutting-edge vision LLM to become the world’s most accurate automated grader for sports cards and coins. To achieve this, we need a large, diverse dataset of high-quality photos showing both the front and back of graded sports cards (e.g., PSA, BGS) and graded coins (e.g., PCGS). Your role will be to source, organize, and deliver these images to train our AI models. **Key Requirements:** 1. **Image Specifications:** - Minimum 10,000 unique graded sports cards (baseball, basketball, football, hockey, and Pokemon) to each numerical grade and 7,000 unique...
I need a meticulous researcher to compile an up-to-date catalogue of Indian credit cards using only publicly available information from official bank sites and the RBI portal. Every entry should include the card name, issuing bank, a concise description of its rewards and cashback programs, travel benefits, and any low-interest features. Please add a detailed breakdown of all fees and charges you can locate—joining, annual, finance, late-payment, and foreign-transaction fees in particular. The final spreadsheet must be delivered in MS Excel, laid out cleanly with consistent column headings, clear currency formatting, and zero merged cells so that filtering and pivoting work smoothly. No scraping or automation: I want manually verified data with authoritative source links noted for...
I want a self-running workflow, built in n8n or an equally flexible platform, that pulls fresh B2C leads directly from a set of consumer-focused websites. The job is to configure the scraping logic, map the fields I need, and route each new lead automatically. I expect you to: • Build and document the scraping workflow • Set up a daily schedule and basic error handling/retry logic • Provide a clear hand-off guide The workflow should run without errors and populate the sheet with valid leads.
I need someone to migrate the content of roughly 200 Reddit posts into my new app. Each post contains between four and ten images. Every image has about 500 characters of accompanying text that must be copied over via simple copy-paste—no re-typing. Your task is to: • Download every image in full resolution. • Copy the text that appears directly beneath each image and paste it beside the corresponding image in the app. I only need the plain text; do not attempt to make links clickable. There are no special file-naming rules or folder structures to follow—the app organises everything once the images and text are in place. You can work in any order you prefer as long as every post is completed. When you reply, please quote your exact price for the entire jo...
I have a JSON file that needs to be read and its values written into a web form that contains well over ten separate text fields. The form has no checkboxes or dropdowns, so the task focuses purely on populating text inputs. A handful of those fields come with their own validation rules—length restrictions, date formatting, or specific character sets—and the solution has to respect these checks before the form can be submitted. My ideal workflow is a small, easily adaptable script (Python + Selenium, JavaScript + Puppeteer, or any equivalent you prefer) that: • loads the JSON, • maps each key to the correct input element, • applies or mimics the existing in-page validation where required, and • submits the form (or shows clear errors) once all data ...
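As a rough illustration of the Python + Selenium option, the sketch below loads the JSON and fills text inputs by element ID. The URL, field IDs, and JSON keys are hypothetical, and the page's real validation rules would be mirrored per field before submitting.

```python
# Minimal sketch: map JSON keys to form text inputs and submit.
import json

from selenium import webdriver
from selenium.webdriver.common.by import By

FIELD_MAP = {                     # JSON key -> input element id (hypothetical)
    "first_name": "firstName",
    "birth_date": "dob",          # e.g. must match DD/MM/YYYY on the page
    "reference":  "refCode",      # e.g. max 12 characters
}

with open("data.json", encoding="utf-8") as f:
    payload = json.load(f)

driver = webdriver.Chrome()
driver.get("https://example.com/form")          # placeholder URL

for key, element_id in FIELD_MAP.items():
    field = driver.find_element(By.ID, element_id)
    field.clear()
    field.send_keys(str(payload.get(key, "")))

driver.find_element(By.ID, "submit").click()    # or surface validation errors instead
driver.quit()
```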
Title: Build Hybrid Automation ORM Software (APIs + RPA + AI) Description: We are building CyberRepute, an automated Online Reputation Management (ORM) platform for 1500+ hotel clients. We need an experienced developer/team to help implement a hybrid automation system combining: Official APIs (Google Reviews, , Agoda) RPA / Power Automate for restricted portals (Medallia Brand Panels, Expedia, TripAdvisor) AI integration for sentiment analysis & response generation (Google Gemini) Web dashboard for human approval (React / Node.js) Key Responsibilities: Implement review fetching via APIs & RPA Integrate Google Gemini for AI-generated responses Build approval workflow dashboard Automate posting of approved responses back to platforms Ensure scalability for 1500+ hotel acc...
UK – Lead Scrape Specialist Project Description We are seeking an experienced Lead Scrape Specialist to collect verified B2B leads for companies in the Translation & Localization industry based in the UK. The primary data source for this project will be ZoomInfo. However, professionals with proven experience using RocketReach, Lusha, or similar B2B databases are also encouraged to apply. Accuracy, relevance, and ethical data sourcing are critical for this project. ________________________________________ Data Required (Per Company) • Company Name • Contact Person’s Name • Designation / Job Title • Business Email ID only (No Gmail, Yahoo, Outlook, Hotmail, or other free email domains) ________________________________________ Project Scope • Minimum 1,000 ...
I already have a lead list just shy of 5,000 rows that includes full names, company names, phone numbers, and—in most cases—the corporate website URL. What I am missing are accurate business-appropriate email addresses and LinkedIn URLs. I need you to enrich every row with both the personal profile and the company profile links on LinkedIn, then append the best-match email for each contact. The finished file should come back to me as a single spreadsheet—Excel or Google Sheets is fine—with the new columns added alongside my existing data. I will spot-check for valid syntax, matching names, and live LinkedIn pages, so please run any necessary verification before delivery.
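Part of this kind of enrichment is often pattern-based. Below is a minimal sketch that derives candidate emails from the existing name and website columns and keeps only syntactically valid ones; these are common naming conventions, not guaranteed matches, and a proper deliverability check would still be needed before delivery.

```python
# Minimal sketch: build candidate business emails from "First Last" + company website.
import re
from urllib.parse import urlparse

EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def candidate_emails(full_name: str, website: str) -> list[str]:
    domain = urlparse(website).netloc.removeprefix("www.")
    first, *rest = full_name.lower().split()
    last = rest[-1] if rest else ""
    patterns = [f"{first}.{last}", f"{first}{last}", f"{first[0]}{last}", first]
    return [f"{p}@{domain}" for p in patterns if p and EMAIL_RE.match(f"{p}@{domain}")]

print(candidate_emails("Jane Doe", "https://www.example.com"))
# ['jane.doe@example.com', 'janedoe@example.com', 'jdoe@example.com', 'jane@example.com']
```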
I want to pull a clean, reliable snapshot of the 100 best-selling home-lighting items on MercadoLibre Uruguay () by querying their official API. What I need from you: • An Excel file containing for each product: – main image URL (or embedded image) – listing title – current price – discount percentage, when available – seller name / nickname – product rating – total units sold – clickable link to the listing I’ll use that table for a commercial market study, so data accuracy matters more than fancy formatting. If MercadoLibre requires pagination, auth tokens or rate-limit handling, please build those in; the solution must comply with their API terms. Deliverables The populated spreadsh...
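A rough sketch of the API-to-Excel flow is below. The endpoint, the MLU (Uruguay) site id, the sort parameter, and the response field names are assumptions to verify against MercadoLibre's current documentation and terms; pagination and auth tokens would be added as required.

```python
# Minimal sketch: query MercadoLibre's public search API and dump results to Excel.
import requests
from openpyxl import Workbook

URL = "https://api.mercadolibre.com/sites/MLU/search"   # assumed public search endpoint
params = {"q": "iluminacion hogar", "sort": "sold_quantity_desc", "limit": 50}

results = requests.get(URL, params=params, timeout=30).json().get("results", [])

wb = Workbook()
ws = wb.active
ws.append(["Image", "Title", "Price", "Seller", "Sold", "Link"])
for item in results:
    ws.append([
        item.get("thumbnail"),
        item.get("title"),
        item.get("price"),
        (item.get("seller") or {}).get("nickname"),
        item.get("sold_quantity"),
        item.get("permalink"),
    ])
wb.save("mercadolibre_lighting.xlsx")
```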
I need a clean, reliable list of email addresses pulled from three sources—Facebook, Twitter, and the public-facing pages of selected company websites. Once gathered, every address should be entered into an Excel file laid out in multiple columns so I can immediately sort and filter by details like platform, page URL, company name (when obvious), and, of course, the email itself. I haven’t locked in a particular scraping method, so whether you rely on Python, Selenium, Beautiful Soup, custom APIs, or a well-honed manual workflow is up to you—as long as the data is accurate and the final workbook opens without errors. Quality matters more than volume; duplicates, malformed addresses, or non-existent inboxes won’t be acceptable. If a page or site blocks automated ac...
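For the plain-website portion of this job, the core extraction step can be as small as the sketch below; the URL list is a placeholder, and Facebook and Twitter generally require their own APIs or manual collection rather than direct page scraping.

```python
# Minimal sketch: fetch public pages, collect anything matching an email regex,
# and record each hit with its source URL in a CSV.
import csv
import re

import requests

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
urls = ["https://example.com/contact"]      # placeholder list of public pages

rows = []
for url in urls:
    try:
        html = requests.get(url, timeout=20).text
    except requests.RequestException:
        continue                            # log and move on; do not fail the batch
    for email in sorted(set(EMAIL_RE.findall(html))):
        rows.append({"platform": "website", "page_url": url, "email": email})

with open("emails.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["platform", "page_url", "email"])
    writer.writeheader()
    writer.writerows(rows)
```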
I want the last two years of China-to-India shipment information pulled from reliable trade-intelligence platforms such as Zauba, TradeAtlas, Volza or any equivalent source you already subscribe to. The spreadsheet must be company-wise, and I need exporter names to be rock-solid and complete, even if that means cross-checking more than one database. Importer names, quantity, value (in USD) and the HS code in 6-digit format should accompany each line so the file is ready for immediate analysis. Please scrape, clean and normalise the raw downloads so there are no duplicates, merged cells or hidden characters. Keep column headers simple and make sure numeric fields are truly numeric—I will be running pivot tables the moment I receive the file. Deliverables • One Excel workbook ...
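The cleaning and normalisation pass described here maps naturally onto pandas. A minimal sketch, with hypothetical file and column names:

```python
# Minimal sketch: strip hidden characters, force numeric columns to numeric,
# standardise HS codes to 6 digits, and drop duplicate shipment rows.
import pandas as pd

df = pd.read_excel("raw_shipments.xlsx", dtype=str)

# remove non-breaking spaces and stray whitespace that break pivot tables
df = df.apply(lambda col: col.astype("string").str.replace("\u00a0", " ", regex=False).str.strip())

df["Value (USD)"] = pd.to_numeric(df["Value (USD)"], errors="coerce")
df["Quantity"] = pd.to_numeric(df["Quantity"], errors="coerce")
df["HS Code"] = df["HS Code"].str.replace(r"\D", "", regex=True).str[:6]

df = df.drop_duplicates()
df.to_excel("shipments_clean.xlsx", index=False)
```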
I have a batch of information sitting on specific web pages that I need transcribed with precision into my own files. The job is purely text-based—no numbers or mixed content—so keen attention to spelling, punctuation, and layout consistency is essential. You will receive a list of URLs plus a simple template (Google Sheet or Word doc, whichever you prefer). Your task is to open each page, capture the required text exactly as it appears—headings, body copy, and any relevant captions—and paste it into the template in the correct fields. Please keep the original formatting cues (line breaks, bullet points) so the content stays readable. Accuracy and speed are my main priorities. If you spot obvious typos on the source page, flag them in a separate column rather than...
I was working on a business deal with the contact through intermediaries, but they have since dropped out, and I still need to work with my contact. I have a full CIS with personal information on my prospective contact. I just need to find his WhatsApp number so I can continue.
I have a batch of digital documents—mainly PDFs and Word files—that contain information I need to capture, clean, and load directly into my database. The job is two-fold: first, mine the required data points from each file (think product specs, contact details, and any other fields that repeat across the set); second, structure those findings so they import smoothly into my existing tables. Accuracy matters more than speed here, so I’m looking for someone comfortable cross-checking entries, normalising formats, and flagging anything that looks inconsistent. If you prefer to automate parts of the workflow with Python, Excel Power Query, or similar scraping tools, that’s absolutely fine—as long as the final deliverable lands in my database exactly as outlined. ...
Hotel amenities suppliers in North India, state-wise. You can extract from any website or other source, but you have to state the source, and each and every entry should be accurate and precise, as they will be checked.
We are looking for an experienced online research specialist to help us build the most comprehensive possible list of people who have attended a specific school over its lifetime. (No placeholder bids) Objective Create an advanced alumni name list by identifying as many former attendees as possible using ethical methods Scope of Work The freelancer will: Research publicly available sources to identify alumni names, including: websites like Discord and Luma Articles, blogs, interviews Event pages Alumni posts and testimonials Identify patterns, cohorts, timelines, and name variants Build a clean, structured database of names with source references Confirmed attendees Probable attendees Unverified mentions Allowed & Expected Methods Manual and automated data gathering...
We are seeking a skilled freelancer to extract specific data from websites. The ideal candidate will have experience in web scraping and data mining, with the ability to handle large datasets efficiently. The task involves collecting data from various online sources and organizing it into a structured format for analysis. Attention to detail and the ability to work independently are essential.
I have a batch of documents that must be transferred accurately into a web-based form. Everything you need to type will be visible in the scans/PDFs I supply; no online research or scraping is involved—just careful, precise data entry. What I need from you • Open each provided document, read the required fields, and key the information into the corresponding online form. • Double-check spelling, numbers, and formatting before submitting each record so that every entry matches the source exactly. • Keep me updated on your progress in case I need to clarify any abbreviations or handwriting. I’ll give you clear instructions, form credentials, and a small test file so we are both confident about the process before you dive into the full set. Fast turnaround i...
Project Overview I am seeking a high-level Data Engineer/Lead Generation specialist to build a comprehensive, deduplicated global database of Argentine Tango events (Festivals, Marathons, and recurring local Milongas). This project involves scraping niche directories that utilize complex dynamic loading, infinite scroll, "lazy loading," and embedded widgets. Scope of Work Data Extraction: Target ~3,000 Festivals/Marathons and ~30,000–50,000 local Milongas worldwide. Enrichment: Use tools like Apollo, Clay, or to find missing organizer emails and WhatsApp numbers. Deduplication: Merging records from multiple sources (e.g., if Source A has the time and Source B has the WhatsApp, the final record must contain both). Target Data Fields (The Schema) Every record must be del...
I have a ready-made list of product names. What I need is an end-to-end pipeline that, on every run, will: • Open a Google-only search for each product name • Capture the first eight organic result URLs in ranking order • Visit those pages, pull the raw HTML, and pass it to Gemini or the ChatGPT API so the model can reliably extract price, currency, brand, description and any stated delivery time • Write the collected data to both CSV and JSON, saving fresh files after every run and tagging them with a timestamp • Survive the real-world web: captchas, bot checks, rate limits, timeouts, pages that hold multiple variants, or sites that omit a field entirely all need graceful handling and clear logging. Please package the code so I can spin it up on my own machine or ...
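Structurally, a pipeline like this is a loop with heavy error handling around every network step. The sketch below shows the control flow only; the search, fetch, and LLM calls are placeholders, and a real build would plug in a compliant search API, captcha handling, retries, and backoff.

```python
# Minimal sketch of the pipeline skeleton: search, fetch, LLM extraction, then
# timestamped CSV and JSON output. All network calls are placeholders.
import csv
import json
import time
from datetime import datetime

def search_top_urls(product: str, n: int = 8) -> list[str]:
    raise NotImplementedError("plug in a permitted search API here")

def fetch(url: str) -> str:
    raise NotImplementedError("HTTP GET with timeout, retries and backoff")

def extract_with_llm(html: str) -> dict:
    raise NotImplementedError("call Gemini / ChatGPT with an extraction prompt here")

def run(products: list[str]) -> None:
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    records = []
    for product in products:
        for url in search_top_urls(product):
            try:
                records.append({"product": product, "url": url, **extract_with_llm(fetch(url))})
            except Exception as exc:           # log failures instead of aborting the run
                records.append({"product": product, "url": url, "error": str(exc)})
            time.sleep(2)                      # crude politeness / rate limiting
    with open(f"results_{stamp}.json", "w", encoding="utf-8") as f:
        json.dump(records, f, ensure_ascii=False, indent=2)
    fields = sorted({k for r in records for k in r})
    with open(f"results_{stamp}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(records)
```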
B2C marketplace scraper needed to scrape images, text and videos from online stores into an eBay listing on eBay UK and eBay USA. eBay has its own API for listing items. The software needs to be user-friendly, easy to use, and a web application. This is a simple project and there are similar websites that do the same, but your software will offer more websites to scrape from. Key Specs: -Make an eBay title using AI (ChatGPT); you will be given a prompt (the software can change this prompt) -Copy images from the description -Copy text from the description -Multi-platform scraping -Copy images from the listing gallery -Make an eBay description using AI (ChatGPT) -Add item specifics into eBay -Scrape the videos from marketplaces using a Chrome extension -Change the language usi...
I’m building an AI-powered web app that lets people search their own professional network as easily as they search the web. Users will be able to upload contact data coming from CSV files, LinkedIn exports, or Google Contacts; the system then parses and indexes the information so that a natural-language query—typed or spoken—instantly returns the most relevant contacts along with a short “why this matches” explanation. Core flow • Secure upload and parsing of the files above • Extraction of company, industry, location, and skills, then storage in a vector-friendly database • Natural-language and voice search that ranks contacts semantically and returns short rationale sentences • Clean, responsive UI (desktop and mobile) that shows...
I need a small but rock-solid tool that checks the UK government driving-test booking site around the clock, spots newly released or cancelled slots, “swaps” them against an existing booking ID I will supply, and pings me the moment a match appears. Key points • Continuous 24/7 polling with smart back-off so we never get banned. • Full captcha / session, CSRF, and queue handling built in. • Clean JSON output only; each record must include test-centre ID, date, time, and the timestamp you found it. • Preferred stack is JavaScript (Node) but I’m open to Python if you already have a proven framework. Deliverables 1. Scraper daemon with configurable polling interval and centre filters. 2. Swap-logic that injects my booking details and con...
I need assistance with extracting and organizing files from a ZIP archive. Requirements: - Extract files from a ZIP file - Organize extracted files into folders by file type Ideal Skills: - Experience with file extraction tools - Organizational skills to create a tidy folder structure
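This is a small scripting task. A minimal Python sketch of both steps, assuming placeholder paths:

```python
# Minimal sketch: extract a ZIP archive, then move each file into a folder named
# after its extension.
import shutil
import zipfile
from pathlib import Path

archive = Path("input.zip")
work_dir = Path("extracted")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(work_dir)

for file in [p for p in work_dir.rglob("*") if p.is_file()]:
    ext = file.suffix.lower().lstrip(".") or "no_extension"
    target = work_dir / ext
    target.mkdir(exist_ok=True)
    shutil.move(str(file), target / file.name)
```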
I need a ZIP file downloaded from an online storage service (like Google Drive or Dropbox) and extracted. After extraction, the files should be saved to a specific location. Requirements: - Access to online storage services - Ability to download and extract ZIP files - File organization skills Ideal Skills and Experience: - Experience with file handling and extraction - Familiarity with Google Drive, Dropbox, or similar services - Attention to detail and reliability
I already have a master list of company names—these companies import Jaggery from India, so all of them are food companies, most of them active in the Export & Import arena and located across several different countries. Your task is to enrich that list with reliable, up-to-date contact details. For each company, I need: • the official website URL • a verified email address (generic or direct); mostly I need email addresses of decision makers or someone from the purchasing department • the main phone number plus any direct mobile line you can uncover, to be listed separately • a link to the web form or “Contact Us” page, if one exists Please cross-check every data point against the company’s own site or a trusted directory to ensure accuracy. D...
I need a clean, well-structured CSV that lists email addresses and WhatsApp numbers for wedding videographers across every U.S. state. Please source the data from both Instagram and any websites that the accounts link to. Key requirement: include only videographers (no photographers) who have posted an Instagram Reel or video within the past 30 days so I can be sure they are actively shooting weddings. Alongside each contact, capture the business name and the state or city shown on their profile or site. Deliverables – all in a single CSV: • Email address • WhatsApp number (or mobile clearly marked for WhatsApp) • Business name • Location (state and, when available, city) • Direct link to the Instagram post that proves the account’s rec...
I need an Excel workbook that automatically pulls the latest daily prices for lumber, cement, bricks and other common building materials directly from Jewson and Travis Perkins. The sheet should refresh on schedule (once every 24 hours is fine) without any manual copy-and-paste. Core functions I’m expecting: • A clean, user-friendly dashboard that shows today’s price alongside yesterday’s and a simple % change. • Behind the scenes, a reliable connection—Power Query, a public API, or a well-behaved web-scraper—to both merchants’ sites so the data flow keeps working even if product counts grow. • Simple controls for me to add or remove SKUs or whole categories as my project list changes. • No macros that trigger endless secur...