Selenium is an automation tool for testing web applications that lets teams author functional tests quickly, even without hand-writing test scripts. Once the tests are written, they can be executed across platforms, browsers, and mobile devices to verify how the application behaves end to end. Selenium Experts are hands-on IT professionals who develop and execute suites of automated tests and maintain existing automation frameworks to ensure fast, high-quality delivery of web and mobile applications.
A Selenium Expert can be instrumental in helping a client strengthen their application's quality assurance, saving time and resources otherwise spent on manual testing. An Expert can develop test scripts with the popular open-source automation framework and implement continuous integration for the applications, reducing functional defects and freeing up time for other tasks or development efforts.
Here are some projects our Selenium Experts made real:
Selenium is an invaluable addition to any development team that needs to test an application on multiple platforms simultaneously. By hiring a Selenium Expert on Freelancer.com, clients can sharply reduce the time spent identifying software and device bugs. They will spend less time combing through log files and more time making their applications market-ready. Post your project today and hire a Selenium Expert on Freelancer.com!
From 22,704 reviews, clients rate our Selenium Experts 4.91 out of 5 stars.
I need a robust Python function that can log in to a password-protected site, navigate to a given page, locate the primary table, and convert it into a clean Pandas DataFrame before writing the result to CSV. The same function must work on each URL I provide, and ideally on any future page built on the same template, so please keep the approach modular and scalable. Because the pages sit behind authentication, the username and password can be hard-coded directly in the script; no interactive prompts or external files are necessary this time. Anti-blocking tactics (session persistence, realistic headers, controlled request pacing, etc.) are mandatory—I want to be able to run the notebook repeatedly without getting shut out. Deliverables • A Jupyter notebook (.ipynb) containin...
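In a workflow like this, the Selenium step ends with `driver.page_source`; converting the primary table to CSV can then be handled separately. A minimal stdlib-only sketch of that conversion step (pandas' `read_html` would do the same job in one call given a parser backend; all names here are illustrative, not from the brief):

```python
import csv
from html.parser import HTMLParser

class FirstTableParser(HTMLParser):
    """Collect the rows of the first <table> in an HTML document."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._cell = None
        self._done = False

    def handle_starttag(self, tag, attrs):
        if self._done:
            return
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data.strip())

    def handle_endtag(self, tag):
        if self._done:
            return
        if tag in ("td", "th") and self._cell is not None:
            self._row.append(" ".join(part for part in self._cell if part))
            self._cell = None
        elif tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag == "table":
            self._done = True  # ignore any tables after the primary one

def table_to_csv(page_source, out_path):
    """Extract the primary table from rendered HTML and write it to CSV."""
    parser = FirstTableParser()
    parser.feed(page_source)
    with open(out_path, "w", newline="", encoding="utf-8") as fh:
        csv.writer(fh).writerows(parser.rows)
    return parser.rows
```

In the full script this would be called after login and navigation, e.g. `table_to_csv(driver.page_source, "out.csv")`, keeping the scraping and conversion concerns modular as the brief asks.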
I have to pull many thousands of PDF files from a publicly available but poorly structured online database. The pages are slow, there are no clear download links, and navigation relies on clunky JavaScript forms, so a straightforward “save as” approach will take far too long. You will receive a text file that contains the exact filenames for every document I need. Those filenames appear in the HTML once the record is loaded, so they can be used as reliable anchors for the scrape. The order in which the files arrive does not matter; accuracy and completeness do. I expect an automated approach—Python with Selenium, Playwright, Scrapy, or any comparable tool is fine—as long as it can work around the site’s fragile structure and occasional timeouts. If headles...
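The "occasional timeouts" requirement in briefs like this usually comes down to a retry wrapper around each page load or PDF download. A small sketch, assuming the scraping tool raises `TimeoutError` (in practice Selenium's `TimeoutException` or the HTTP client's timeout error would be caught instead):

```python
import time

def fetch_with_retries(action, attempts=4, base_delay=1.0, sleep=time.sleep):
    """Run `action` (e.g. one record load or PDF download), retrying
    with exponential backoff when the site times out."""
    for attempt in range(attempts):
        try:
            return action()
        except TimeoutError:
            if attempt == attempts - 1:
                raise  # give up after the final attempt
            sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
```

Each filename from the provided text file would then be processed as `fetch_with_retries(lambda: download(name))`, so one slow record never kills the whole batch.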
The goal is to end up with a clean, scalable Selenium WebDriver framework in Java that any engineer on my team can pick up and extend. It must follow the Page Object Model throughout, execute through TestNG, and generate rich reports with either Extent or Allure—whichever you feel gives the clearest insights. Parallel execution is a must, and every test should run reliably on Chrome, Firefox, and Edge. Running the suite in headless mode isn’t required for now. A tidy repository structure, meaningful package naming, and a well-commented README are just as important to me as the code itself, so please factor that documentation time in. All work will be version-controlled in a new GitHub repository that you initiate and populate with an agreed branching model. Deliverables &bu...
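This brief calls for Java, but the Page Object Model it asks for is language-agnostic: each page gets a class holding its locators and actions, and navigation methods return the next page object. A rough Python illustration of the pattern (locator values and page names are hypothetical):

```python
class DashboardPage:
    """Page object for the page reached after a successful login."""
    def __init__(self, driver):
        self.driver = driver

class LoginPage:
    """Page object: locators and actions for one page, no assertions."""
    USERNAME = ("id", "username")                 # locator strategy + value
    PASSWORD = ("id", "password")
    SUBMIT = ("css selector", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
        return DashboardPage(self.driver)  # navigation returns the next page object
```

Tests then chain page objects (`LoginPage(driver).log_in(...)` hands back the dashboard page), so a locator change touches exactly one class — the main reason POM frameworks stay maintainable as a team extends them.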
I’m building an automated, end-to-end pipeline that pulls results from a fast-changing election website every few minutes, cleans and enriches the feed with AI, then pushes out clear charts and graphs that highlight vote counts per party, regional voting trends, overall voter turnout and a concise statistical summary of the outcomes. There is an existing Google Data Studio (Looker) template available. The scope breaks down into three tightly-linked steps: • Data capture – a headless, resilient scraper (Selenium, Playwright or a similar tool) must track roughly 3 500 individual race entries across 17 political parties in six regions, coping smoothly with AJAX calls, pagination and any CAPTCHA or session refreshes. • AI-powered processing – once ingested, th...
Project Description: We are looking for an experienced QA specialist to thoroughly test our car repair platform () and document all findings. Scope of work: • Test full user flows for three roles: – Admin – Workshop Owner – Customer (Gmail login) • Verify all transactional emails (correct subject, content, and links) • Test on: – Mobile (iOS & Android – Chrome & Safari) – Desktop (Windows & macOS – major browsers) Deliverables: • Detailed bug report with: – Steps to reproduce – Screenshots or video – Severity level – Improvement suggestions • List of executed test cases with status (OK / Fail) Mobile experience is the priority, but desktop must also be fully tested. We need a compl...
I am looking for an experienced developer who can implement a solution to send POST API requests directly from an active browser session, maintaining full session integrity and security context. The requirement is to replicate browser network requests exactly as they occur in DevTools — including all dynamic headers, cookies, tokens, and Akamai-related security parameters — without triggering bot protection or security blocks. Core Requirements: API request must originate from the same opened browser session Must reuse: All request headers Session cookies CSRF tokens Authorization tokens Dynamic security tokens Akamai-related parameters Must work with protected endpoints (Akamai / Bot Manager enabled) Should avoid 403, 401, 504, or security validation failures No e...
I have a collection of websites that hold the textual information I need consolidated into a single, well-structured dataset. Rather than copying the material manually, I want the process handled through reliable web-scraping tools so the capture is fast, consistent, and repeatable. Your task is straightforward: • Build (or adapt) a scraper that targets the pages I specify, pulls only the relevant text, and skips ads, navigation links, and other noise. • Deliver the harvested content in a clean CSV or Excel file with clear column headings; if you prefer a database export, let me know and we can adjust. • Include the finished script or notebook so I can rerun the extraction later. Accuracy and formatting matter more to me than sheer speed, so please allow time for basic...
Project Title: WhatsApp Transfer Verification & Auto Receipt System Project Description: I have a WhatsApp group where my sales representatives send customer bank transfer receipts (images or text). I want to build an automated system (Agent/Bot) that does the following: 1. Monitor incoming WhatsApp messages from the sales group. 2. Detect and extract transfer information such as: - Amount - Sender name - Date - Reference number 3. Open the bank web portal (already logged in on the device) and verify whether the transfer has been received. 4. If the transfer is valid: - Send confirmation back to the sales representative on WhatsApp. - Mark the transfer as approved. 5. Automatically open my company accounting system (web-based). 6. Create and submit a Receipt Vouch...
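Step 2 above (detect and extract transfer information) typically reduces to a set of regular expressions over the message text. A sketch under the assumption that receipts follow a labelled `Field: value` layout — the real message formats would dictate the final patterns:

```python
import re

# Hypothetical receipt layout; adjust patterns to the actual messages.
PATTERNS = {
    "amount": re.compile(r"Amount:\s*([\d,]+\.?\d*)"),
    "sender": re.compile(r"Sender:\s*(.+)"),
    "date": re.compile(r"Date:\s*([\d/-]+)"),
    "reference": re.compile(r"Ref(?:erence)?\s*(?:No\.?|number)?:?\s*(\w+)", re.IGNORECASE),
}

def extract_transfer(message):
    """Pull amount/sender/date/reference out of a text receipt; missing
    fields come back as None so the bot can ask the rep to resend."""
    out = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(message)
        out[field] = match.group(1).strip() if match else None
    return out
```

Image receipts would need an OCR pass (e.g. Tesseract) before this extraction step; the downstream bank-portal check then keys off the reference number and amount.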
I need a reliable browser-based bot that logs in to my Amazon A to Z account, continuously monitors upcoming shifts and instantly books the ones that match criteria I will later define (location, start time, hours). The tool must operate inside a standard web browser—headless Chrome or a Selenium/Puppeteer script is fine—as long as it books faster than manual clicking and survives Amazon’s usual page refreshes, captchas, and timers. I will grant the bot full access to my account solely for booking purposes, so handling login securely (2FA, encrypted credentials, cookie reuse) is essential. A simple configuration file or UI where I can tweak preferred warehouses, shift lengths, and daily booking windows would be helpful, but speed and reliability come first. Deliverables...
I’ve created a Google Sheet that will act as a central “lead collector,” and I need it filled with fresh, accurate contact data pulled from publicly available sites. The focus is simple yet crucial: for every company you find, capture the homepage URL and a working email address. (ask for details in the sheet ) A completely ethical approach is non-negotiable—no gated content, no third-party lists, and no automated harvesting that violates site terms. I’m happy for you to use tools you’re comfortable with (Python, Scrapy, BeautifulSoup, Selenium, Google Apps Script, etc.) as long as you respect and rate limits. Email addresses must appear in plain text within the sheet; please avoid hyperlinks or HTML encoding. Deliverables • A Google Sheet ...
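For the "working email address in plain text" requirement, a conventional extraction pattern applied to each page's visible text is usually enough; the regex below is the common practical form, not an RFC-complete one:

```python
import re

# Practical email pattern: local part, @, domain, 2+ letter TLD.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_emails(page_text):
    """Return unique plain-text email addresses in first-seen order,
    ready to paste into the sheet without links or HTML encoding."""
    return list(dict.fromkeys(EMAIL_RE.findall(page_text)))
```

Whether an address actually works still needs a separate check (at minimum an MX lookup on the domain); the regex only guarantees the shape.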
My team has a fully-built web application and I need a fresh set of expert eyes to validate every line of code. The goal is to run a complete test cycle—functional, performance, and security—so we can ship with confidence. You are free to recommend and use whichever framework or tool best fits the job; I am open to industry standards such as Selenium, Playwright, JMeter, OWASP ZAP, or anything you feel will surface issues quickly and reliably. What matters most is deep coverage of the entire source code, clear defect reporting, and actionable suggestions for hardening and optimisation. I will provide the full repository, environment details, and access credentials once we agree on the engagement. In return, I expect: • A concise test plan outlining scope, assumptions, ...
I already have a curated list of LinkedIn profile URLs and need the key networking details moved into a single Google Sheet. For every profile, capture each person’s stated interests and list the five types of people they say they want to meet. Those “meet-up” types should be tagged under the three clear categories I care about—Industry experts, Potential clients and Collaborators—so that I can later filter the sheet by networking goal. Please place one profile per row in the Google Sheet and create separate columns for: • Profile URL • Name (as it appears) • Interests (comma-separated) • Type 1 through Type 5 (verbatim wording) • Category tag (Industry experts / Potential clients / Collaborators) Accuracy of the text you...
I need a small utility that sits in the background, pings a single public web page every second, and alerts me the moment it detects any difference in either the visible text or the images. The page updates unpredictably, so true real-time tracking is essential; a one-minute polling interval is already too slow for my use-case. A lightweight approach that respects the site’s bandwidth and avoids triggering blocks or captchas will be valued. I am open to whatever stack you favour—Python with BeautifulSoup or Selenium, Node.js with Puppeteer, or a compiled solution—so long as it is stable on a Windows environment and easy for me to tweak the target URL later. Notification method is flexible: an email is fine, but if you have a smarter suggestion (desktop toast, webhoo...
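The core of a watcher like this is a tight poll-hash-compare loop; hashing the combined text and image bytes means any visible change flips the digest. A sketch with the fetch and sleep functions injected so the loop stays testable (the real `fetch` would download the page text plus image bytes, and `on_change` would send the email or webhook):

```python
import hashlib

def watch(fetch, on_change, polls, sleep):
    """Poll `fetch()` and call `on_change(content)` whenever the
    content differs from the previous poll."""
    last_digest = None
    for _ in range(polls):
        content = fetch()  # bytes: visible text + image data
        digest = hashlib.sha256(content).hexdigest()
        if last_digest is not None and digest != last_digest:
            on_change(content)
        last_digest = digest
        sleep(1)  # 1-second interval per the requirement
```

Comparing digests rather than full payloads keeps memory flat, and conditional requests (`If-Modified-Since` / `ETag`, where the server supports them) would help respect the site's bandwidth at this polling rate.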
I’m spinning up additional quality coverage for an active web and mobile project and need a seasoned tester who can switch easily between writing code-driven checks and running thoughtful manual sessions. Your day-to-day will include drafting concise test plans, building and maintaining Selenium-, Cypress- or Playwright-based suites, and verifying APIs with Postman or a similar tool. When something breaks, I’ll count on you to log reproducible defects and follow them through to closure, collaborating directly with developers in our Agile board so each sprint ships clean. Because delivery timelines are tight, I’m specifically after someone who already has strong automation foundations, understands functional, regression and end-to-end coverage, and can work independently ...
Title: Contract QA Automation Engineer – Onsite (3–6 Months) Project Overview: We are hiring QA Automation professionals for a short-term contract (3–6 months) to support active development projects. Location (Onsite/WFO): Udaipur | Jaipur | Bangalore | Bhopal Skills Required: Proven experience in automation testing Selenium / Cypress / Playwright or similar tools API testing knowledge Strong problem-solving skills Contract Details: Duration: 3–6 months Full-time engagement Onsite only (no remote option) Who Should Apply: Freelancers or contract professionals available for onsite work Immediate to short notice joiners preferred
I need an end-to-end AI agent that automatically scouts freelancing websites, general job boards, and specialised training platforms for roles or courses that involve artificial-intelligence work. The agent must: • Crawl and scrape the relevant pages in real time or on a frequent schedule. • Apply NLP or other classification techniques to decide whether a posting is truly AI-related, then tag it by sub-domain (e.g. vision, NLP, MLOps, prompt-engineering). • Deliver concise, deduplicated listings to me through an in-app notification feed—no email or SMS required. For the deployment side I’m open to Python (Scrapy, BeautifulSoup, Selenium), Node, or any stack you are comfortable with so long as it is containerised and can run unattended on a small cloud insta...
Hey! I’m looking to hire an experienced developer to build a universal product-detail scraping pipeline that takes a product URL (any website) and returns a complete structured product record. This is not a “simple HTML parse.” Many target sites are React/Next/Vue, load content via XHR/GraphQL, hide details behind tabs/accordions/modals, and lazy-load images/PDFs. The solution needs to reliably extract everything a human can see on the page, plus the underlying data used to render it. What the scraper must do (high level) Given a product URL, the pipeline should: Load the page like a real user (handle cookies/overlays). Capture all content from multiple sources (DOM + network + interactions). Use GPT API strategically to increase accuracy (field mapping, variant ext...
I need end-to-end testing for my web application. The project code is already written, so you will be focusing solely on testing. Requirements: - Thoroughly test on Chrome - Ensure all workflows function as intended - Identify and document any bugs or issues Ideal Skills: - Experience with end-to-end testing tools (e.g., Selenium, Cypress) - Strong attention to detail - Familiarity with web applications and browser testing Please provide a brief overview of your testing experience and any relevant tools you plan to use.
I need a reliable solution that can pull data from LinkedIn and insert it straight into a database I specify. The core requirement is the automated transfer—once the tool finishes scraping, every captured field should already be sitting in the database ready for queries and reporting, no manual copy-paste. You’ll advise me on the best approach to authenticate, respect rate limits, and minimise the risk of blocks while still collecting the typical profile-level details (name, headline, company, location, experience, education, skills and anything else you can legally obtain). I will confirm the final field list before you begin. Key objectives • Build or configure a scraper / API wrapper that logs in, navigates to each target profile and captures the agreed-upon fields ...
I want to take my current, intermediate knowledge of Dot Net C# and turn it into solid, real-world expertise in automation. The goal is to design and implement a clean, maintainable test framework focused on integration testing, then practise troubleshooting the kinds of issues that appear in day-to-day work. Here’s what I need from you: • Live, screen-sharing sessions where we build the framework together, explaining each architectural choice and its trade-offs. • Practical guidance for structuring test projects, organising test data, and wiring the framework into a CI pipeline. • Walk-throughs of flaky-test triage, mocking external dependencies, and debugging failures that only show up in complex environments. • Short take-home exercises or sample reposi...
Job Description: QA Engineer – Cybersecurity Product (Reverse Proxy Server) Position Title: QA Engineer Location: Mumbai, India (Hybrid/Remote options available) Domain: Cybersecurity – reverse proxy server development and enterprise data security for cloud services Framework: Agile/Scrum Role Overview We are seeking a detail-oriented and security-aware QA Engineer to ensure the quality, reliability, and resilience of a cloud-native reverse proxy server. You will be responsible for designing and executing test plans, automating test cases, and validating security features in a fast-paced Agile environment. Your work will directly contribute to the robustness of a critical cybersecurity product. Key Responsibilities • Design, develop, and execute manual...
Job Title: Playwright Automation Engineer (Java) – Setup & Framework Implementation Job Description: We are looking for an experienced Playwright Automation Engineer (Java) to set up and implement an automation testing framework for our product. The ideal candidate should have strong hands-on experience in Playwright with Java and be capable of building a scalable, maintainable automation framework from scratch. Responsibilities: set up a Playwright automation framework using Java; implement a robust test automation architecture with Maven/Gradle and CI/CD pipelines; apply the Page Object Model (POM) or a similar design pattern; integrate reporting (Allure/Extent Reports or similar); write reusable and maintainable test scripts; provide documentation for setup and usage. Required Skills: experience in Playw...
Project Description: Find school districts and charter schools that use a specific vendor for a large list of domains. I am seeking an experienced web scraping specialist to improve our Python script to analyze a large list of school district websites (approximately 4000+ URLs) and identify the ones that show a specific link on any page found in their sitemap. The primary method of identification must be to scan the websites for specific, known vendor links. Deliverables Required 1. A Production-Ready Python Script (.py file): The script must be commented, easily configurable, and capable of reading the provided CSV list, performing the scan, and generating the output CSV. It should handle timeouts gracefully and include basic error handling. 2. The Final Results (CSV/Excel File): A c...
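The per-page check in a scan like this reduces to "does any anchor on the page point at the vendor's domain?". A stdlib sketch with a placeholder vendor domain (the real known vendor links would be supplied by the client):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather every anchor href on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_uses_vendor(html, vendor_domain="vendor.example.com"):
    """True if any link on the page points at the vendor's domain."""
    collector = LinkCollector()
    collector.feed(html)
    return any(vendor_domain in href for href in collector.hrefs)
```

The production script would wrap this in the sitemap walk (fetch each sitemap URL, run the check, short-circuit on the first hit per district) and write one row per URL to the output CSV.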
I need a Selenium-based solution that runs reliably on Windows and opens Google Chrome to simulate human visits to LinkedIn (and occasionally other) profile URLs listed in a Google Sheet. For each URL the program should: • Pull the next unused link from the sheet • Load the page in Chrome, wait a random time between 20 seconds and 3 minutes • Apply truly randomized scrolling patterns while the profile is open so behaviour looks organic • Fire a webhook the moment the visit completes, passing back any ID or payload I define so our CRM reflects the touch instantly Configuration items such as Google Sheet ID, webhook endpoint, minimum/maximum dwell time, and daily visit caps should live in a simple file I can edit without touching code. A short README on installi...
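The randomized dwell-and-scroll behaviour can be planned up front as a list of (pause, scroll offset) steps, which the Selenium loop then replays. A sketch with assumed pause and pixel ranges (only the 20 s–3 min dwell window comes from the brief):

```python
import random

def visit_plan(min_dwell=20, max_dwell=180, min_step=120, max_step=900):
    """Build one randomized visit: a total dwell time between 20 s and
    3 min, broken into (pause_seconds, scroll_pixels) steps so the
    scrolling looks organic rather than metronomic."""
    dwell = random.uniform(min_dwell, max_dwell)
    steps, elapsed = [], 0.0
    while elapsed < dwell:
        pause = round(random.uniform(1.5, 6.0), 1)   # seconds between scrolls
        offset = random.randint(min_step, max_step)  # pixels per scroll
        steps.append((pause, offset))
        elapsed += pause
    return dwell, steps
```

Each step would become a `driver.execute_script("window.scrollBy(0, arguments[0])", offset)` call followed by `time.sleep(pause)`, with the webhook fired once the plan is exhausted and the sheet row marked as used.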
We are building a full internal marketplace analytics web system, not just a reporting script. The system is designed to combine competitive intelligence with internal sales and stock analytics in a single interface. Functional Requirements The system must provide the following capabilities: 1. Product and SKU structure - Each product must be split into individual SKUs based on flavor and volume. - All analytics and reports are built at the SKU level. 2. Our product analytics (primary focus) - Current stock levels (total and per SKU). - Sales volume for selected periods (daily / weekly / monthly). - Reorder recommendations based on stock thresholds and sales dynamics. - Revenue calculations per product and per SKU with period filtering. 3. Competitive analytics - Automated collection o...
I need a reliable script that logs into Placementindia and pushes my vacancy details without any manual input. The tool should store template data, fill every required field, submit the form, and then confirm success so I can see at a glance what went live. Inside a small, browser-based dashboard I want two key abilities: • schedule automation so I can decide the exact days and times each role goes out • one-click “auto post” that instantly publishes a job when I hit the button The posting frequency isn’t fixed; some days I may blast several openings, other weeks none at all, so the scheduler has to respect whatever plan I set. Use whichever method makes the process most stable—headless browser automation (Puppeteer, Playwright, Selenium) or direc...
Complete Lottery Prediction and Betting Automation System (Focused on Loterías y Apuestas del Estado - Spain) 2. System Features 2.1. Historical Data Collection and Update The system must automatically download complete historical results (drawn numbers, draw dates, prize breakdowns by category, accumulated jackpots) from the first draw of each lottery, directly from or reliable associated sources. Specific sources: Euromillones: (since Feb 13, 2004) La Primitiva: (since Oct 17, 1985 – modern version) El Gordo de la Primitiva: (since Oct 31, 1993) Updates automatic at exactly 00:02 the day after each draw, using ethical scraping (BeautifulSoup/Scrapy) with proper user-agent headers to mimic human behavior. Store data in PostgreSQL (structured) or MongoDB (flex...
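The "automatic update at exactly 00:02 the day after each draw" is simplest as a sleep-until computation rather than an external cron dependency. A sketch:

```python
import datetime

def seconds_until_update(now, hour=0, minute=2):
    """Seconds from `now` until the next 00:02 update slot."""
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # slot already passed today
    return (target - now).total_seconds()
```

The scraper loop would then `time.sleep(seconds_until_update(datetime.datetime.now()))` before each pull; for the Spanish draws, computing `now` in the Europe/Madrid timezone would keep the slot aligned with the publication time.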
I need a small, Windows-friendly Python script that will open a real browser with Selenium and wipe large batches of content from my X (Twitter), Facebook, and Instagram accounts. Because my X account sits on the free API tier I keep running into 403 errors, so this project must rely solely on browser automation—no official APIs or paid third-party tools. Here’s what I’m after: the script launches from the command prompt, asks for (or reads from a .env) my login credentials, signs in, and then iterates through all visible posts, tweets, and reels, deleting each one until none remain or until it hits an optional stop condition such as a date or a post count I can set. A simple console printout like “Deleted tweet #42” is enough for logging; I don’t need ...
I'm trying to run the attached Jupyter (.ipynb) script to get info from a website, but I can't work out why it doesn't work. I need this script fixed, plus pagination added to fetch around 2,400 records from YellowPages. I only use Jupyter.
I’m looking for a data engineer who can take full ownership of a daily web-scraping workflow aimed at ongoing market research. The job centers on extracting selected data points from public web pages, transforming them into a clean, structured format, and making them available for analysis every 24 hours. Here’s what I need you to handle from end to end: • Source acquisition – fetch HTML from the URLs I provide, even when content is hidden behind JavaScript (a headless browser such as Playwright or Selenium is fine). • Parsing & cleansing – pull the specific fields I’ll list (product name, price, SKU, availability, and a time-stamp), remove duplicates, and standardize values. • Storage & delivery – load the daily output into ...
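The parsing-and-cleansing stage described above can be sketched as one normalization pass — dedupe on SKU, coerce prices to numbers, and stamp each record. The field names come from the brief; the `$1,299.00`-style price format is an assumption about the source pages:

```python
import datetime

def cleanse(raw_rows, now=None):
    """Normalize scraped rows: strip whitespace, coerce price to float,
    dedupe on SKU (first occurrence wins), stamp with the run time."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    seen, clean = set(), []
    for row in raw_rows:
        sku = row["sku"].strip().upper()
        if sku in seen:
            continue  # duplicate listing of the same SKU
        seen.add(sku)
        price = float(str(row["price"]).replace("$", "").replace(",", ""))
        clean.append({
            "product_name": row["product_name"].strip(),
            "price": price,
            "sku": sku,
            "availability": row["availability"].strip().lower(),
            "scraped_at": now.isoformat(),  # the required time-stamp field
        })
    return clean
```

The daily job then becomes fetch (headless browser) → `cleanse` → load, with the storage target (database table, CSV drop, or API push) decided in the delivery step.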
I want to build an exhaustive directory of every water-sports base in metropolitan France and the DOM-TOM offering jet ski, towed-buoy rides, flyboard, water skiing, wakeboard, parasailing, or boat rental. To get there, I need an automated workflow relying exclusively on Google Maps – that is the chosen source – capable of collecting, deduplicating, and then cleaning the data before formatting it into a CSV that can be imported directly into the Wix CMS. Coordinates must be provided in decimal degrees. Expected deliverables • A reusable script (Python + Selenium, Scrapy, or equivalent) that queries Google Maps, handles the r...