Selenium jobs
This is a 10-30 USD task; if you are not willing to accept this budget, please don't bid. I want a self-contained engine that reliably boosts my Twitter presence through a controlled mix of rea...uptick in all three metrics—follower count, engagement rate, and profile visits—visible inside native Twitter Analytics. • A dashboard or log file proving every action your code performed, with timestamps and success/fail notes. • Clear setup notes so I can redeploy the system on another VPS without hand-holding. If you have prior work with Twitter API v2, automation frameworks like Puppeteer or Selenium, or experience farming aged accounts safely, let me see it. Security, human-style timing, and the ability to fine-tune rules on the fly are far more imp...
I'm seeking an experienced Python developer to help with data analysis and automate data scraping tasks. Key Requirements: - Develop Python scripts for data analysis. - Automate data scraping from websites. - Deliver clean, structured datasets for analysis. Ideal Skills: - Proficiency in Python, especially for data manipulation. - Experience with libraries like BeautifulSoup, Scrapy, or Selenium. - Strong background in data analysis, preferably with Pandas or NumPy. - Familiarity with data storage solutions (e.g., SQL, NoSQL). Looking for someone who can write efficient, reliable scripts and has a keen eye for detail.
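To sketch the kind of extraction-plus-cleaning work described above without assuming third-party packages, here is a minimal stdlib-only parser that pulls text out of hypothetical `<span class="price">` elements; the markup and class name are placeholders, and a real job would likely use BeautifulSoup or Scrapy as the posting suggests.

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect text inside <span class="price"> tags (hypothetical markup)."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs
        if tag == "span" and ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

html = '<div><span class="price">19.99</span><span class="price">5.50</span></div>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # ['19.99', '5.50']
```

The same structure (callbacks that toggle state on enter/exit of a target element) carries over directly to lxml or BeautifulSoup selectors once those are available.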
... and any email address the site provides. Because the site often hides data behind pagination or pop-ups, I expect a robust scraping approach that can handle dynamic content (Selenium, Playwright, or similar) as well as polite rate-limiting so we stay within acceptable request volumes. Deduplication is essential—if the same company appears under multiple categories or listings, merge the records instead of inflating the count. Deliverables • One clean .xlsx file containing all requested fields, ready for filtering and analysis • A brief text log explaining the scraping workflow, libraries used (e.g., Python–Selenium/BeautifulSoup, Node–Puppeteer, etc.), and any known data gaps • Confirmation that the crawl completed for every U.S. state and terri...
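The merge-instead-of-inflate deduplication requested above can be sketched as follows; the field names (`name`, `email`, `categories`) are assumptions standing in for whatever columns the real crawl produces, and the normalization key (lowercased, trimmed name) is one reasonable choice among several.

```python
def merge_company_records(records):
    """Merge duplicate company rows (same normalized name) instead of
    counting them twice; later rows fill in fields the first left blank
    and category lists are unioned."""
    merged = {}
    for rec in records:
        key = rec["name"].strip().lower()
        if key not in merged:
            merged[key] = dict(rec)
            merged[key]["categories"] = set(rec.get("categories", []))
        else:
            kept = merged[key]
            kept["categories"].update(rec.get("categories", []))
            for field, value in rec.items():
                # fill blanks from the duplicate, never overwrite data
                if field != "categories" and not kept.get(field):
                    kept[field] = value
    return list(merged.values())

rows = [
    {"name": "Acme Co",  "email": "",                  "categories": ["plumbing"]},
    {"name": "ACME CO ", "email": "info@acme.example", "categories": ["heating"]},
]
result = merge_company_records(rows)
print(len(result))                       # 1
print(sorted(result[0]["categories"]))   # ['heating', 'plumbing']
```

A production version would likely key on a fuzzier signature (domain of the email, normalized address) since company names vary across listings.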
...warning box under “About this item” (must be extracted in full with formatting) * Activation guide (especially for software products) 3. Data Alignment * Must correctly match scraped data with API data (using product ID and slug) * No duplicates or mismatched records --- Technical Requirements: * Python (preferred with FastAPI or similar) * Experience with headless browsers (Playwright / Selenium / Puppeteer) * Ability to handle dynamic content and anti-bot protections * Experience with scalable scraping (parallel workers, batching, queues) * Strong error handling, retry logic, and logging --- Infrastructure: * Scraper will run on a dedicated VPS (8 CPU / 32GB RAM) * Must support parallel execution * Must not affect the main Laravel application --- Milestone...
I'm seeking an experienced developer to improve my existing front-end tests using Java, Selenium, and Cucumber. Key tasks include: - Analyzing current test suite - Identifying and resolving inefficiencies - Enhancing test coverage and reliability Ideal skills and experience: - Proficiency in Java, Selenium, and Cucumber - Strong background in front-end testing - Experience with test optimization and debugging Looking for someone detail-oriented, with a strong problem-solving mindset.
...brand-new WhatsApp group that I own and silently populate it with every member of another group where I am only a participant. What I lack is the workflow or tool that can take those numbers and place each person into my new group automatically—no manual invites, no forwarding links one by one. Ideally the solution relies on a small script, macro, or API-based workaround (WhatsApp Business API, Selenium, or another proven method) that: • Creates or lets me create a fresh group where I’m the sole admin • Adds the full contact list in one action, or as close to one action as WhatsApp will allow • Provides clear setup instructions so I can repeat the process later without developer assistance Please advise on any technical, policy, or rate-limit con...
...triggers follow-up actions with zero human touch. Because of the pace and quality I need, I’m limiting this contract to established teams that can show at least 1,000 completed-project reviews on Freelancer. If your profile meets that bar and you can stay responsive during UK business hours, we can start right away. Deliverables • Day 1: confirm architecture and tool stack (Python, GPT-4/LLM API, Selenium or Playwright, Zapier/Make, etc.) and supply a working skeleton • Day 2: build out every feature, integrate external APIs, and begin user acceptance testing • Day 3: final polish, live deployment on my server, and a short video walk-through Acceptance criteria • The automation runs the agreed workflow flawlessly on my infrastructure &bull...
...descriptions, prices, SKU codes, category paths and any other on-page text detail), then supply the data back to me in a spreadsheet-ready format such as CSV or XLSX. I do not need images—only the textual content. Any solution you craft must respect the site’s terms and avoid rate-limit issues so nothing gets blocked. A Python script using requests/BeautifulSoup, Scrapy, or a headless browser (Selenium, Playwright) is perfectly fine as long as your code is clearly commented and reusable; if you prefer another language, that works too provided the result meets the same standards. Deliverables: • Complete dataset of all product listings in CSV or XLSX • The scraping script with brief run instructions • One short report summarising page coverage and any s...
I need a clean, one-time extraction of every registered agent listed on the Kerala RERA portal. The scope is limited to the publicly displayed agent name plus all available contact details—phone numbers, email addresses, and office addresses. No licence-status fields or property listings are required. Any stack is fine—Python (BeautifulSoup, Scrapy, Selenium), Node.js, or a headless browser workflow—as long as it handles pagination, hidden rows, or JavaScript-rendered tables and respects polite scraping practices. Deliverables • An Excel workbook (.xlsx) containing one row per agent and clearly labelled columns for Name, Phone, Email, and Address • Data fully deduplicated, UTF-8 compliant, and free of blank placeholders • A short note on th...
...appears on bridebook.co.uk. We need the suppliers of each category from the whole UK, so it will be roughly 14,000 records. What we actually need in the spreadsheet is very simple: the suppliers' names and emails. Nothing more. Please deliver one Excel file with two clearly labelled columns (Name | Email). We have no preference on how you collect the data—manual collection, Python/BeautifulSoup, Selenium, or any other web-scraping approach is fine—as long as the final sheet is accurate and complete. Don't just place your bid blindly; visit the link and explore the Suppliers Category section—there are many categories, like Florist, Music, Videographers, etc. Fixed Budget: $50 We have certain strict rules to abide by so those who follow those strictly can only be hi...
...clean, comma-separated CSV file ready for downstream analysis. Key needs • Works against search terms, locations, or URLs I pass in at runtime. • Handles pagination, scrolling, and any “Show More” expansions so the entire description is captured, not just the preview. • Respects reasonable request rates or uses rotating headers/proxies so it avoids being blocked. • Runs headless (Python + Selenium, Playwright, or similar libraries are fine) and is easy to re-run. Deliverables 1. Source code with brief setup instructions. 2. Example CSV created from a small test run so I can confirm the format. 3. Short read-me explaining how to change search parameters and any environment variables (e.g., proxy list, delay settings). I can test in a...
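The "reasonable request rates" and "rotating headers" requirements above amount to two small helpers; this stdlib-only sketch uses placeholder User-Agent strings and placeholder timings, both of which would need tuning for a real target.

```python
import itertools
import random
import time

USER_AGENTS = [  # sample strings; extend with the real set you rotate
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]
_ua_cycle = itertools.cycle(USER_AGENTS)

def next_request_headers():
    """Rotate the User-Agent header on every request."""
    return {"User-Agent": next(_ua_cycle)}

def polite_delay(base=2.0, jitter=1.0):
    """Sleep base + U(0, jitter) seconds so request timing looks less
    mechanical; returns the pause so it can be logged."""
    pause = base + random.uniform(0.0, jitter)
    time.sleep(pause)
    return pause
```

In the scraper's main loop, each request would call `next_request_headers()` and then `polite_delay()` before the next page fetch; proxy rotation would follow the same cycle pattern over a proxy list.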
...travel consultancy looking for a highly skilled Python Developer to design and maintain high-performance web automation systems for high-traffic platforms (VFS Global, iDATA, BLS, and Kosmos). The goal is to optimize data entry and appointment scheduling workflows to ensure maximum efficiency for our clients. Technical Requirements: • Advanced Web Automation: Proficiency in Python with Playwright, Selenium, or Puppeteer. • Security Navigation: Deep experience in handling sophisticated web security layers (Cloudflare, DataDome, and advanced WAFs) using ethical and compliant methods. • Request Management: Expert knowledge of HTTP headers, session persistence, and TLS fingerprinting to mimic natural user behavior. • Data Integrity: Integration with 3rd party veri...
...fixing the errors, ensure that all data presented in the website is accurate and correctly reflects the information from the respective travel agencies. Collaborate on requirements: Discuss specific requirements and understand the nature of the data discrepancies before implementing fixes. Key Skills Required: Web Scraping: Experience in web scraping frameworks (e.g., BeautifulSoup, Scrapy, Selenium). Data Analysis: Ability to work with data structures, data cleaning, and presentation. Familiarity with Pandas or similar tools is a plus. Backend Development: Proficiency in backend technologies (e.g., Python, Node.js) to handle web scraping and data processing. Frontend Development: Experience with frontend technologies (e.g., React, HTML/CSS, JavaScript) to present data in a use...
...you have recommendations on sources we are open to them. Stealth and security are key so the tool can run quickly without accounts being banned too fast. Vehicle search profiles should be easy for me to tweak without touching the core logic, preferably via a simple config file. The solution can run headless on a VPS or cloud function; feel free to choose the stack you’re most comfortable with (Python-Selenium, Node-Puppeteer, Playwright, etc.) as long as it stays stable against FB's layout changes. Acceptance criteria • Near-real-time Discord alerts (within 5-10 min) containing the full listing URL, key specs, time since posted, and posted price • Adjustable filters/search profiles for make/model, year, mileage, price, fuel, etc. • Error handling tha...
...BeautifulSoup, Selenium, Scrapy, or other efficient scraping frameworks. - The freelancer must handle pagination, dynamic loading, and scrolling where required. - Duplicate entries should be avoided. - The final dataset should be clean and well-structured. Output Format: The final deliverable should be provided in one of the following formats: - CSV - Excel - JSON The dataset must contain approximately 200,000 unique entries. Timeline: The project should be completed within 2 days from the start date. Additional Notes: - Data accuracy and proper formatting are very important. - The freelancer should ensure that the scraping process is efficient and capable of handling large-scale data extraction. Skills Required: - Web Scraping - Python - Selenium...
...with 3rd party captcha solvers (e.g., 2Captcha, DeathByCaptcha, or Anti-Gate). • Proxy Support: Support for rotating residential proxies to avoid IP blacklisting. • Notification System: Instant alerts via Telegram or WhatsApp when a slot is found or a booking is successful. • Multi-Account Support: Ability to manage multiple applicant profiles simultaneously. Technical Stack Preferred: • Python (Selenium, Playwright, or Puppeteer) • Experience with HTTP headers and request manipulation to mimic human behavior. • Knowledge of bypassing "Wait Rooms" or "Queues". Important Note for Applicants: Please do NOT send generic "mexc bot" or "crypto bot" templates. We are looking for someone who has specifically worked on...
valid email addresses harvested from a list I'll give you. Objective: outreach to sell books. You may use the stack you are most comfortable with (Python + BeautifulSoup, Scrapy, Selenium, Node.js + Puppeteer, or similar) as long as the final data is clean and deduplicated. Deliverables: • A CSV file containing each email address alongside the exact page URL where it was found • A brief note on the toolchain or script used (for reproducibility). Accuracy matters more than sheer volume: do not give me a bloated list full of bounces. I only need properly clean, verified, and contactable info.
I need a Python program to automate KYC information updates on the Oracle CCB website. The fields to be updated include: - Identity Information - Address Details - Contact Information The data for the KYC updates will be provided via manual entry. Ideal Skills and Experience: - Proficient in Python - Experience with web automation (e.g., using libraries like Selenium) - Familiarity with Oracle CCB - Strong understanding of KYC processes and data handling - Ability to create user-friendly input interfaces for manual data entry Please ensure the solution is secure and reliable.
...Select city, exam and date Set number of parallel sessions --- Skill Requirements Strong experience with Playwright / Puppeteer (CDP) Deep understanding of cookies, redirects and session management Understanding of Wicket-based stateful applications Experience handling anti-bot and timing-based systems Backend + browser automation experience --- Important Note This is not a basic Selenium clicker job. Only developers experienced with strict session-based booking or ticketing systems should apply. Please send: 1. Past relevant work samples 2. Your approach to reconstructing redirect-based sessions 3. Estimated timeline and pricing --- Contact Begin your message with: “I understand the Wicket redirect chain.” I will then discuss the project d...
I have around 7,000 Aliexpress products that I need fully harvested for content-creation purposes. From each listing I only require the official product ...asset to its product URL or SKU will make downstream editing much easier for me. Deliverables: • Folder structure or archive segmented by product (one folder per listing). • Inside each folder: all JPEG images and any MP4 videos found. • A simple CSV mapping product URL → asset file names so I can trace anything quickly. I’m happy for you to choose the most efficient tooling—Python with Selenium, BeautifulSoup, or similar headless solutions are fine—as long as the final package is complete and safely transferable via cloud link. Let me know your estimated turnaround time and any clarifi...
I currently have a Python-based data scraper built with Selenium/Requests, but it runs too slowly and crashes because of high memory usage while processing large datasets. It is not scalable. Requirements: Optimize the memory footprint for 5,000+ records. Refactor sync loops to Asyncio/Aiohttp. Implement a robust error-handling and retry mechanism. Need an expert who can handle high-performance Python code. No beginners please.
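The retry mechanism asked for above can be sketched with stdlib asyncio alone; the `fetch` callable is injected so the backoff logic is testable without a network, whereas the real refactor would pass in an aiohttp-based coroutine.

```python
import asyncio

async def fetch_with_retry(fetch, url, retries=3, backoff=0.1):
    """Retry an async fetch with exponential backoff. `fetch` would wrap
    an aiohttp request in the real scraper; here it is injected."""
    delay = backoff
    for attempt in range(1, retries + 1):
        try:
            return await fetch(url)
        except Exception:
            if attempt == retries:
                raise  # out of retries: surface the error, don't swallow it
            await asyncio.sleep(delay)
            delay *= 2

calls = {"n": 0}

async def flaky_fetch(url):
    """Simulated endpoint that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return f"body of {url}"

result = asyncio.run(fetch_with_retry(flaky_fetch, "https://example.com", backoff=0.01))
print(result)       # body of https://example.com
print(calls["n"])   # 3
```

For the memory side of the brief, the usual companions are bounding concurrency with an `asyncio.Semaphore` and streaming rows to disk instead of accumulating them in a list.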
I am looking for a developer who can write a script that fills in and submits a website form effectively instantly. It is a normal details form used to book a playground, and the first person to submit gets the booking. Others are already automating their submissions; we want to submit before them. I can pay whatever amount is suitable for this work.
...service promoted, sponsorship tier (Gold, Silver, etc.), logo URL, website link, and any contact email that is publicly visible. Because many conference sites load content dynamically, the scraper must be able to render JavaScript when needed; a headless Selenium setup or an equivalent solution in Python 3 is fine. Please structure the code so I can add or remove conference URLs easily, and store all results in a clean CSV (UTF-8) or Excel file. Deliverables • A well-commented Python script (requests/BeautifulSoup for static pages, Selenium for dynamic ones) • A short README describing setup and usage • One sample output file created from at least one conference URL I provide for testing I’ll review the job as complete once the script rel...
I need a way to secure ultra-short handles—only two- or three-character combinations. I’m not interested in longer names or other platforms; the entire focus is Twitter. W...the tool should: • Produce the full list of 2L and 3L possibilities. • Check availability in real time and mark claimed versus open usernames. • Let me save or export the open names (CSV or similar). • Run with minimal setup—Python, Node.js, or any language you prefer—as long as clear instructions are included. I’ll consider a custom script that taps Twitter’s API, browser automation (Selenium, Playwright), or another creative method as long as it meets the goals above. Please outline your approach, estimated turnaround, and any ongoing maint...
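Generating the "full list of 2L and 3L possibilities" requested above is a one-liner over a character alphabet; note the alphabet here (lowercase letters, digits, underscore) is an assumption about which characters Twitter permits in handles and should be verified before a real run, as should the availability-check step, which is deliberately left out of this sketch.

```python
import itertools
import string

# Assumed handle alphabet: letters, digits, underscore (verify against
# Twitter's actual username rules before relying on the counts).
ALPHABET = string.ascii_lowercase + string.digits + "_"

def all_handles(length):
    """Enumerate every possible handle of the given length."""
    return ["".join(chars) for chars in itertools.product(ALPHABET, repeat=length)]

two_char = all_handles(2)
three_char = all_handles(3)
print(len(two_char))    # 1369  (37^2)
print(len(three_char))  # 50653 (37^3)
```

The availability check would then iterate this list against the API or a browser-automation probe, marking each name claimed or open and writing the open ones to CSV.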
...full property details plus up-to-date owner contact information. will serve as a second source so the final list contains additional prospects pulled from that platform (matching or related to the same asset classes). All data must be scraped, deduped, and formatted in a single spreadsheet so I can sort, filter, and launch campaigns immediately. Use whatever stack you prefer—Python, Selenium, BeautifulSoup, Apify, or similar—but the workflow has to respect each site’s TOS and deliver reliable results. Deliverables • CSV/Excel file with Propstream property details and owner contacts for commercial buildings and apartments • Separate tab or merged columns with the corresponding leads • Basic data hygiene: no duplicates, valid phone/em...
...figures, in-car technology features, seating layouts and any other attributes exposed on the page. The end goal is a clean, analysis-ready Excel workbook that lets me run market-wide comparisons, so consistency is critical: headings must be standardised, units normalised and categorical values written the same way across the entire sheet. I am happy for you to use Python, Scrapy, BeautifulSoup, Selenium, AI-assisted extraction—whatever combination you trust—to pull the information, as long as the final file is accurate and complete. Data standardisation is essential. To keep things efficient I’d like a small sample delivered early so we can confirm structure before you harvest the full set. Once the sample is approved, scrape the remaining URLs, run your d...
...automatically, including date pickers and location selectors. • Rotate user-agents / proxies or apply any other anti-bot tactics necessary to stay undetected. • Capture and log errors so a failed request never silently drops a row. • Be easy for me to rerun on demand—command-line or small web UI is fine, as long as setup is straightforward. I’m comfortable with Python (BeautifulSoup, Selenium, Playwright, Scrapy, etc.) or another language you can justify, as long as you hand over all source code, dependency files, and a quick start README. A brief demo video or screenshots validating the scraper against at least one aggregator and one direct company site will serve as the final acceptance test. When you bid, point me to a project where you&r...
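The "a failed request never silently drops a row" requirement above suggests returning failures alongside results rather than logging and forgetting; this sketch injects the fetch function (here a stub) so the pattern is visible without any real site.

```python
def scrape_rows(fetch, urls):
    """Run fetch over every URL, recording failures instead of dropping
    them. Returns (rows, errors) so every input URL leaves a trace."""
    rows, errors = [], []
    for url in urls:
        try:
            rows.append(fetch(url))
        except Exception as exc:
            errors.append({"url": url, "error": repr(exc)})
    return rows, errors

def demo_fetch(url):
    """Stub fetcher standing in for a real requests/Selenium call."""
    if "bad" in url:
        raise ValueError("page did not load")
    return {"url": url, "title": "ok"}

rows, errors = scrape_rows(demo_fetch, ["https://a.example", "https://bad.example"])
print(len(rows), len(errors))  # 1 1
```

Writing `errors` to its own CSV at the end of each run gives the re-runnable audit trail the posting asks for: rerun only the failed URLs on demand.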
...• Navigate through all product listings on the site, follow pagination, and fetch fields such as product name, model/SKU, description, category, list price, discount price (if present), currency, and product URL. • Store the results in both CSV and JSON so I can easily import them into our internal tools. Technical expectations • Python 3.x with either Scrapy or BeautifulSoup/Requests; Selenium is acceptable only if the target pages rely heavily on JavaScript. • Respect and add polite throttling plus user-agent rotation to avoid blocking. • Code should be modular and ready for me to change the target domain or output path by editing a single config file. • Include a short README that explains how to install dependencies and run the scra...
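The single-config-file requirement above is straightforward to honor with a JSON file merged over defaults; the keys below (`target_domain`, `output_path`, `delay_seconds`) and the placeholder domain are illustrative choices, not names from the posting.

```python
import json
import os
import tempfile

DEFAULT_CONFIG = {
    "target_domain": "https://shop.example.com",  # placeholder domain
    "output_path": "products.csv",
    "delay_seconds": 2.0,
}

def load_config(path):
    """Read scraper settings from one JSON file so the target domain or
    output path can change without touching code."""
    with open(path, "r", encoding="utf-8") as fh:
        cfg = dict(DEFAULT_CONFIG)
        cfg.update(json.load(fh))
        return cfg

# demo: a config file that overrides only the output path
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as fh:
    json.dump({"output_path": "run1.csv"}, fh)
    cfg_path = fh.name
cfg = load_config(cfg_path)
os.unlink(cfg_path)
print(cfg["output_path"])    # run1.csv
print(cfg["delay_seconds"])  # 2.0
```

Keeping defaults in code and overrides in the file means the README only has to document the handful of keys a non-developer would ever edit.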
... • If management has replied, both the reply text and the reply date At least half of the collected reviews must include a management response. The final deliverable is a clean, well-structured CSV that combines all fields. No language filtering is needed—capture reviews in any language exactly as they appear, keeping accents and special characters intact. You can use Python with Scrapy, Selenium, BeautifulSoup or whichever stack you prefer, provided you respect Tripadvisor’s loading patterns, pagination and dynamic elements so nothing is missed. Please maintain strict accuracy: no duplicated rows, correct hotel-to-review matching, and consistent field order. When you reply, attach or link to one or two brief samples from similar scraping projects (CSV sn...
...and regression scripts with clear pass / fail logs • A prioritized defect list (severity, steps to reproduce, environment, screenshots or screen-recordings) entered in my existing tracker or yours, then exported to PDF • A security report detailing vulnerabilities, evidence, and remediation advice • A final summary that highlights UX friction points and usability wins / gaps Automation (Selenium, Cypress) is welcome for repetitive flows, while API checks through Postman or similar tools will strengthen coverage. Once fixes land, please re-test and mark issues as verified. Timeline is flexible within reason, but I’d like an initial test plan draft within three days so we can align on scope. All documentation must be in English and structured so an inte...
...permissions, plus every notification or alert the system can generate. I will provide a detailed feature list once we start; for now assume you must touch every screen from the dashboard through final log-out. I expect you to explore the UI like a real user, probe edge-cases, document anything that breaks or confuses, and retest once fixes are applied. Feel free to use whichever tools you like—Selenium for scripted paths, Postman for API checks, screen-capture utilities for evidence—as long as the final report is crystal-clear. Deliverables • A structured bug report (steps to reproduce, environment, expected vs. actual, severity). • Screenshots or short clips for every issue. • A short usability note when a flow technically works but could frust...
...nationality, position, current club, height/weight (where listed), and any notable career highlights that appear on the player’s own site. Because these pages vary in structure, the code should be resilient: graceful error handling, user-agent rotation, and clear selectors or XPath rules that are easy for me to extend later. I’m comfortable running Python, so libraries like Requests, BeautifulSoup, Selenium, or Scrapy are welcome; please choose the stack that gives the best balance of speed and maintainability. Deliverable • A runnable script (with a brief README) • The resulting CSV generated from a short test run (5–10 players is fine for proof) • Comments in the code explaining each major step Acceptance • Script executes from the c...
...Entrance, Plate, Drink, Dessert, Coffee) but it can be only 2 stages out of 5 (example: Plate+Drink) Please also tag each record with its district and municipality so the file can be filtered regionally. Deliverables 1. A single CSV or Excel file containing one row per restaurant with all fields above clearly labelled. 2. The script or notebook you use (Python with BeautifulSoup, Scrapy, Selenium, or any other tool you prefer) so I can rerun the scrape later. Acceptance criteria • No duplicate restaurants. • All mandatory columns populated where data exists on Google. • At least 95 % of entries correctly classified for fixed-price menu status and pricing. Keep the approach respectful of Google’s terms of service. If you need to enrich the d...
...(Customer): • Android (Merchants): • iOS (Customer): Scope of work I need reliable automation that covers UI changes, UX-flow integrity and server-side performance. You are free to choose the stack—popular options such as Selenium + Appium for functional flows and JMeter, k6 or Locust for load tests are welcome—but explain your reasoning and keep the solution maintainable by our in-house developers after hand-over. Deliverables 1. End-to-end functional scripts that launch on Web, iOS and Android from a single command or CI trigger. 2. Visual regression setup (baseline screenshots and diff reporting). 3. Load-test
I need a high-speed automation tool for managing B2B leads. Requirements: - Detect new leads in real-time (within seconds) - Filter leads by: • Keywords (pharma products like ivermectin) • Buyer location (international preferred) • Quantity (bulk inquiries) - Perform instant automated actions on matching leads - Ensure very fast execution (within seconds of lead arrival) - Optional: automated response/message support Technical: - Python / Selenium / browser automation experience required Important: - The system should be efficient, fast, and reliable - Must simulate normal user behavior Please share your appro...
...all pages and pagination is essential. Please enter every entry in a separate column so we can analyze it easier. Deliverable • One .xlsx file containing a single sheet with the cleaned dataset As soon as the file opens without errors and every listed tractor is represented with the five fields correctly typed, the job is finished. If you already have experience with Python (BeautifulSoup, Selenium, Scrapy) or similar scraping tools and can turn this around quickly, let me know your timeframe....
...from a specific website and drops it into a neat, well-structured Excel workbook each month. The data points are the standard e-commerce essentials—name, price, SKU, description, availability and any variants that appear on the page. If the site nests details behind dynamic elements, please factor that in; I still expect a complete dataset. Your script can run in Python (BeautifulSoup, Scrapy or Selenium are all fine) or any language you prefer, as long as the final output is a tidy .xlsx file ready for analysis. I’ll trigger the run once a month, so the process should be repeatable with minimal manual tweaking—ideally a single command or scheduled task. Acceptance criteria • A working scraper that navigates pagination and captures every live product li...
...Project Scope: Scrape data from specified websites (details will be provided) Extract relevant fields (e.g., names, emails, prices, listings, etc.) Clean and structure the data (CSV, Excel, or database format) Ensure data accuracy and avoid duplicates Handle pagination, dynamic content, or login (if required) Requirements: Strong experience with web scraping tools (Python, BeautifulSoup, Scrapy, Selenium, etc.) Ability to handle anti-bot protections if needed Experience with data cleaning and formatting Attention to detail and reliability Nice to Have: Experience with automation and scheduling scripts Knowledge of APIs (if available instead of scraping) Deliverables: Clean, well-structured dataset Scraping script (optional but preferred) Documentation on how the data was c...
I need an experienced QA professional to put my web application through a rigorous round of Software Testing...behaviour, accessibility touch-points, and common security pitfalls (e.g., injection, authentication, session handling). • A clearly structured report listing each issue with severity, reproducible steps, environment details, and any relevant screenshots or short screen-captures. • Retesting of resolved defects to confirm fixes. I have no preference for a specific toolset; feel free to work with Selenium, Cypress, Postman, OWASP ZAP, or another stack you trust, so long as the findings are transparent and actionable. If you’re confident you can deliver an insightful test plan, thorough execution, and a report stakeholders can act on immediately, I’...
...CDN URLs * System should also optionally push data directly to WordPress via API --- ### 6. WordPress API Integration * Upload media and posters * Set alt tags automatically (format: title + “ - passthrough AR VR porn video - ”) * Create/schedule posts * Associate metadata, images, and tags correctly --- ## Technical Requirements * Python * Web scraping frameworks (Playwright, Selenium, or similar) * Media downloading (yt-dlp or similar) * API integration: BunnyCDN + WordPress REST API * Image processing (Pillow or similar) * AI/ML integration (OpenAI, local CLIP models, or similar) * Automation pipelines for multi-source input --- ## Deliverables * Fully working automation system * Clear documentation for setup and operation * Scalable to ~100 videos per mon...
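The WordPress REST API step can be sketched with the standard library alone (the site URL and credentials are placeholders; real deployments authenticate with a WordPress application password, and `requests` would be the usual choice over `urllib`):

```python
import base64
import json
from urllib import request

SITE = "https://example.com"  # hypothetical WordPress site
AUTH = base64.b64encode(b"user:app-password").decode()  # application password

def make_alt(title):
    """Alt-tag format from the spec: title + ' - passthrough AR VR porn video - '."""
    return f"{title} - passthrough AR VR porn video - "

def wp_post(path, payload):
    """POST JSON to a wp/v2 endpoint (e.g. 'posts', or 'media/<id>' to
    update a media item's alt_text after upload)."""
    req = request.Request(
        f"{SITE}/wp-json/wp/v2/{path}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Basic {AUTH}",
            "Content-Type": "application/json",
        },
    )
    return json.load(request.urlopen(req))

# Usage (not run here): schedule a post, then set the poster's alt text.
# post = wp_post("posts", {"title": t, "status": "future", "date": when})
# wp_post(f"media/{media_id}", {"alt_text": make_alt(t)})
```

Media binaries go to `POST /wp-json/wp/v2/media` with a `Content-Disposition` header; the JSON helper above covers the metadata and scheduling calls.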
...through an SMTP-level verifier (ZeroBounce, NeverBounce, or an in-house Python verifier—whichever you prefer, as long as it returns status codes for valid, invalid, catch-all, disposable, and role accounts). 4. Output only “valid” or “catch-all” emails in a downloadable CSV along with their metadata. • Technical notes - I’m comfortable with Python (Scrapy, Requests, BeautifulSoup, Selenium) or Node (Puppeteer, Cheerio); choose whichever stack you can scale and maintain. - Respect Google’s ToS with rotating residential proxies or a paid SERP API to avoid blocking and captchas. - The job should run via a daily cron or cloud function and log results to a lightweight dashboard (even a simple Flask/Express UI or Goo...
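If the in-house Python route is chosen, the verifier reduces to an RCPT TO probe plus a classifier over the SMTP reply codes. A minimal sketch (the HELO hostname and probe sender are placeholders; MX lookup would come from dnspython, and the catch-all check probes a random mailbox at the same domain):

```python
import smtplib

ROLE_PREFIXES = {"admin", "info", "support", "sales", "contact"}

def classify(address, rcpt_code, catch_all_code):
    """Map probe results to the statuses the brief asks for.
    rcpt_code: reply for the real address; catch_all_code: reply for a
    random mailbox at the same domain (250 for both => catch-all)."""
    local = address.split("@", 1)[0].lower()
    if local in ROLE_PREFIXES:
        return "role"
    if rcpt_code == 250 and catch_all_code == 250:
        return "catch-all"
    if rcpt_code == 250:
        return "valid"
    return "invalid"

def probe(mx_host, address, helo="verifier.example.com"):
    """RCPT TO probe against the domain's MX; returns the SMTP reply code."""
    with smtplib.SMTP(mx_host, timeout=10) as smtp:
        smtp.helo(helo)
        smtp.mail("probe@verifier.example.com")
        code, _ = smtp.rcpt(address)
        return code
```

Disposable-domain detection is a simple set lookup against a maintained blocklist, and only rows classified "valid" or "catch-all" are written to the output CSV.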
Hi, I need a data scraper who can scrape data from provided sources. Skills: Core Technical Skills The freelancer should know Python (the most common scraping language) with libraries like Scrapy, BeautifulSoup, or Playwright. They should also be comfortable with browser automation tools like Selenium or Puppeteer (JavaScript-based), since sites like TipRanks are JavaScript-heavy and need a real browser to render. Anti-Bot Bypass Experience This is the most critical skill for this specific case. Look for someone experienced with handling CAPTCHAs (2Captcha, Anti-Captcha services), rotating proxies and residential IPs, spoofing browser headers and fingerprints, and bypassing Cloudflare or similar bot protection. TipRanks specifically uses these protections, so this experience ...
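The rotation side of the anti-bot work can be sketched as below (the proxy endpoints and user-agent strings are hypothetical; on its own this will not defeat Cloudflare-class protection on a site like TipRanks, which typically needs a real browser via Playwright or undetected-chromedriver plus a captcha-solving service, as the brief notes):

```python
import itertools
import random

# Hypothetical pools; real residential proxies come from a provider.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]
PROXIES = ["http://proxy-a.example:8000", "http://proxy-b.example:8000"]

proxy_pool = itertools.cycle(PROXIES)

def build_headers(user_agent):
    """Browser-like headers; fingerprint spoofing goes far beyond this,
    but consistent header sets are the baseline."""
    return {
        "User-Agent": user_agent,
        "Accept-Language": "en-US,en;q=0.9",
        "Accept": "text/html,application/xhtml+xml,*/*;q=0.8",
    }

def next_request_config():
    """One (proxy, headers) pair per request: cycle proxies, vary the UA."""
    return next(proxy_pool), build_headers(random.choice(USER_AGENTS))
```

Each outgoing request then picks up a fresh `(proxy, headers)` pair, so blocks on one exit IP do not stall the whole crawl.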
...automate data scraping from a series of publicly accessible web pages. The script should accept a list of URLs, navigate through any paginated content, extract the specified fields, and save the results to CSV and JSON. The task suits someone with an intermediate grasp of Python who is comfortable working with libraries such as requests, BeautifulSoup, pandas, or, when a site relies on JavaScript, Selenium or Playwright. Clear, well-commented code and concise setup instructions are essential so the script can be dropped into an existing workflow without modification. Acceptance criteria and deliverables: • Fully functional .py script that runs from the command line. • Configuration section (or .env file) for URL list and field selectors. • Output in both ...
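A skeleton of that deliverable, standard library only (the field regexes are hypothetical stand-ins for the selectors; JavaScript-heavy pages would swap `urlopen` for Selenium or Playwright page sources, as the brief allows):

```python
#!/usr/bin/env python3
"""Scrape a list of URLs and save the extracted fields to CSV and JSON."""
import argparse
import csv
import json
import re
import time
from urllib.request import urlopen

# --- configuration section (could equally live in a .env/JSON file) ---
FIELDS = {  # field name -> pattern over the page HTML (hypothetical)
    "title": r"<title>(.*?)</title>",
}

def extract(html):
    """Pull each configured field out of one page's HTML."""
    return {
        name: (m.group(1).strip() if (m := re.search(pat, html, re.S)) else "")
        for name, pat in FIELDS.items()
    }

def scrape(urls, delay=1.0):
    """Fetch each URL, extract fields, and rate-limit politely."""
    rows = []
    for url in urls:
        html = urlopen(url).read().decode("utf-8", "replace")
        row = extract(html)
        row["url"] = url
        rows.append(row)
        time.sleep(delay)
    return rows

def save(rows, stem):
    """Write both requested output formats: <stem>.json and <stem>.csv."""
    with open(f"{stem}.json", "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)
    with open(f"{stem}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("urls", nargs="+")
    parser.add_argument("--out", default="dataset")
    args = parser.parse_args()
    save(scrape(args.urls), args.out)
```

Pagination hooks in at `scrape`: follow each page's "next" link (or a `?page=N` template) until it is absent, appending rows as you go.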