4,053 ETL jobs found

    I’m at the start of a performan...and transformation – mapping existing schemas, writing efficient SQL, and using Geneva/Snowflake utilities to cleanse and reshape data so downstream analytics run faster. • Data loading and validation – building repeatable load jobs into Snowflake, designing row-level and aggregate checks, and documenting reconciliation so stakeholders can trust the numbers. Acceptance criteria • Clean, reusable ETL scripts committed to our Git repo. • Loaded Snowflake tables that match source counts and sample query results within agreed thresholds. • A short run-book describing load steps, validation logic, and any performance observations. If you’re comfortable juggling Snowflake, Geneva, and solid SQL while keeping p...
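A minimal sketch of the reconciliation gate this post describes, in Python with the Snowflake connector; the table names, credentials, and 0.1% threshold are illustrative assumptions, not the client's spec:

```python
# Sketch only: STAGING/ANALYTICS names and the drift threshold are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="MY_DB",
)
cur = conn.cursor()

def count_rows(table: str) -> int:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

source = count_rows("STAGING.ORDERS_RAW")   # hypothetical source view
target = count_rows("ANALYTICS.ORDERS")     # hypothetical loaded table
drift = abs(source - target) / max(source, 1)
print(f"source={source} target={target} drift={drift:.4%}")
assert drift <= 0.001, "row counts differ beyond the agreed threshold"
```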

    ₹466 / hr Average bid
    3 bids

    I’m sitting on a large database that stores semi-structured records, and I need a robust transformation layer that turns this raw content into analysis-ready tables. The data is already captured and stored; the task begins once the records land in the database an...end-to-end. 3. Simple README explaining prerequisites, run steps, and how new fields should be added in future. Acceptance criteria • Pipeline processes at least 10 GB of source data without errors. • Output tables/files match the target schema I’ll provide and contain no missing or malformed records. • Execution can be parameterized for date ranges or incremental loads. If you’ve built similar ETL or ELT jobs against semi-structured data and can demonstrate performance at scale,...
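A minimal sketch of the parameterised, incremental run named in the acceptance criteria, assuming a hypothetical raw_records table with a JSON payload column and a placeholder DSN:

```python
# Sketch only: table/column names and the connection string are assumptions.
import argparse
import json

import pandas as pd
import sqlalchemy

parser = argparse.ArgumentParser()
parser.add_argument("--start", required=True)   # e.g. 2024-01-01
parser.add_argument("--end", required=True)     # exclusive upper bound
args = parser.parse_args()

engine = sqlalchemy.create_engine("postgresql://user:pass@host/db")  # placeholder
query = sqlalchemy.text(
    "SELECT id, payload FROM raw_records "
    "WHERE loaded_at >= :start AND loaded_at < :end"
)
raw = pd.read_sql(query, engine, params={"start": args.start, "end": args.end})

# Flatten the semi-structured payload into analysis-ready columns.
flat = pd.json_normalize(raw["payload"].map(json.loads))
flat["source_id"] = raw["id"].values
flat.to_sql("records_flat", engine, if_exists="append", index=False)
print(f"processed {len(flat)} records for {args.start}..{args.end}")
```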

    ₹14339 Average bid
    6 bids

    ...**Requirements:** * Proven experience with n8n (must have) * Strong understanding of APIs, webhooks, and automation logic * Experience working with LLMs (OpenAI, Claude, etc.) is a big plus * Ability to structure and process data efficiently * Problem-solving mindset and ability to work independently **Nice to have:** * Experience with web scraping tools * Familiarity with data pipelines and ETL processes * Basic backend knowledge (Node.js, Python, etc.) * Experience working with SaaS or data-driven products **Engagement details:** * Hourly contract * Ongoing work (multiple workflows to build) * Potential for long-term / full-time role **To apply:** Please include: 1. Examples of n8n workflows you’ve built 2. Your experience with API integrations and automation 3. An...

    ₹42644 Average bid
    171 bids
    Build Data Ingestion Pipeline
    4 days left
    Verified

    Senior Data Pipeline / ETL Engineer (FastAPI, PostgreSQL, OpenSearch) – Build MVP Data Ingestion Pipeline for Financial Intelligence Platform Overview We are building a financial intelligence infrastructure platform that aggregates global corporate registries, sanctions lists, and ownership data to produce compliance-grade investigative reports. The GitHub repository, architecture documentation, and core backend services already exist. What we need now is a senior data pipeline engineer who can complete the data ingestion and normalization pipeline so that our MVP feature works perfectly. The primary MVP workflow is: Search a person or company → resolve the entity → check sanctions exposure → reconstruct ownership relationships → produce an evidence-ba...
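As an illustration of that search → resolve entity → check sanctions → reconstruct ownership flow, a stripped-down FastAPI endpoint with in-memory stand-ins for the real PostgreSQL/OpenSearch lookups; every name here is hypothetical:

```python
# Sketch only: the dicts stand in for the project's real data stores.
from fastapi import FastAPI, HTTPException

app = FastAPI()

ENTITIES = {"acme holdings": {"id": "E1", "name": "Acme Holdings Ltd"}}
SANCTIONS = {"E1": ["OFAC-SDN-12345"]}            # entity id -> matched entries
OWNERSHIP = {"E1": [{"owner": "E2", "pct": 60}]}  # entity id -> parent links

@app.get("/search")
def search(q: str):
    entity = ENTITIES.get(q.strip().lower())      # 1. resolve the entity
    if entity is None:
        raise HTTPException(status_code=404, detail="entity not found")
    eid = entity["id"]
    return {
        "entity": entity,
        "sanctions": SANCTIONS.get(eid, []),      # 2. sanctions exposure
        "ownership": OWNERSHIP.get(eid, []),      # 3. ownership relationships
    }
```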

    ₹47765 Average bid
    141 bids
    Logo Redesign Challenge
    3 days left

    ...favicons or application icons). ## Main Issues with Current Logo - The icon loses clarity at small sizes - Some elements are too thin or too detailed - It does not work well as a favicon or app icon ## What We Need - A new icon/symbol, visually aligned with the current style - The logo should work when combined with additional words/products, for example: **Crono SQL** **Crono Analysis** **Crono ETL** **Crono Metadata** The icon must be: - Clean and recognizable at small sizes - Suitable for favicons, app icons, and UI usage - Keep consistency with the existing brand ## Design Style We are looking for a clean, modern, and minimal design, aligned with our current brand philosophy. ## Additional Work Adapt a few existing visual assets using the new logo ## Attachmen...

    ₹27281 Average bid
    Featured Guaranteed Sealed
    1388 entries

    ...has a strict split: Backend / ETL (YOU BUILD) all calculations (jitter, jumps, gaps) scoring (0–100) classification (GREEN / YELLOW / RED) aggregations (1 min, 5 min, etc.) Grafana (YOU BUILD) dashboards charts tables filters alerts ❗ No business logic in Grafana. Scope: ETL / Backend: process time-series data (x, y, z, timestamp) calculate: jitter (std deviation) jump detection visibility gaps build aggregated datasets: asset level device level zone level Grafana: Build 3 dashboards: Overview (KPIs, problem assets, trends) Asset Detail (time series, scatter, histogram) Setup Dashboard (zone/device quality) Alerts: unstable assets low visibility device issues Required Skills: Grafana (advanced dashboards) InfluxDB (or similar time-series DB) SQL / Flux ETL / data p...
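A minimal sketch of the backend calculations named in the split (jitter as standard deviation, jump detection, visibility gaps, a 0–100 score, GREEN/YELLOW/RED); the thresholds and weights are illustrative assumptions:

```python
# Sketch only: jump threshold, gap window, and score weights are assumptions.
import pandas as pd

def score_asset(df: pd.DataFrame) -> dict:
    """df holds one asset's samples: columns x, y, z, with a datetime index."""
    jitter = df[["x", "y", "z"]].std().mean()                  # positional noise
    jumps = int((df[["x", "y", "z"]].diff().abs() > 5.0).any(axis=1).sum())
    gaps = int((df.index.to_series().diff() > pd.Timedelta("30s")).sum())

    score = max(0.0, 100 - 10 * jitter - 5 * jumps - 5 * gaps)  # illustrative weights
    status = "GREEN" if score >= 80 else "YELLOW" if score >= 50 else "RED"
    return {"jitter": round(jitter, 3), "jumps": jumps, "gaps": gaps,
            "score": round(score), "status": status}

# Aggregated datasets (1 min, 5 min, ...) would come from the same frame, e.g.:
# df[["x", "y", "z"]].resample("1min").mean()
```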

    ₹42458 Average bid
    22 bids

    ...stored as clear text (“Male”, “Female”, “Non-binary”, etc.). The scope is limited to customer information; contact details and transaction history are already handled elsewhere. Your job is to: • analyse the current values in the age and gender columns, • define a concise mapping that converts every existing variation into the target formats, • implement the transformation (SQL script, Python ETL, or comparable approach that fits with Salesforce, HubSpot, or a standard relational database), and • deliver a validation summary showing record counts before and after, plus any exceptions. Acceptance criteria: 100 % of active customer records display a numeric age and a readable gender string, with no loss of data and no...
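A minimal sketch of the mapping-plus-validation step this post asks for; the variant lists and file name are examples only, since the real mapping comes from profiling the actual columns:

```python
# Sketch only: GENDER_MAP entries and customers.csv are illustrative assumptions.
import pandas as pd

GENDER_MAP = {"m": "Male", "male": "Male", "f": "Female", "female": "Female",
              "nb": "Non-binary", "non-binary": "Non-binary"}

def normalise(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["gender"] = out["gender"].str.strip().str.lower().map(GENDER_MAP)
    out["age"] = pd.to_numeric(out["age"], errors="coerce").astype("Int64")
    return out

customers = pd.read_csv("customers.csv")        # hypothetical extract
cleaned = normalise(customers)

# Validation summary: counts before/after, plus exceptions for manual review.
exceptions = cleaned[cleaned["gender"].isna() | cleaned["age"].isna()]
print(f"records={len(customers)} clean={len(cleaned) - len(exceptions)} "
      f"exceptions={len(exceptions)}")
```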

    ₹3073 / hr Average bid
    39 bids

    ...engineer with strong expertise in Apache Airflow, especially in dynamic DAG generation and scalable workflow design. Key Requirements: Strong hands-on experience with Apache Airflow Proven experience in dynamic DAG generation Solid experience working with AWS, especially MWAA (Managed Workflows for Apache Airflow) Good understanding of modern data architecture (data pipelines, orchestration, ETL/ELT workflows) Ability to design scalable, maintainable, and efficient workflows Experience with debugging and optimizing Airflow performance Nice to Have: Experience with data lakes / warehouses (e.g., S3, Redshift, Snowflake, etc.) Infrastructure as Code (Terraform/CloudFormation) CI/CD for data pipelines Project Scope: Design and implement dynamic DAGs Optimize existing Ai...
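For reference, dynamic DAG generation in Airflow 2.x typically means one DAG per config entry, registered at module scope so the scheduler discovers each of them; the source names below are assumptions standing in for a real config file:

```python
# Sketch only: SOURCES would normally be loaded from config or a metadata table.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCES = ["orders", "customers", "inventory"]

def extract(source: str, **_):
    print(f"extracting {source}")

for source in SOURCES:
    with DAG(
        dag_id=f"etl_{source}",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id=f"extract_{source}",
            python_callable=extract,
            op_kwargs={"source": source},
        )
    globals()[f"etl_{source}"] = dag   # Airflow discovers DAGs at module scope
```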

    ₹1024 / hr Average bid
    6 bids

    ...automate the processes that slow teams down. With expertise across Data Engineering, Analytics, Advanced Analytics, and Process Automation, I work with startups, SMEs, and growing enterprises to build scalable data systems, actionable dashboards, and intelligent workflows that drive real business outcomes. Here’s what I bring to the table: Data Engineering - Design and build robust data pipelines (ETL/ELT) - Data warehousing (cloud & on-prem) - Data integration across multiple sources (CRM, marketing tools, databases) - Ensuring data quality, reliability, and scalability Analytics & Business Intelligence - Interactive dashboards (Power BI, Tableau, Looker, etc.) - KPI tracking, reporting automation, and executive dashboards - Funnel analysis, cohort analysis, an...

    ₹1955 / hr Average bid
    14 bids

    I'm looking for assistance with basic financial analysis. Key tasks include: - Evaluating current financial status - Analyzing income, expenses, and savings - Providing insights for better financial decisions Ideal Skills and Experience: - Strong background in finance - Experience with financial analysis tools - Ability to present clear and actionable reports and an editable Excel file - ETL Looking for a freelancer who can deliver detailed yet understandable financial insights.

    ₹12756 Average bid
    81 bids

    ...building a product that relies on cutting-edge reinforcement learning and I need a hands-on engineer who can take the idea from raw data all the way to a production-ready service. The role is part-time and fully remote, but I’m looking for someone who treats ownership seriously and can commit to regular weekly check-ins. Here’s what the work looks like day-to-day: • Data preprocessing – design robust ETL pipelines, clean and transform large structured and unstructured datasets, and set up automated data validation. • Model development – research and implement reinforcement learning algorithms, experiment quickly, tune hyperparameters, and evaluate against clear success metrics. • Integration with existing systems – wrap trained m...

    ₹1955 / hr Average bid
    123 bids

    ...Engineer for Training & Real-Time Project Guidance Description: I am looking for an experienced Azure Data Engineer who can provide hands-on training and guide me through real-time project scenarios. The focus should be on practical learning, including: - Building and managing data pipelines using Azure Data Factory - Working with Databricks, Spark, and Delta Lake - Data transformations, ETL workflows, and reporting automation - Handling operational reliability (failure paths, monitoring, alerts) - Best practices for scalable data solutions Requirements: - Strong experience in Azure Data Engineering tools (ADF, Databricks, Delta format, Power BI, SQL) - Ability to explain concepts clearly with real-world examples - Willingness to provide project-based ...

    ₹32309 Average bid
    16 bids

    ...standardising field names, removing duplicates, resolving obvious data-quality issues, and loading the final dataset into the destination I will specify once the mapping is confirmed. You will receive: • The original Excel files (multiple workbooks, mixed layouts). • A field-mapping template that shows where each column must end up. I expect: • A script or repeatable process (Python, Power Query, ETL tool of your choice) that performs the transformation. • A validated target data file or direct import into the agreed database/CRM, with no lost or corrupted records. • A brief hand-over note describing the steps so I can reproduce the migration in the future. Accuracy and data integrity are critical—every customer must move across exact...
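A minimal sketch of such a repeatable consolidation in Python/pandas; the mapping dict stands in for the supplied field-mapping template, and the dedupe key is an assumption:

```python
# Sketch only: FIELD_MAP rows and the email dedupe key are illustrative.
from pathlib import Path

import pandas as pd

FIELD_MAP = {"Cust Name": "customer_name", "E-mail": "email", "Tel": "phone"}

frames = []
for path in Path("workbooks").glob("*.xlsx"):
    for sheet, df in pd.read_excel(path, sheet_name=None).items():
        df = df.rename(columns=FIELD_MAP)
        df["source"] = f"{path.name}:{sheet}"   # keep lineage for validation
        frames.append(df)

merged = pd.concat(frames, ignore_index=True)
merged = merged.drop_duplicates(subset=["email"], keep="first")
merged.to_csv("customers_clean.csv", index=False)
print(f"{len(frames)} sheets -> {len(merged)} unique customers")
```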

    ₹34264 Average bid
    15 bids

    I want to spend one focused week mastering end-to-end data-engineering pipelines on Google Cloud. The heart of the bootcamp is hands-on practice: together we will build, deploy, monitor and troubleshoot real ETL flows until I can run them solo with confidence. Primary learning goal My priority is data ingestion, extraction, transformation and storage. We will start with the full ingestion toolkit—DataFlow, Pub/Sub and Cloud Storage—then chain that work into the wider GCP ecosystem: • Storage layers: BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore / Firestore • Transformation & orchestration: DataFusion, DataProc, Cloud Composer, Cloud Scheduler, Cloud Functions • Data quality & cataloging: DataPrep, Data Catalog • Visualisati...

    ₹29340 Average bid
    4 bids

    ...work is separate from data privacy initiatives and focuses on backend service development. Team Structure: Cyber Engineering team (~25 engineers) Role within the Identity & Privacy team Exposure to security concepts is a plus Required Skills: Google Cloud Platform (GCP) Scala (preferred) or strong Java (5+ years) Kubernetes (experience deploying services) Kafka (messaging & integration) ETL processes / data integration Strong Java developers willing to work with Scala are encouraged to apply. Responsibilities: Develop and support backend infrastructure services Work with distributed systems and messaging pipelines Deploy and manage services using Kubernetes Collaborate with engineering teams on integrations Engagement Details: Duration: Long-term (next...

    ₹1024 / hr Average bid
    17 bids

    ...several cloud services—into a single, dependable business-intelligence layer. The specific mix of data-visualisation, reporting, or deeper analytics we use can be finalised once the data is flowing cleanly; my first priority is a rock-solid integration and modelling foundation. Here’s how I see the work unfolding. You’ll start by auditing the structure and quality of each source, then design an ETL/ELT approach that keeps everything in sync automatically. I’m open to whichever stack you feel is most suitable—Power BI, Tableau, Looker, an open-source solution, or something else—as long as it supports reliable refreshes, clear data lineage, and the ability to grow as new sources are added. Deliverables • Automated pipeline connecting the d...

    ₹3166 / hr Average bid
    17 bids

    ...fluency with that trio is essential. Everything is written in modern Python (3.9+), surfaced through API Gateway, executed in Lambda and coordinated with Step Functions. Vector databases drive our semantic search, while Glue and Athena sit behind a serverless ETL pipeline that ingests both structured and unstructured data at scale. The contract runs for 12 months, fully remote, operating primarily on the IST time zone. Day-to-day you will: • Extend and maintain the Bedrock integrations and embeddings workflow • Design fault-tolerant ETL jobs in Glue, optimised for Athena queries • Tune vector search and retrieval strategies for latency and relevance • Harden our API layer for external consumption, including auth and rate limits • Contribu...

    ₹29609 Average bid
    24 bids

    ...dashboards, analytics, and AI-driven models. Roles we are looking for: • Data Analyst • Data Scientist • Business Analyst • Data Engineer • Tableau Developer • Power BI Developer • Machine Learning Scientist • AI Engineer • Quantitative Analyst Typical Responsibilities: • Building data dashboards using Power BI or Tableau • Data analysis and business insights • Data pipeline development and ETL processes • Integration of multiple data sources (CRM, ERP, marketing tools) • Development of predictive models and forecasting systems • Implementation of AI-driven analytics solutions Required Skills: • Strong experience with data analysis and visualization tools • Experience with SQL, Python, or R...

    ₹660801 Average bid
    143 bids

    AI development and AWS Cloud Expertise: Hands-on experience with AWS services (S3, EC2/ECS, IAM, VPC, CloudWatch) • Delivered data pipeline/ETL projects (Glue, Athena, Redshift, or similar) • Delivered AI/ML or GenAI projects (SageMaker, Bedrock, or similar) • CI/CD, DevOps practices, and cloud-native architecture experience • AWS certification is a plus. Application Requirements: Describe a specific project you delivered (cloud, AI, data, or SaaS) • List the AWS services used • Send a professional resume that includes a live project.

    ₹49907 Average bid
    36 bids

    We are looking for a highly motivated Sales / Business Development Executive who can help us acquire new clients and bring data engineering, analytics, and cloud projects. We are a growing data engineering company — Mars Matrix Tech Website: What We Do: We specialize in: Data Engineering (Azure, AWS, Databricks, Spark) ETL / ELT Pipeline Development Data Warehousing & Lakehouse Architecture Real-time Streaming (Kafka, Event Hub) Data Quality & Governance AI/ML & LLM-based solutions Your Role: You will be responsible for: Finding and approaching potential clients (Upwork, LinkedIn, cold outreach, etc.) Understanding client requirements Pitching our services Bringing qualified leads/projects Closing deals or setting up meetings Compensation: Commission-Base...

    ₹6238 Average bid
    7 bids

    I’m at the start of a perfo...transformation – mapping existing schemas, writing efficient SQL, and using Geneva/Snowflake utilities to cleanse and reshape data so downstream analytics run faster. • Data loading and validation – building repeatable load jobs into Snowflake, designing row-level and aggregate checks, and documenting reconciliation so stakeholders can trust the numbers. Acceptance criteria • Clean, reusable ETL scripts committed to our Git repo. • Loaded Snowflake tables that match source counts and sample query results within agreed thresholds. • A short run-book describing load steps, validation logic, and any performance observations. If you’re comfortable juggling Snowflake, Geneva, and solid SQL while keeping p...

    ₹679 / hr Average bid
    3 bids

    ...migration Maintain relationships between records (e.g. job history linked to customers) Map Simpro data structure correctly into Odoo modules Assist with validation and testing post-migration Provide support during go-live if required Key Requirements: Demonstrable experience with Odoo (essential) Previous experience migrating data from Simpro (highly desirable) Strong understanding of data mapping, ETL processes, and database structures Ability to work quickly and meet a tight 7-day deadline Clear communication and daily progress updates Nice to Have: Experience with field service or contractor-based businesses Familiarity with both Simpro and Odoo APIs Ability to troubleshoot and resolve migration issues quickly Timeline: Start: Immediate Deadline: Within 7 days To Apply...

    ₹142827 Average bid
    116 bids

    ...a streamlined, largely automated flow that boosts overall efficiency by giving us clean, reliable numbers the moment we need them. Key deliverables • Audit the current sources (spreadsheets, Harvest/Jira exports, etc.) and pinpoint the exact bottlenecks. • Design and build a central data model—PostgreSQL, Airtable, or a tool you recommend—that pulls records automatically. • Develop the ETL or low-code automations (Python, Zapier/Make, REST APIs) that remove every copy-paste step. • Produce a lightweight dashboard or pivot-style report showing hours by person, project, and date range, refreshable in under one minute. • Supply clear hand-over docs and a brief video walkthrough so the team can maintain the system without you. Idea...

    ₹21601 Average bid
    4 bids

    ...Databricks notebooks and workflows to support new datasets and reporting requirements. Support orchestration of data pipelines through Insight Factory (ADF-based). Develop and maintain Power BI dashboards, reports, and data models to support business reporting needs. Collaborate with the wider team to address challenges in delivering scalable and reliable BI solutions. ETL Development: • Design and develop ETL processes to integrate and transform data from various sources, with core transformation logic implemented in Databricks. • Adapt and extend existing pipelines and notebooks to support new datasets and reporting requirements. Data Modelling & Preparation: • Develop and maintain optimised data tables and views to support reporting and...

    ₹129170 Average bid
    31 bids

    ...Azure-based applications, with a strong emphasis on security, scalability, and performance. • Develop comprehensive data modeling strategies, ensuring high-quality, well-documented, and easily maintainable data structures. • Collaborate with engineering, analytics, and business teams to define and enforce data governance, security standards, and architectural best practices. • Architect and optimize ETL/ELT pipelines using Azure-native tools (e.g., Azure Data Factory, Databricks, Synapse Analytics) to meet the evolving needs of the organization. • Implement and maintain security strategies including data encryption, access controls, and compliance with data protection regulations (e.g., GDPR). • Lead cloud migration and modernization initiatives, ens...

    ₹4935 / hr Average bid
    17 bids

    ...minimal but functional) ⸻ NON-NEGOTIABLE SKILLS You must have real experience in: Core Engineering • Python (advanced) • API development (FastAPI or Flask) • Working with large datasets AI / Data • Entity resolution / fuzzy matching • Embeddings (e.g. OpenAI, sentence transformers) • Basic machine learning or scoring systems Data Infrastructure • PostgreSQL or similar database • Building ETL/data pipelines Systems / Architecture • Ability to design a simple but scalable architecture • Experience deploying applications (AWS, GCP, or similar) ⸻ BONUS (STRONGLY PREFERRED) • Experience with graph databases (Neo4j) • Experience with financial, AML, or KYC datasets • Experience building search system...

    ₹107635 Average bid
    173 bids

    ...reruns should update records rather than insert duplicates. • Media handling: verify every saved image/video URL is reachable. Deliverables 1. PostgreSQL schema (DDL) plus a populated sample dump covering at least one full auction day from each site. 2. Scraper/ETL code with instructions to schedule it (cron, systemd, or Docker). 3. README documenting setup, environment variables, and expected runtime. 4. Quick validation script that returns the latest live bid for a given lot ID. Acceptance criteria • Running the ETL end-to-end on my server populates all specified fields with no missing columns. • Querying the same lot twice a minute during a live auction shows changing bid amounts within 10 seconds of the website. • All image/vid...
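The idempotent-rerun requirement maps naturally onto PostgreSQL upserts; a minimal sketch with a hypothetical lots table and placeholder DSN, including the quick latest-bid check from the deliverables:

```python
# Sketch only: the lots schema and DSN are assumptions for illustration.
import psycopg2

conn = psycopg2.connect("dbname=auctions user=etl")  # placeholder DSN

# Rerunning this statement updates the record instead of inserting a duplicate.
with conn, conn.cursor() as cur:
    cur.execute("""
        INSERT INTO lots (lot_id, title, current_bid, scraped_at)
        VALUES (%s, %s, %s, now())
        ON CONFLICT (lot_id) DO UPDATE
        SET current_bid = EXCLUDED.current_bid,
            scraped_at  = EXCLUDED.scraped_at
    """, ("LOT-123", "Example lot", 450.00))

# Quick validation: latest live bid for a given lot ID.
with conn, conn.cursor() as cur:
    cur.execute("SELECT current_bid FROM lots WHERE lot_id = %s", ("LOT-123",))
    print(cur.fetchone())
```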

    ₹24395 Average bid
    51 bids

    ...pipelines • Integrate AI models with existing backend and data infrastructure • Enhance dashboard UX with intelligent visualizations • Ensure accuracy, reliability, and security of analytics outputs • Collaborate with product and engineering teams for feature rollout Technical Requirements: • Strong experience in Python / AI frameworks (TensorFlow, PyTorch, or similar) • Experience with data pipelines, ETL, and large-scale analytics • Familiarity with SaaS architectures and cloud platforms (AWS / GCP / Azure) • Experience building production-ready ML features • Knowledge of analytics visualization tools (e.g., Power BI, Tableau, or custom dashboards) • Ability to optimize model performance and reduce latency Nice to Have: •...

    ₹12942 Average bid
    22 bids

    All of my operational data currently lives in several Excel workbooks, and I’d like to see it come to life in a single, interactive Power BI dashboard. I already have the core tables prepared, but they need to be modelled, related, and enhanced with a handful of straightforward measures (think averages, year-to-date totals, and simple variance percentages). No complex ETL work is required—just clean connections to the spreadsheets, light DAX, and a polished visual layer that highlights the efficiency metrics I care about. Here’s how I picture delivery: • A .pbix file with clearly named tables, relationships, and measures • Filters and drill-throughs that let me segment results by date, process, and location • A brief walk-through—or scre...

    ₹1490 Average bid
    13 bids

    I need a compact proof-of-concept that automatically extracts Documents from an Azure Blob container, captures their accompanying metadata, and lands both in a SharePoint Online document library. Only files classed as “Documents” (e.g., PDF, DOCX, XLSX) are in scope—no images or videos. The PoC must demonstrate that file contents arrive intact, their metadata is mapped to the correct SharePoint columns, and the process can be triggered on demand or on a simple schedule. SharePoint Migration API is the required method. The key is clarity and repeatability; hard-coded secrets or one-off manual steps won’t meet the goal. Deliverables • Source code with a clear README covering setup, configuration variables, and execution steps • A brief demo (s...
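A sketch of the extraction half only (the SharePoint Migration API load side is beyond a snippet): enumerating document-type blobs and capturing their metadata with the Azure SDK; the container name and extension list are assumptions:

```python
# Sketch only: container name and DOCUMENT_EXTS are illustrative assumptions.
from azure.storage.blob import BlobServiceClient

DOCUMENT_EXTS = (".pdf", ".docx", ".xlsx")

# Connection string should come from configuration, never hard-coded.
service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("documents")

for blob in container.list_blobs(include=["metadata"]):
    if not blob.name.lower().endswith(DOCUMENT_EXTS):
        continue                                # images/videos are out of scope
    data = container.download_blob(blob.name).readall()
    # Next step: map blob.metadata to the matching SharePoint columns.
    print(blob.name, len(data), blob.metadata)
```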

    ₹62284 Average bid
    130 bids

    ...workflows Tech Stack (Flexible) Preferred experience in: Python (data processing and pipelines) GIS tools (GDAL, raster/vector processing, geospatial data handling) Simulation frameworks (CesiumJS, Unity, Unreal Engine, WebGL, or similar) Experience with LiDAR or geospatial visualization Nice to Have: Experience with environmental or remote sensing data Experience designing data pipelines (ETL workflows) Cloud compute (AWS, GCP, etc.) Deliverables Functional simulation pipeline (MVP) Data ingestion and preprocessing scripts UAV simulation with configurable flight paths Exportable dataset format (structured and documented) Important This is a research-oriented project. More detailed specifications, datasets, and implementation details will be shared after initial...

    ₹7290 Average bid
    27 bids

    ...Background) Duration: 3 Hours Budget: ₹32000 (per month) Job Description: We are looking for a Python AI Engineer with a strong Data Engineering and AI background for a short-duration engagement. The ideal candidate should have hands-on experience in building data pipelines and implementing AI/ML solutions using Python. Key Requirements: Strong proficiency in Python Solid experience in Data Engineering (ETL pipelines, data processing, data workflows) Hands-on experience with AI/ML models and frameworks Experience with data handling, preprocessing, and model deployment Familiarity with tools like Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch Ability to deliver within a 3-hour engagement window Preferred Skills: Experience with cloud platforms (AWS/GCP/Azure) Knowledge of LLMs ...

    ₹27840 Average bid
    22 bids

    I need a small web application that sits on IIS a...first version and quietly ignore the duplicates. • The detailed report has to be easy to read on the web page and exportable to Excel or CSV for further analysis. It should spell out what changed, by column, plus show new and missing rows. Deliverables • IIS-ready ASP.NET (Core or Framework) application, source code included. • Normalised SQL Server schema and any required stored procedures or ETL scripts. • Configuration file or admin screen for setting the watch folder path and scheduled scan interval. • Deployment and setup guide so I can reproduce your environment on our test server. Once the core workflow is proven, I’m open to light UI polish, logging, or scheduled email of the report, but the...
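The brief targets ASP.NET, but the column-level diff logic it describes is language-neutral; a pandas sketch assuming two snapshots that share a schema and a hypothetical record_id key:

```python
# Sketch only: file names, the record_id key, and a shared schema are assumptions.
import pandas as pd

old = pd.read_csv("snapshot_old.csv").set_index("record_id")
new = pd.read_csv("snapshot_new.csv").set_index("record_id")

added = new.index.difference(old.index)      # rows only in the new file
missing = old.index.difference(new.index)    # rows gone from the new file
common = new.index.intersection(old.index)

o, n = old.loc[common], new.loc[common]
changed = (o != n) & ~(o.isna() & n.isna())  # treat NaN == NaN as unchanged
cells = changed.stack()
cells = cells[cells]                         # (record_id, column) pairs that differ

print(f"new={len(added)} missing={len(missing)} changed_cells={len(cells)}")
cells.to_frame("changed").to_csv("diff_report.csv")  # exportable to Excel/CSV
```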

    ₹45813 Average bid
    1 bid

    I need a small web application that sits on IIS a...first version and quietly ignore the duplicates. • The detailed report has to be easy to read on the web page and exportable to Excel or CSV for further analysis. It should spell out what changed, by column, plus show new and missing rows. Deliverables • IIS-ready ASP.NET (Core or Framework) application, source code included. • Normalised SQL Server schema and any required stored procedures or ETL scripts. • Configuration file or admin screen for setting the watch folder path and scheduled scan interval. • Deployment and setup guide so I can reproduce your environment on our test server. Once the core workflow is proven, I’m open to light UI polish, logging, or scheduled email of the report, but the...

    ₹461825 Average bid
    1 bid

    ...should appear. I am trying to view historical metrics—specifically call duration, number of calls handled, and customer-satisfaction score—but every query comes back empty for every single agent on the platform. The data exists in real time and the front-end dashboards formerly worked, so I suspect the problem sits somewhere between the CTI data store and the historical reporting layer (indexes, ETL schedule, or permissions). What I need from you • Pinpoint the root cause of the missing historical records. • Restore reporting so the three metrics above populate correctly for all agents. • Provide a brief document of the fix (SQL scripts, config changes, or scheduling adjustments) so the team can reproduce it in the future. • Validate the ...

    ₹4749 Average bid
    6 bids

    I need a small web application that sits on...version and quietly ignore the duplicates. • The detailed report has to be easy to read on the web page and exportable to Excel or CSV for further analysis. It should spell out what changed, by column, plus show new and missing rows. Deliverables • IIS-ready ASP.NET (Core or Framework) application, source code included. • Normalised SQL Server schema and any required stored procedures or ETL scripts. • Configuration file or admin screen for setting the watch folder path and scheduled scan interval. • Deployment and setup guide so I can reproduce your environment on our test server. Once the core workflow is proven, I’m open to light UI polish, logging, or scheduled email of the report, but the...

    ₹2514 / hr Average bid
    NDA
    140 bids

    ...building RAG systems Knowledge of chunking strategies for LLM optimization Experience with LangChain, LangGraph, or similar orchestration tools Familiarity with AI monitoring, observability, and evaluation frameworks Experience building agent-based workflows or AI automation Engineering Experience Experience building microservices and scalable systems Strong knowledge of data pipelines and ETL processes Experience designing and optimizing databases and data models Additional Skills Strong understanding of ML concepts and NLP techniques Ability to work with ambiguous problems and rapidly evolving AI tools Experience with modern software development practices (Git, testing, CI/CD, code reviews) Engagement Details Location: Remote (EU-based freelancers only) Contract:...

    ₹1360149 Average bid
    110 bids

    I have a compact Microsoft Access database containing fewer than ten tables that I need moved cleanly into SQL Server. Along with moving the rows themselves, I must keep every primary-key and foreign-key relationship intact so the data model behaves exactly the same once it lands in SQL Server. You are free to use SQL Server Migration Assistant, SSMS scripts, or your own ETL workflow—as long as the end result is a set of SQL Server tables whose structure and links mirror what I have in Access. Acceptance criteria • All tables (1–10 in total) exist in SQL Server with matching column names and data types suited to that platform. • Every existing PK/FK constraint in Access is recreated and validated on the SQL Server side. • Row counts between sou...
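A minimal sketch of the row-count acceptance check via pyodbc; both connection strings and the table list are placeholders for the real database:

```python
# Sketch only: DSNs and TABLES are assumptions for illustration.
import pyodbc

access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\app.accdb")
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=host;DATABASE=db;"
    "Trusted_Connection=yes")

TABLES = ["Customers", "Orders", "OrderLines"]   # example table names

def count(conn, table):
    # Brackets work as identifier quoting in both Access SQL and T-SQL.
    return conn.cursor().execute(f"SELECT COUNT(*) FROM [{table}]").fetchone()[0]

for table in TABLES:
    src, dst = count(access, table), count(mssql, table)
    status = "OK" if src == dst else "MISMATCH"
    print(f"{table}: access={src} sqlserver={dst} {status}")
```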

    ₹2980 Average bid
    22 bids

    ...be consolidated and parameterised, with an eye toward incremental refresh and reliability through the on-premises gateway once everything is published to the Power BI Service workspace. Advanced DAX, solid M language skills, and a working knowledge of SQL objects are crucial because some changes may require tweaks to underlying views or stored procedures. Experience with dimensional modelling, ETL practices, and data-visualisation best practices will help you spot additional performance wins and usability improvements. Please focus your application on relevant experience optimising large operational datasets in Power BI; links or screenshots are welcome if confidentiality allows. Deliverables • A redesigned PBIX with documented tables, relationships, and measures •...

    ₹14339 Average bid
    61 bids

    ...glance how well we are moving product from warehouse to customer. The focus is delivery progress, so the visuals must surface three core metrics: on-time deliveries, route efficiency, and customer satisfaction scores. Data sources are already available in our internal SQL database and a daily CSV export from our last-mile partner. I can supply sample tables and field definitions; you will handle the ETL so the numbers refresh automatically each morning. A browser-based tool such as Power BI Service, Tableau Online, or a lightweight React-D3 build is fine—whichever you are fastest with. Key expectations • Single-page overview with KPI cards, trend lines, and a map or heat-grid for route efficiency • Filters for date range, region, and driver • Drill-do...

    ₹22346 Average bid
    24 bids

    ...payroll tables). Here is what matters to me: • Source: Lawson (on-premise LSF, version 10) • Scope: Financial data tied specifically to payroll—nothing HR-only or attendance-related. • Delivery: A single Excel file, cleanly formatted and ready for pivoting or upload into Power BI. You may use whatever extraction method you are most comfortable with—Lawson SQL queries, the Lawson API, or an ETL tool such as Informatica or SSIS—as long as the process is repeatable. Document each step briefly so my internal analyst can rerun the pull next month. Acceptance criteria • All requested payroll fields present and populated. • No duplicate rows or orphaned records. • Refresh instructions validated on a second machine. If you ...

    ₹43669 Average bid
    46 bids

    ...glance how well we are moving product from warehouse to customer. The focus is delivery progress, so the visuals must surface three core metrics: on-time deliveries, route efficiency, and customer satisfaction scores. Data sources are already available in our internal SQL database and a daily CSV export from our last-mile partner. I can supply sample tables and field definitions; you will handle the ETL so the numbers refresh automatically each morning. A browser-based tool such as Power BI Service, Tableau Online, or a lightweight React-D3 build is fine—whichever you are fastest with. Key expectations • Single-page overview with KPI cards, trend lines, and a map or heat-grid for route efficiency • Filters for date range, region, and driver • Drill-do...

    ₹1676 Average bid
    11 bids

    I already have Zoho DataPrep up and connected to my Zoho One account. The legacy records are largely well-structured, yet they still carry old duplicates, stray values, and obsolete pick-lists that must be scrubbed before the switch to m...the DataPrep workspace plus a copy of the target CRM field map. When we’re done I expect: 1. An import-ready dataset free of duplicates or formatting issues. 2. The saved DataPrep recipe so I can rerun it if new data arrives. 3. A brief hand-off note summarising any assumptions, formulas, or data quality flags. If you have real-world experience cleaning data inside Zoho DataPrep—or similar ETL tools such as Talend, Power Query, or Informatica—this should be straightforward. Please reference past Zoho DataPrep work so I know...

    ₹14432 Average bid
    83 bids

    I’m looking for someone who has already moved Microsoft SSIS workloads into Google Cloud and can guide me through a clean, production-ready rebuild of our existing pipelines. Right now the jobs pull from a mix of relational databases and nightly flat-file drops; the goal is to land everything in BigQuery with the same—or better—performance, data quality, and monitoring we have today. Here’s what I need from you: • Assess the current SSIS packages, map out every source table and file feed, and document the equivalent flow on GCP. • Design and build the new pipelines using native services (Dataflow, Cloud Data Fusion, Cloud Composer, or any combination you feel is best) with full error handling, logging, and retry logic. • Automate scheduling and...
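One building block of such a rebuild, as a sketch: landing a nightly flat-file drop in BigQuery with the official client library; the bucket, file, and table names are placeholders for the real feeds:

```python
# Sketch only: URI and table ID are placeholders; production pipelines would
# use an explicit schema and wrap this in the orchestrator's retry logic.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition="WRITE_TRUNCATE",
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/drops/sales_2024-01-31.csv",
    "my-project.staging.sales",
    job_config=job_config,
)
load_job.result()   # raises on failure, surfacing errors to logging/alerting
print(f"loaded {client.get_table('my-project.staging.sales').num_rows} rows")
```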

    ₹49814 Average bid
    41 bids

    I’m preparing a migration from our Sapiens ID3 platform into the FAST Technology system and need a precise field-by-field mapping created. The source data centers on policy records, and I must be sure that every element tied to the customer—name and contact details, policy-specific attributes, and related transaction history—lands in th...transformation or new target field is required. Deliverables • A comprehensive mapping document (spreadsheet or equivalent) showing source field, data type, description, target field, and required transformations. • A brief summary of any assumptions or open questions that need business input. I’m looking for accuracy over volume: once I validate the mapping, we’ll move on to building the actual ETL, ...

    ₹4935 / hr Average bid
    20 bids

    ...from multiple sheets or external CSV/XLSX files into a single, structured source of truth. 3. Automated reporting – generate dashboards in Google Sheets (or connected Looker Studio if you prefer) that refresh automatically and show at a glance: shipment delivery times, order-fulfillment rates, and current inventory levels. Deliverables: • One master sheet with validation logic in place • ETL or scripted connections for all incoming data sources • A fully formatted dashboard tab with the three KPIs above, filterable by date, destination, and carrier • Brief hand-off notes or a quick loom explaining how to maintain the setup If you’ve built logistics dashboards before and are comfortable with formulas, QUERY, ARRAYFORMULA, pivot tables, A...

    ₹12756 Average bid
    52 bids

    ...our treasury platform. I am comfortable sharing a sample dataset so you can map the fields, clean anomalies, and transform the numbers. Ideally the workflow is automated—SQL or Python for the heavy lifting and Excel, Power BI or Tableau for the final dashboards and printable summaries—but I am open to whichever stack you can justify as robust and maintainable. Acceptance criteria • A scripted ETL routine that I can run on new deal files without rewriting code. • Interactive dashboards plus a one-page PDF summary highlighting ROI, volume trends, and daily P&L. • Brief hand-off notes so my team can adjust data paths or field names in future. If you have questions about the data structure or need a quick walkthrough of the treasury trading ...

    ₹1676 Average bid
    12 bids

    Title: Senior dMRV / Carbon Data Engineer (Verra + – Palm Oil / Biochar) Scope: We are an agentic + Web3 dMRV platform. We need a freelancer to assist in preparing our data and calculations. Your tasks: Review historical data from POME SCADA exports. Map SCADA fields to Verra VCS POME/biogas methodology requirements and, at a high level, to biochar requirements. Build Python/SQL ETL scripts to clean and structure the data (handle gaps, outliers, units). Implement preliminary carbon calculations (CH₄ avoided / destroyed, EFB→biochar tCO₂e stored) using standard factors. Deliver a short technical memo (10–15 pages) describing data, assumptions, and methods that we can later refine with the VVB. Must‑have experience: Hands‑on data work (Python + SQL) with time‑series /...
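A minimal sketch of the cleaning and preliminary-calculation steps (gaps, outliers, units, then a first-pass CH₄ figure); the 30-minute grid, 3-sigma outlier rule, and emission factors are illustrative assumptions to be refined against the methodology with the VVB:

```python
# Sketch only: column names, resample grid, and factors are assumptions.
import pandas as pd

scada = (pd.read_csv("pome_scada.csv", parse_dates=["timestamp"])
           .set_index("timestamp"))

# Regularise the series: average onto a 30-min grid, drop >3-sigma outliers,
# interpolate only short gaps (long visibility gaps stay NaN for review).
flow = scada["biogas_flow_nm3"].resample("30min").mean()   # Nm3/h per bin
z = (flow - flow.mean()) / flow.std()
flow = flow.mask(z.abs() > 3).interpolate(limit=4)

# Preliminary CH4-destroyed estimate: rate (Nm3/h) x 0.5 h per bin.
CH4_FRACTION = 0.6        # illustrative CH4 share of biogas
CH4_DENSITY_T = 0.000716  # ~t per Nm3 at standard conditions
GWP_CH4 = 28              # AR5 100-year GWP
ch4_t = flow.sum() * 0.5 * CH4_FRACTION * CH4_DENSITY_T
print(f"CH4 destroyed ~ {ch4_t:.1f} t ~ {ch4_t * GWP_CH4:.0f} tCO2e")
```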

    ₹16574 / hr Average bid
    26 bids

    The Workday technical consultant will design and implement scalable Workday data ingestion pipelines and provide recommendations on best practices for extracting and integrating Workday data across hyperscaler environments (Azure, AWS, or GCP). Integration Approaches Under Evalu...(Workday Query Language) RaaS has been the traditional approach but is not preferred for future implementations. Key Requirements 5+ years of Workday technical experience (integrations, reporting, or data extraction). Proven experience implementing Workday data ingestion to cloud data platforms. Strong understanding of REST/SOAP APIs, RaaS, and WQL. Experience working with modern data platforms and ETL/ELT pipelines. Ability to recommend optimal ingestion strategies based on scalability, performance, and g...

    ₹931 / hr Average bid
    5 bids
