Snowflake jobs
...real-time and batch processing. · Hands-on with Airflow, dbt, or other orchestration tools. · Familiarity with data modeling (OLAP/OLTP), schema evolution, and format handling (Parquet, Avro, ORC). · Experience with hybrid/on-prem and cloud platform (AWS/GCP/Azure) deployments. · Proficient in working with data lakes/warehouses like Snowflake, BigQuery, Redshift, or Delta Lake. · Knowledge of DevOps practices, Docker/Kubernetes, Terraform or Ansible. · Exposure to data observability, data cataloging, and quality tools (e.g., Great Expectations, OpenMetadata). Good-to-Have · Experience with time-series databases ...
Results-driven Senior Data Analyst with 8+ years delivering enterprise data solutions across Banking, Financial Services, and Healthcare. Specialized in end-to-end Data Warehouse design, ETL pipeline development, and BI reporting. Core Expertise: Snowflake, Azure Data Factory, Azure Synapse, Delta Lake, PySpark, SSIS, T-SQL, PL/SQL, Oracle, MySQL, Power BI, DAX, Tableau, SSRS, Star/Snowflake Schema, EDW Design, Data Mart Development, Data Lineage, Gap Analysis. Compliance & Governance: HIPAA, GDPR, SOX Audit Controls, Data Quality Frameworks, Data Governance Policies. Business Analysis: BRD/FRD Writing, JAD Facilitation, UML Diagrams, Stakeholder Management, Agile, Waterfall. Certifications: Microsoft Azure Fundamentals, Salesforce Administrator, Salesforce Platform Devel...
I’m bringing a new line of smartphone-compatible snow gloves—Smart Gloves—to market and need a logo that instantly communicates “winter tech” without feeling busy. The look I’m after is distinctly modern, built around neutral black and grey tones so it holds up on everything from retail packaging to a stitched label. Key visual cues I want blended into a single, cohesive mark: a snowflake to signal cold-weather use, a clear glove shape, and a subtle smartphone reference that hints at touch-screen compatibility. Think clean geometry and smart negative space rather than literal clip-art mash-ups. Because the logo will appear online, in print, and embroidered on fabric, it must remain crisp and recognisable at large and very small sizes. Smooth sca...
I’m rebranding City Line HVAC and need a striking, modern and sleek logo that instantly says “New York” while highlighting the dual nature of our work—heating and cooling. Please make the NYC skyline the star of the composition, then weave in a snowflake and a flame so visitors immediately recognize our full service range. The company name, City Line HVAC, should feel fully integrated rather than tacked on; I’m looking for a design that feels purpose-built, unique, and contemporary. I’m open on color for now, so feel free to experiment as long as the final mark stays clean and professional. Once we agree on a direction, I’ll need: • At least three initial concepts with one refined to completion after feedback • Final artwork su...
We need an experienced MuleSoft developer to work on integration solutions connecting various systems and databases. Requirements: • 4+ years of experience with MuleSoft Anypoint Platform/Studio • Strong experience connecting MuleSoft APIs with Azure SQL, Azure Blob, and Snowflake • Proficiency with REST and SOAP web services • Strong working knowledge of JSON and XML data formats • Experience with Agile and waterfall project management methodologies • Excellent problem-solving and analytical skills • Strong communication and presentation abilities • Must be available to work 12pm - 9pm IST, Monday through Friday • Immediate availability or maximum 15-20 days notice period Deliverables: • Design and develop MuleSoft integration sol...
...effort that moves our current datasets into Snowflake, with Geneva and straight SQL powering the pipelines. To keep momentum, I need someone who can help on the hands-on analyst work while thinking like a Business Data Analyst. What I still have to finish centres on two areas: • Data extraction and transformation – mapping existing schemas, writing efficient SQL, and using Geneva/Snowflake utilities to cleanse and reshape data so downstream analytics run faster. • Data loading and validation – building repeatable load jobs into Snowflake, designing row-level and aggregate checks, and documenting reconciliation so stakeholders can trust the numbers. Acceptance criteria • Clean, reusable ETL scripts committed to our Git repo. • Loade...
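A minimal sketch of the row-level and aggregate checks this posting describes, in plain SQL; SRC_ORDERS and TGT_ORDERS are hypothetical source-extract and Snowflake target tables, not names from the post:

-- Aggregate reconciliation: do source and target agree on counts and totals?
SELECT 'row_count' AS check_name,
       (SELECT COUNT(*) FROM SRC_ORDERS) AS source_value,
       (SELECT COUNT(*) FROM TGT_ORDERS) AS target_value
UNION ALL
SELECT 'amount_total',
       (SELECT SUM(AMOUNT) FROM SRC_ORDERS),
       (SELECT SUM(AMOUNT) FROM TGT_ORDERS);

-- Row-level check: source rows that never arrived in the target.
SELECT s.ORDER_ID
FROM SRC_ORDERS s
LEFT JOIN TGT_ORDERS t ON t.ORDER_ID = s.ORDER_ID
WHERE t.ORDER_ID IS NULL;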
...transformed results are written back to a target schema (or files, if that proves more efficient). Key points you should know • Source: relational database containing nested JSON / key-value blobs. • Goal: parse, normalize, and flatten these blobs into well-defined columns while preserving relationships and lineage. • Scale: millions of rows, so solutions that leverage Spark, Hadoop, BigQuery, Snowflake, or well-tuned SQL/Python pipelines are welcome—as long as they remain maintainable. Deliverables 1. Transformation code (Python, PySpark, SQL, or Scala) with clear comments. 2. A runnable job definition or workflow file (Airflow DAG, Spark submit script, dbt model, etc.) that shows how to execute the pipeline end-to-end. 3. Simple README explain...
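If the Snowflake route is chosen, the parse-and-flatten step can stay in SQL. A sketch, assuming a hypothetical RAW_EVENTS table whose VARIANT column PAYLOAD holds the nested JSON blobs:

-- Flatten a nested array into typed columns while keeping the parent key,
-- so every output row can be traced back to its source row (lineage).
SELECT r.ID                           AS source_row_id,
       r.PAYLOAD:customer.id::VARCHAR AS customer_id,
       item.value:sku::VARCHAR        AS sku,
       item.value:qty::NUMBER         AS quantity
FROM RAW_EVENTS r,
     LATERAL FLATTEN(input => r.PAYLOAD:items) item;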
...Dashboard Enable proactive monitoring of data integrity Help implement processes for data quality management and governance Required Skills: Advanced SQL Power BI (dashboard development) Snowflake Jira Strong understanding of Data Governance concepts Preferred: Experience in data quality frameworks and monitoring Experience working with cross-functional data teams Engagement Details: Support Type: Daily job support Duration: Ongoing Hours: 1–2 hours per day Mode: Remote Timings: Flexible (early morning or late evening preferred) When Applying, Please Include: Relevant experience (SQL / Power BI / Snowflake) Experience in data quality / governance Sample dashboards or project details (if available) Your hourly rate Availability We are looking to start immedia...
We are looking for an experienced data engineer with strong expertise in Apache Airflow, especially in dynamic DAG generation ...Solid experience working with AWS, especially MWAA (Managed Workflows for Apache Airflow) Good understanding of modern data architecture (data pipelines, orchestration, ETL/ELT workflows) Ability to design scalable, maintainable, and efficient workflows Experience with debugging and optimizing Airflow performance Nice to Have: Experience with data lakes / warehouses (e.g., S3, Redshift, Snowflake, etc.) Infrastructure as Code (Terraform/CloudFormation) CI/CD for data pipelines Project Scope: Design and implement dynamic DAGs Optimize existing Airflow workflows Set up / improve MWAA environment Provide best practices for scalable data pipeline ar...
...workflows and pipelines (dbt, SQL). Analyze datasets and communicate findings for data-driven decisions. Work with LLM benchmarking and agentic coding workflows. What You Need 4+ years professional experience in DS, ML, or Data Engineering. Expert Python (pandas, NumPy, scikit-learn) and SQL. Proven ability to diagnose ML failure modes and improve model quality. Familiarity with cloud warehouses (Snowflake, BigQuery, or Redshift). Why this role? This is a flexible, remote engagement perfect for those looking to contribute to cutting-edge AI research and work with top-tier industry labs without the commitment of a full-time product role....
...should have strong hands-on experience in backend development and data engineering, and be comfortable supporting in a real-time project environment. Core Technologies: Python (backend services & data transformation) Kafka / Confluent (event-driven architecture) JSON (data modeling & payload handling) MongoDB (data storage, audit, processing state) Kubernetes (deployment & service management) Snowflake (reporting & analytics) Facets (claims processing system) Machinify (payment integrity platform) Cloud & Tools: AWS Docker CI/CD pipelines GitLab / Git workflows Exposure to EKS / ECS / Lambda / S3 / RDS / DynamoDB is an added advantage Responsibilities: Provide ongoing job support to the client Assist in debugging, issue resolution, and enhancements Support...
I am putting together an end-to-end data architecture that can reliably ingest, store, and serve a broad range of clinical-trial assets: patient demographics, clinical trial results, genomic data, COA (Clinical Outcomes Assessment) records, and a growing rater database. What I need from you Design the target architecture and implement the core pipelines—ideally using a modern cloud stack (Snowflake, Databricks, BigQuery, Redshift, or a similar platform; feel free to propose the best fit). Your work should cover raw-to-curated layers, automated metadata capture, and role-based access controls that satisfy typical GxP and HIPAA expectations. Key deliverables • Reference architecture diagram with component rationale • Re-usable ingestion and transformation co...
...an experienced Snowflake-focused Data Engineer / BI Consultant to provide job support (1–2 hours per day) for a client working on data quality and BI initiatives. The primary focus is on identifying data quality issues and supporting end-to-end resolution in collaboration with business data stewards, data engineering, and reporting teams. Key Responsibilities: Identify and troubleshoot data quality issues in Snowflake Support end-to-end resolution of data discrepancies Work closely with data stewards, engineering, and reporting teams Assist in building a BI Quality Assurance Dashboard (Power BI) Enable proactive monitoring of data integrity Support implementation of data governance and quality processes Required Skills (Must Have): Strong hands-on experience in ...
I need ongoing, hands-on help with DBT. As a beginner, my biggest hurdle is getting a clean project up and running and then writing reliable SQL for daily transformations that will run against Snowflake. Here’s what I’m hoping you can guide me through: • Writing and optimizing transformation SQL inside models, including incremental logic, tests, and documentation • Ad-hoc troubleshooting in my existing pipelines so I understand why something fails and how to fix it next time Most of the work will be pairing sessions—screen-sharing while we walk through real tickets from my day job—so strong teaching skills are as important as technical depth. If you’re confident with DBT and Snowflake and enjoy mentoring,...
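For anyone scoping this, the incremental logic mentioned above usually lives in a dbt model along these lines; a minimal sketch with placeholder source and column names, not the poster's actual project:

{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}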
I need a small, battle-tested Rust program that can pull structured data—full tables and selected columns—from Snowflake and push it straight into an Excel-compatible spreadsheet consumed by our proprietary tool. The transfer has to feel real-time: as soon as new rows appear in Snowflake, the sheet should refresh without manual triggers. Core workflow • Authenticate with Snowflake using a service account and keep the session alive efficiently. • Listen or poll for changes at low latency, then stream the fresh rows through the Rust layer. • Generate or update an .xlsx file (or another format you recommend that Excel opens natively) so the proprietary front end always shows the latest snapshot. • Keep memory footprint minimal and ...
...Looker and Snowflake, and I need a professional who can partner with me full-time during CST business hours. My chief objective is to optimise every step of our data workflow—from the moment raw data lands in Snowflake, through complex transformations in Alteryx, all the way to polished visual stories in Looker. Day to day you’ll be hands-on with SQL scripting, performance tuning, automated scheduling and any troubleshooting that keeps the pipeline healthy and fast. You’ll also advise on best-practice design decisions so we avoid technical debt as new sources and reporting requirements appear. Success for me looks like: • Faster, reliable end-to-end runs with visible performance gains • Clean, reusable Alteryx workflows that are version-con...
I will need a sample of masking, SHA-256, Salting, anonymizing in a Snowflake table. Snowflake table: Create or replace table EMPLOYEE ( ID VARCHAR(10) NOT NULL, FIRST_NAME VARCHAR(50) NOT NULL, LAST_NAME VARCHAR(50) NOT NULL, JOB_TITLE VARCHAR(100) NOT NULL ); ID = masking FIRST_NAME = anonymized LAST_NAME = anonymized Add your own sample data to the table. Provide the how-to - ID masking with a masking policy when inserting the data into the table - FIRST_NAME with anonymization and salting technique when inserting the data into the table - LAST_NAME with anonymization and salting technique when inserting the data into the table Salting technique: I have a sample of the salting function. If there is a better way to do this then please provide your suggestion. CREATE OR...
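Since the post's own salting function is truncated above, here is one possible shape of the how-to, a sketch only: a Snowflake masking policy on ID plus salted SHA-256 hashing at insert time. The role name, salt value, and sample rows are all placeholders:

-- Sample data inserted with real values for ID.
INSERT INTO EMPLOYEE VALUES
  ('E001', 'Jane', 'Doe',   'Analyst'),
  ('E002', 'John', 'Smith', 'Engineer');

-- Masking policy: only HR_ROLE (placeholder) sees the real ID at query time.
CREATE OR REPLACE MASKING POLICY MASK_ID AS (VAL VARCHAR) RETURNS VARCHAR ->
  CASE WHEN CURRENT_ROLE() = 'HR_ROLE' THEN VAL ELSE 'XXXXXXXXXX' END;

ALTER TABLE EMPLOYEE MODIFY COLUMN ID SET MASKING POLICY MASK_ID;

-- A SHA-256 hex digest is 64 characters, so the name columns must be widened
-- beyond the original VARCHAR(50) before the salted values will fit.
ALTER TABLE EMPLOYEE ALTER COLUMN FIRST_NAME SET DATA TYPE VARCHAR(64);
ALTER TABLE EMPLOYEE ALTER COLUMN LAST_NAME  SET DATA TYPE VARCHAR(64);

-- Salted anonymization at insert: SHA-256 of value plus a secret salt.
-- 'pepper123' is a placeholder; in practice the salt should be stored securely.
INSERT INTO EMPLOYEE (ID, FIRST_NAME, LAST_NAME, JOB_TITLE)
SELECT 'E003',
       SHA2('Alice'  || 'pepper123', 256),
       SHA2('Nguyen' || 'pepper123', 256),
       'Manager';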
I need an expert-level Alteryx developer who can devote full-time, Central Standard Time hours to my data analytics workload. The heart of the role is designing, building, and refining Alteryx workflows, then surfacing the results through Snowflake and Looker while writing clean, efficient SQL along the way. Your day-to-day will center on turning raw data into reliable, production-ready pipelines. That means troubleshooting existing jobs, automating data prep, and collaborating closely with stakeholders during CST business hours for stand-ups, ad-hoc questions, and quick iterations. Clear communication, screen-sharing, and concise documentation are all must-haves. Each week I’ll look for: • Working Alteryx workflows committed to our repo • Documentation that deta...
...You’ll also touch Looker dashboards and query Snowflake with SQL whenever insights need to flow into reports or ad-hoc investigations. The core of the role is hands-on data analysis: blending diverse data sources, identifying trends, and packaging findings in a way that lets stakeholders make decisions fast. Because this is an immediate need, I’d like you to be ready to start as soon as we align on expectations. Strong communication over Slack or Teams, quick turnarounds on questions, and the discipline to document each workflow or query you create will be key to success. Deliverables I expect: • Clean, well-documented Alteryx workflows delivering agreed-upon outputs • Timely exploratory analyses and insight summaries drawn from Snowflake data ...
Job Title: Senior Snowflake Governance Engineer Budget: ₹22K – ₹24K Duration: 2 hours/day (initially, may increase) Time: US Morning Hours (10:00 AM CT – 2:00 PM CT) / IST Late Night (approx. till 1:00 AM IST) Job Description: We are looking for a Senior Snowflake Governance Engineer with strong experience in implementing and managing Snowflake governance frameworks. The candidate will be responsible for managing data access, security policies, role-based access control, and governance best practices within the Snowflake environment. The role also involves working with infrastructure automation and supporting data governance initiatives. Tech Stack: Snowflake Governance (RBAC, Security Policies, Data Access Control) Terraform (Infrastruct...
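A minimal sketch of the RBAC layer such a role owns, with placeholder database, schema, and role names:

-- Read-only functional role for the analytics marts.
CREATE ROLE IF NOT EXISTS ANALYST_RO;
GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYST_RO;
GRANT USAGE ON SCHEMA ANALYTICS_DB.MARTS TO ROLE ANALYST_RO;
GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.MARTS TO ROLE ANALYST_RO;
GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS_DB.MARTS TO ROLE ANALYST_RO;

-- Grant the functional role to a business-facing role, never to users
-- directly, so access reviews stay manageable.
GRANT ROLE ANALYST_RO TO ROLE REPORTING_TEAM;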
...with extracting a small, publicly available sample data set (CSV, JSON, or a simple relational dump), then cleaning and transforming it—deduplicating records, resolving missing or inconsistent values, and normalising key fields where needed. Once the data is tidy, it has to be loaded twice: first into a staging target (a plain relational table or file storage) and then into a basic star- or snowflake-style data-warehouse schema so I can run simple analytical queries afterward. I will need the full project files, transformation jobs, and a concise write-up that walks through each step, explains the design decisions, and shows the final row counts before and after every major operation. Screenshots or log excerpts that prove the pipeline runs end-to-end are essential. A...
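One common shape for the deduplication step is a window function that keeps the newest record per natural key; a sketch with a hypothetical STG_CUSTOMERS staging table:

-- Keep the most recent row per customer_id; everything else is a duplicate.
CREATE TABLE CLEAN_CUSTOMERS AS
SELECT customer_id, name, email, loaded_at
FROM (
    SELECT s.*,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY loaded_at DESC) AS rn
    FROM STG_CUSTOMERS s
) t
WHERE t.rn = 1;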
...and optimize data models for analytics and reporting Work with cloud data platforms (AWS, Azure, or GCP) Implement data validation, quality checks, and monitoring Build dashboards and actionable insights using BI tools Collaborate on performance tuning and data architecture improvements - Required Skills Strong experience with SQL and Python Expertise in modern data stack tools (e.g., Snowflake, BigQuery, Redshift, Databricks) Hands-on experience with orchestration tools (Airflow or similar) Experience with dbt or data transformation frameworks Knowledge of cloud platforms (AWS/Azure/GCP) Strong analytical thinking and problem-solving skills - Nice to Have Experience with real-time data processing (Kafka, streaming frameworks) Experience with machine learning pipeli...
We are looking for experienced Data Analysts, Data Engineers, or BI Analysts to test and complete an onboarding flow for our new data platform. The goal is to generate a functional "context layer" using your own live data environment. The Task 1. Onboard: Connect your own Data Warehouse (Snowflake, BigQuery, Redshift, etc.). 2. Model: Go through a short modeling flow to create a context layer for your tables. 3. Stress-Test: Ask 30 natural language questions based on your data (e.g., "What was the YoY growth for [Product X] in Q3?"). 4. Validate: Provide feedback on whether the platform answered correctly. If it failed, you will identify why (e.g., wrong join, misunderstood metric, or schema ambiguity). Requirements (Strict) - Active DWH Access: You must have a...
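As a point of reference for step 4, the example YoY question maps to SQL roughly like the following; table and column names are invented here, and a wrong join or filter at this level is exactly the kind of failure the validation step should catch:

-- "What was the YoY growth for Product X in Q3?"
SELECT SUM(CASE WHEN YEAR(order_date) = 2024 THEN revenue END)
     / NULLIF(SUM(CASE WHEN YEAR(order_date) = 2023 THEN revenue END), 0) - 1
       AS yoy_growth_q3
FROM sales
WHERE product_name = 'Product X'
  AND QUARTER(order_date) = 3
  AND YEAR(order_date) IN (2023, 2024);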
I’m preparing for the Databricks Data Analyst and Data Engineer certifications and want a structured, hands-on tutoring program that also deepens my Snowflake skills. The goal is to become confident building end-to-end data pipelines, running analytics, and understanding platform architecture well enough to pass the exams and perform the work in practice. Focus areas Databricks • Data processing & analytics with PySpark/SQL and Delta Lake • Machine learning workflows inside the Databricks environment • Workspace, cluster, job, and Lakehouse architecture Snowflake • Core data-warehousing concepts and best practices • Query tuning and overall performance optimisation • Security features: RBAC, masking, encryption, and a...
This project centers on building and refining our Snowflake-based data warehouse so that the business has a single, trusted source of truth ready for downstream analytics. I have the environment up and running; what’s missing is an expert who can design it properly, populate it efficiently, and keep it secure. Key workstreams include: • Data modeling – create and maintain star/snowflake schemas that align with business processes, applying best practices for clustering and partitioning. • Data integration – design robust ELT pipelines (SQL, Snowpipe, Streams & Tasks or the tools you prefer) to ingest data from multiple operational systems and land it in a clean, query-ready state. • Data security management – set up roles, mask...
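A minimal Streams & Tasks sketch of the ingestion pattern the posting names, with placeholder warehouse and table names:

-- Capture new rows landing in the raw table.
CREATE OR REPLACE STREAM RAW_ORDERS_STREAM ON TABLE RAW.ORDERS;

-- Scheduled task that fires only when the stream actually has data.
CREATE OR REPLACE TASK LOAD_ORDERS_TASK
  WAREHOUSE = TRANSFORM_WH
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO CURATED.ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT, LOADED_AT)
  SELECT ORDER_ID, CUSTOMER_ID, AMOUNT, CURRENT_TIMESTAMP()
  FROM RAW_ORDERS_STREAM
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK LOAD_ORDERS_TASK RESUME;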
As a data operations team, we need an intelligent, automated way to cleanse, enrich, and deduplicate large datasets sourced from CSV files. Today, this work is manual, inconsistent, and slow. We want to build an AI‑powered agent—developed in Microsoft Copilot Studio or Microsoft Foundry—that orchestrates the entire ETL process, integrates with Snowflake for storage and transformation, and connects to Dun & Bradstreet (D&B) for enrichment and deduplication. This agent should not only process data but also understand it: identify quality issues, propose corrections, enrich missing details, and guide users through validation when needed. • Build a CSV ingestion pipeline with schema validation and error reporting. • Implement automated data profiling to de...
We are seeking a data analyst familiar with AI tools to integrate various data systems, including Snowflake, Looker, Hex, Argo, Airflow, and SQLMesh. The goal is to create a data model ready for natural language queries. The ideal candidate will have experience in data integration and modeling, with a strong understanding of AI tools and their applications in data analysis.
...one for high concurrency. Think table partitioning (range and list), well-chosen GIN and B-Tree indexes, plus JSONB columns where semi-structured flexibility makes sense. Triggers, constraints and stored procedures must enforce business logic consistently, so the application tier can stay lightweight. Analytics layer Alongside the OLTP schema I need a separate reporting layer—either star or snowflake—that supports fast OLAP queries. It should draw cleanly from the operational tables and be easy to extend for future data-engineering workloads. Migration considerations Only transactional records from the existing system move over; user and inventory data can be regenerated or imported later. I’ll handle the extract, but I’ll need guidance on staging ta...
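A PostgreSQL sketch of the partition-plus-index pattern described above; all names are placeholders:

-- Range-partitioned orders table with a JSONB column for flexible attributes.
CREATE TABLE orders (
    order_id    BIGINT GENERATED ALWAYS AS IDENTITY,
    customer_id BIGINT NOT NULL,
    created_at  TIMESTAMPTZ NOT NULL,
    attributes  JSONB,
    PRIMARY KEY (order_id, created_at)  -- partition key must be in the PK
) PARTITION BY RANGE (created_at);

CREATE TABLE orders_2024 PARTITION OF orders
    FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

-- B-Tree for the hot lookup path, GIN for ad-hoc JSONB containment queries.
CREATE INDEX idx_orders_customer ON orders (customer_id, created_at);
CREATE INDEX idx_orders_attrs ON orders USING GIN (attributes);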
...Infrastructure: Self-Hosted Node.js environments, REST APIs, External DBs (Supabase/Firebase/etc.) Frontend: React (Admin UI) Nice to Have Proven Wix Experience: Building Wix CLI projects. Using Wix SDK and Platform APIs (Stores, eCommerce, OAuth). Implementing Wix Service Plugins and SPIs. Proven Marketplace Apps Development Experience: E.g. Salesforce / Monday / Shopify / Webflow / Wix / AWS / Snowflake DevOps: Familiarity with Docker, CI/CD pipelines (GitHub Actions), and cloud hosting platforms (AWS, Vercel, Heroku, or similar). Reliability: Experience with logging, monitoring, and error tracking tools....
...(hashed IDs, tokenization, etc.) o Controlled analytics and outputs only (no raw data sharing) 2. Evaluate Build vs Buy o Compare building from scratch vs existing solutions o Pros/cons, cost, scalability, vendor lock-in o Examples of real-world implementations 3. Technology recommendations o Cloud or hybrid architecture o Tools for privacy, access control, encryption o Possible use of BigQuery, Snowflake, AWS Clean Rooms, or custom stack 4. Security & Governance o Data access rules and roles o Auditability and traceability o Compliance-oriented design (financial / credit data context) 5. Deliverables o High-level architecture diagram o Recommended tech stack o Decision framework: build vs buy o Optional: roadmap for implementation Nice to have (big plus) • Experience wi...
...existing analytics initiative that now needs a dedicated Redshift-based warehouse. The core objective is to design and implement a robust schema in Amazon Redshift, then ingest data coming from three different sources—our operational SQL databases, a set of RESTful APIs, and periodic flat-file drops in CSV or JSON. Here is what I’m aiming for: • A well-structured Redshift warehouse (star or snowflake schema, whichever is most appropriate) built to scale and documented clearly. • Reliable, automated ingestion pipelines for each source type. For SQL we currently use PostgreSQL and MySQL; for APIs the payloads are mostly JSON; the flat files live in S3. • Transformations that standardise data types, handle slowly changing dimensions, and enforce dat...
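A sketch of how the Redshift schema choices tend to look in DDL, with placeholder names; DISTSTYLE ALL suits small dimensions, while the fact table distributes on its main join key:

-- Small dimension: replicated to every node so joins stay local.
CREATE TABLE dim_customer (
    customer_key BIGINT IDENTITY(1,1),
    customer_id  VARCHAR(32) NOT NULL,
    name         VARCHAR(200),
    segment      VARCHAR(50)
) DISTSTYLE ALL;

-- Large fact: distributed on the join key, sorted on the common filter column.
CREATE TABLE fact_orders (
    order_id     BIGINT NOT NULL,
    customer_key BIGINT NOT NULL,
    order_date   DATE NOT NULL,
    amount       DECIMAL(12,2)
) DISTKEY (customer_key) SORTKEY (order_date);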
My interview is coming up quickly and I need to fast-track my understanding of how Generative AI, Snowflake, and Python work—individually and, more importantly, together in production-style workflows. I’m starting from a beginner level, so I’m looking for an instructor who can translate complex ideas into clear, practical takeaways. Here’s what will help me most: • Live demonstrations with explanations as we step through code, Snowflake queries, and model integrations in real time. • Space to interrupt with questions so I can connect the dots on the spot. • Concrete, industry-based examples—think end-to-end pipelines that load data into Snowflake, prep it with Python, and feed it to a GenAI model or prompt-engineering l...
...standing up a brand-new business-intelligence stack in Snowflake and need a reusable migration framework built with dbt and solid SQL. The framework has to pull from two kinds of sources—existing relational databases and the steady stream of CSV/Excel flat files our teams export—then land, cleanse, validate, and automate each load all the way into analytics-ready tables. Here’s what I’m after: • A dbt project structured for easy extension, complete with models, seeds, macros, and tests that enforce data-quality rules at every stage. • Parameterised SQL and Snowflake tasks/procedures that orchestrate fully automated loads, from raw landing zones through staging to final marts. • Built-in monitoring and reporting—leveraging ...
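For the flat-file sources, the landing step usually reduces to a stage plus COPY INTO; a sketch with placeholder stage, format, and table names:

-- Reusable CSV file format and internal stage for the raw landing zone.
CREATE OR REPLACE FILE FORMAT CSV_STD
  TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE RAW_STAGE FILE_FORMAT = CSV_STD;

-- Load newly arrived files; ON_ERROR = 'CONTINUE' surfaces bad rows in the
-- load history instead of failing the whole batch silently.
COPY INTO RAW.ORDERS
FROM @RAW_STAGE/orders/
FILE_FORMAT = (FORMAT_NAME = CSV_STD)
ON_ERROR = 'CONTINUE';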
...reliability and craftsmanship. The overall look should echo classic signage—think bold, slightly weathered lines and a touch of nostalgia—while still feeling crisp enough for today’s web and print use. Key elements to weave in: • My company’s initials (I’ll share them once we start). • A distinctive icon that clearly hints at heating and cooling—something along the lines of a flame, sun, or snowflake worked into the lettering. Color direction: lean into warm reds, oranges, and yellows. The palette needs to stay cohesive and legible against light and dark backgrounds, so feel free to propose subtle gradients or accent tones if it helps the design pop. Final deliverables I need: • Vector artwork (AI, EPS, and layered PDF) ...
I am rolling out a new environment that spans Terraform-managed infrastructure, automated Github Actions workflows, and integrations with Snowflake and Mulesoft. The area where I need the most hands-on help is AWS—specifically provisioning, securing, and optimising EC2 instances, S3 storage buckets, and a series of serverless Lambda functions that glue everything together. Here’s the flow I’m targeting: • Terraform drives all resource creation so our stacks remain fully reproducible. • Github Actions handles CI/CD, triggering Terraform plans/applies and Lambda deployments on every merge. • Data exchanged between Snowflake and our micro-services is exposed through Mulesoft APIs running behind AWS resources. What I need from you: a cle...
...in Data Engineering, AI/ML, and Advanced Analytics across industries such as SaaS, FinTech, Healthcare, and EdTech. We have also built 6 proprietary products, reflecting our strong product engineering DNA. We are ISO 27001 certified and SOC 2 Type II compliant, demonstrating our strong commitment to security and compliance. We are a trusted partner with AWS, Google Cloud Platform (GCP), Snowflake, and Databricks. Headquartered in Phoenix, we operate globally with offices in the USA (Chandler, AZ), Canada (Toronto), and India (Ahmedabad and Bangalore). Location: Ahmedabad | Remote Work Type: Remote (3 Months Contract) Primary Skills: QA Automation, SDET, API Testing, Azure Cloud, CI/CD, .NET Experience: 4+ Years Role Overview: We are looking for a QA / SDET who will be responsible
...probabilistic modeling, statistical inference, and experimentation frameworks (A/B testing, causal inference). Can collect, clean, and transform complex datasets into structured formats ready for modeling and analysis. Have experience designing and evaluating predictive models, using metrics like precision, recall, F1-score, and ROC-AUC. Are comfortable working with large-scale data systems (Snowflake, BigQuery, or similar). Are curious about AI agents, and how data can shape the reasoning, adaptability, and behavior of intelligent systems. Enjoy collaborating with cross-functional teams — from engineers to research scientists — to define meaningful KPIs and experiment setups. This listing is only for people residing in India. Primary Goal of This Role To desig...
...values, o Collect relevant dimension values, o Collect SimKey(s) derived from the current selection, o Read required Qlik variables (see section 6). 2. Build a JSON payload (batch). 3. Send the payload via HTTPS POST to a custom backend endpoint: / 4. Handle backend response: o Success o Validation errors o Backend errors ❌ No file-based writeback (CSV, QVX, XLSX) ❌ No Snowflake-managed backend ❌ No reload-trigger-only logic 6. Variables & Recalculation Control Before or during the writeback action, the extension must also be able to: • Set or update Qlik variables (e.g. recalculation flags, mode selectors), • These variables will determine: o How the backend recalculates metrics, o Which logic path is app...
I’m almost ready to sit the Snowflake certification but still lack high-quality practice exams that truly reflect the test. I need a set of original, full-length exams—timed and weighted like the real thing—with clear answer keys and explanations so I can understand not just what’s right but why. Scope • Three complete mock exams at minimum, each covering the three focus areas I selected: data warehousing concepts, SQL queries, and security & data protection. • Balanced question distribution and difficulty that mirrors Snowflake’s blueprint. • Detailed rationales for every answer, including references to official documentation where relevant. • Delivery in a format I can take repeatedly (PDF, Google Form, or a lightweigh...
...ownership of the full MS SQL-to-Power BI pipeline. The end goal is clear: deliver rock-solid data analytics that put timely, actionable Sales insights in front of our stakeholders. You’ll start by refining and expanding our MS SQL Server environment—creating or tuning tables, views, stored procedures, and functions so queries run quickly even on very large fact tables. From there, design a star or snowflake model that makes sense for reporting, then build the ETL logic to keep it populated and up to date. Our source of truth is MS SQL Server, and the semantic layer will live in Power BI; if you’re comfortable incorporating other sources such as flat files or APIs later, that flexibility will be a plus. Once the data model is solid, craft interactive Powe...
...into trends quickly. My data is sales-focused and can be quite complex—multiple tables, different time windows, and a mix of transactional and summary files. While revenue, units sold, and customer acquisition matter, I’m ultimately looking for a flexible build that can accommodate additional metrics without starting from scratch. Key expectations • Connect to the data sources I provide (Snowflake and SQL). • Build reusable data models so updates flow straight through. • Design sleek, filter-driven visuals with clear hierarchy and branded colour palette. • Optimise performance so the workbook remains fast even as the dataset grows. • Package and hand over a .twbx plus a short Loom walkthrough so my team can maintain it. If you...
...design exercise If your portfolio is theme-based, Canva-driven, or animation-heavy, do not apply. What This Project Requires Calm, authoritative, enterprise-first UX/UI Strong typography, spacing, and grid discipline Ability to present complex systems and decision flows clearly Mature design restraint (no visual noise) Clean, fast, SEO-ready development Reference standard: McKinsey / Palantir / Snowflake (enterprise sections) Scope (Phase 1) Home Products (multiple offerings) Solutions / Use Cases About Contact Stack: Webflow / / Custom (recommend and justify) Who Should Apply 5+ years in enterprise / B2B / SaaS Can challenge poor ideas, not just execute instructions Comfortable with strategy-heavy, founder-led briefs Portfolio shows restraint, clarity, and authority Mandatory A...
Job Summary: We’re looking for a Staff-level Salesforce Architect with deep hands-on expertise across core clouds, strong Data Cloud experience, and the ability to translate business outcomes into scalable multi-cloud architectu...customer leadership, engineering teams, and business stakeholders ● Demonstrated leadership experience guiding teams and driving large-scale architectural initiatives ● Certifications such as Salesforce CTA, Application Architect, System Architect, Marketing Cloud Architect, or Data Cloud Consultant ● Experience with Tableau, MuleSoft, or advanced Marketing Cloud features is a plus ● Experience with Snowflake, reverse ETL tools, or enterprise data lakes is a plus ● Knowledge of enterprise data governance, metadata management, and cataloging is good t...
...language queries against Snowflake tables to surface relevant business outcomes (e.g., sales trends, inventory levels, customer loyalty metrics). Train and fine-tune the chatbot model, iterating when errors occur to improve accuracy and reliability. Business Expertise & Data Correlation Understand retail business processes and leadership priorities (sales dashboards, merchandising KPIs, supply chain metrics). Correlate leadership questions with Snowflake data structures, ensuring outputs align with business expectations. Articulate expected business outcomes from data queries, bridging the gap between technical solutions and executive decision-making. Technical Solution Development Write and optimize SQL queries to extract, transform, and deliver insights from Snowf...