
Sr. Data Engineer - Azure, Snowflake, Python, SaaS


Who We Are

Streamline is a fast-growing consultancy specializing in Enterprise Mobility, Product Engineering, and IT Transformation. We're building something special: a team of top-tier strategists, engineers, and designers who thrive on solving hard problems for enterprise clients. If you want to be part of a company where your contributions are visible from day one, keep reading.

Role Summary

The Senior Data Engineer designs, builds, and optimizes data pipelines that move, transform, and load data into Snowflake using Azure services and serverless components. The role focuses on production-grade engineering: automating data quality, improving reliability, and continuously optimizing cloud infrastructure costs.

Role Responsibilities

  • Design, develop, and deploy Azure Functions and broader Azure data services to extract, transform, and load data into Snowflake data models and marts.
  • Implement automated data quality checks, monitoring, and alerting to ensure accuracy, completeness, and timeliness across all pipelines.
  • Optimize workloads to reduce cloud hosting costs, including right-sizing compute, tuning queries, and leveraging efficient storage and caching patterns.
  • Build and maintain ELT/ETL workflows and orchestration to integrate multiple internal and external data sources at scale.
  • Design data pipelines that support both near real-time streaming data ingestion and scheduled batch processing to meet diverse business requirements.
  • Collaborate with engineering and product teams to translate requirements into robust, secure, and highly available data solutions.
Qualifications & Skills

  • Strong expertise with Azure data stack (e.g., Azure Functions, Azure Data Factory, Event/Service Bus, storage) and Snowflake for analytical workloads.
  • Proven experience designing and operating production data pipelines, including CI/CD, observability, and incident response for data systems.
  • Advanced SQL and performance tuning skills, with experience optimizing transformations and Snowflake queries for cost and speed.
  • Solid programming experience in Python or similar for building reusable ETL components, libraries, and automation.
  • Experience with streaming and batch ingestion patterns (e.g., Kafka, Spark, Databricks) feeding Snowflake.
  • Familiarity with BI and analytics tools (e.g., Power BI, Grafana) consuming Snowflake data models.
  • Background in DevOps practices, including containerization, CI/CD pipelines, and infrastructure-as-code for data platforms.
  • Experience with modern data transformation tools (e.g., dbt) and data observability platforms for monitoring data quality, lineage, and pipeline health.
Additional Requirements

  • Ability to adapt to a fast-paced and dynamic work environment.
  • Self-motivated and able to work independently with minimal supervision, taking initiative to drive projects forward.
  • Expert-level problem-solving skills with the ability to diagnose complex data pipeline issues and architect innovative solutions.
  • Proven ability to integrate and analyze disparate datasets from multiple sources to deliver high-value insights and drive business impact.
  • Strong attention to detail.
  • Proven ability to manage multiple priorities and deadlines.
  • Passionate about staying current with emerging data engineering technologies and best practices, and about applying them to strengthen product capabilities.
  • Experience developing and architecting SaaS platforms with a focus on scalability, multi-tenancy, and cloud-native design patterns.

What We Offer

  • A ground-floor opportunity to shape engineering culture and make an outsized impact at a growing company.
  • Direct access to leadership and accelerated career growth - your work will be seen, and it will matter.
  • A small, collaborative team where your voice matters and great ideas get implemented fast.
  • Competitive compensation and benefits with room to grow as we grow.
