
Senior Azure Fabric Data Engineer

Roles & Responsibilities

  • Design and build high-performance, cloud-native ETL/ELT pipelines using Python and SQL on Databricks/Spark with Delta Lake and Parquet
  • Lead data engineering projects focusing on data architecture, feature engineering, data migrations, and governance
  • Collaborate with machine learning teams to integrate AI-driven data workflows and GenAI tools that improve data quality and developer productivity
  • Architect and implement data pipelines leveraging Microsoft Fabric, Azure Databricks, Delta Lake, Airflow, Azure Synapse, and Azure Data Factory, ensuring governance, security, and adherence to best practices

Requirements:

  • 7+ years of data engineering experience with Microsoft Fabric, Databricks, Spark, Delta Lake, PySpark, Python, and SQL
  • Strong experience in ETL/ELT design, data architecture, data migrations, pipeline optimization, and orchestration (Airflow, ADF)
  • Hands-on expertise with Azure data services, including Microsoft Fabric, OneLake, Azure Databricks, Azure Synapse, and the Parquet file format
  • Experience integrating GenAI/LLMs and AI-driven data workflows to enhance data quality and developer productivity

Job description

This role is for one of Weekday's clients.

Min Experience: 7 years

Location: Remote (India)

Job Type: Full-time

This is a full-time remote position for a Senior Azure Fabric / Databricks Engineer with expertise in Microsoft Fabric.

The candidate may work remotely from anywhere in India.

We are seeking a Senior Azure Fabric Data Engineer with experience supporting ETL/ELT development and skill in using Microsoft Fabric's data engineering capabilities to design and lead scalable, cloud-native data platforms.

The ideal candidate should have hands-on experience with Microsoft Fabric, Azure Databricks, OneLake, Spark, PySpark, Delta Lake, Azure Synapse, Azure Data Factory (ADF), Power BI, Python, and SQL, along with a strong grounding in data architecture, pipeline optimization, data migrations, ETL/ELT strategy, and orchestration. You will be responsible for building high-performance pipelines using Python and SQL and collaborating with machine learning teams. This role emphasizes architecture, governance, and best practices, and requires hands-on data engineering experience.

In this position, you will design and construct high-performance data engineering solutions, create robust ETL/ELT pipelines, and lead projects focusing on data architecture, feature engineering, and AI-driven data workflows. Your work will extensively involve Databricks, Spark, Delta Lake, Airflow, and Python, while integrating modern GenAI tools to enhance developer productivity and data quality.

Requirements

Key Skills:

Microsoft Fabric, Databricks, Spark, PySpark, Delta Lake, Parquet file format, data architecture, ETL/ELT, pipeline optimization, data migrations, Python, SQL, Azure, Azure Data Factory (ADF), Azure Synapse, OneLake, LLMs/NLP.

Experience: 7+ years | Preferred degree in Computer Science/Information Technology

Skills

  • Microsoft Fabric
  • ETL/ELT
  • Data Engineering
  • Data Pipelines
  • Azure Data Lake
  • Data Management
  • Data Architecture
  • Azure Data Factory (ADF)
  • OneLake
  • Azure Databricks
  • PySpark
