Weekday (YC W21)

Sr ETL Engineer

Qualifications

  • 5+ years of experience in data engineering or data architecture with hands-on ETL/ELT development
  • Proficiency with Databricks, Apache Airflow, Spark, Delta Lake, Python, and SQL; experience in cloud environments (Azure, AWS)
  • Degree in Computer Science or Information Technology (preferred)
  • Experience collaborating with ML teams on data workflows involving LLMs/NLP and implementing governance and best practices

Responsibilities

  • Design and lead the development of scalable, cloud-native data platforms using Databricks, Spark, Delta Lake, and Apache Airflow
  • Architect and implement high-performance ETL/ELT pipelines and lead projects related to data architecture, feature engineering, and AI-driven data workflows
  • Collaborate with ML teams on LLM/BERT-based data workflows and leverage GenAI tools to improve developer efficiency and data quality
  • Focus on architecture, governance, and best practices while applying hands-on data engineering to deliver robust solutions

Job description

This role is for one of Weekday's clients.

Salary range: INR 15,00,000 - 30,00,000 per annum (i.e., 15-30 LPA)

Min Experience: 7 years

Location: Remote (India)

Job Type: Full-time

Our solutions empower enterprises adopting AI at scale with proactive compliance and sustainable automation. The company is also connected to innovative startups such as Amelia.ai.

This is a fully remote role based in India, allowing you to work from anywhere within the country.


We are seeking a Senior Data Architect to design and lead the development of scalable, cloud-native data platforms built on Databricks, Apache Airflow, Spark, and Delta Lake. The ideal candidate will have expertise in data architecture, ETL/ELT strategy, and orchestration; will build high-performance pipelines in Python and SQL; and will collaborate with ML teams on LLM/BERT-based data workflows. The role emphasizes architecture, governance, and best practices, combined with hands-on data engineering.

In this position, you will architect and implement high-performance data engineering solutions, build robust ETL/ELT pipelines, and lead projects related to data architecture, feature engineering, and AI-driven data workflows. You will extensively work with Databricks, Spark, Delta Lake, Airflow, and Python, while leveraging modern GenAI tools to enhance developer efficiency and improve data quality.
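To give a flavor of the extract-transform-load pattern at the heart of this role, here is a toy sketch in plain Python. It is illustrative only: the standard-library `sqlite3` module stands in for a warehouse table, and all field names (`user_id`, `country`, `events`) are invented for the example; real pipelines here would use Spark, Delta Lake, and Airflow.

```python
import sqlite3

def extract(rows):
    """Extract: pretend these rows arrived from an upstream source; drop records missing a key."""
    return [r for r in rows if r.get("user_id") is not None]

def transform(rows):
    """Transform: normalize fields and derive a simple feature column."""
    return [
        {"user_id": r["user_id"],
         "country": r.get("country", "unknown").lower(),
         "is_active": int(r.get("events", 0) > 0)}
        for r in rows
    ]

def load(rows, conn):
    """Load: upsert into a table (sqlite3 stands in for a Delta Lake table)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users "
        "(user_id INTEGER PRIMARY KEY, country TEXT, is_active INTEGER)")
    conn.executemany(
        "INSERT OR REPLACE INTO users VALUES (:user_id, :country, :is_active)",
        rows)
    conn.commit()

raw = [{"user_id": 1, "country": "IN", "events": 5},
       {"user_id": None, "country": "US"},   # dropped in extract (no key)
       {"user_id": 2, "events": 0}]          # no country, inactive
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute(
    "SELECT user_id, country, is_active FROM users ORDER BY user_id").fetchall())
# -> [(1, 'in', 1), (2, 'unknown', 0)]
```

In a production stack the same three stages would map to Spark reads, DataFrame transformations, and Delta Lake merges, orchestrated as Airflow tasks.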

Requirements

Key Skills:

Databricks • Apache Airflow • Spark • Delta Lake • Data Architecture • ETL/ELT • Python • SQL • Azure • LLMs/NLP

Experience: Minimum 5 years | Degree in CS/IT preferred

Skills

Databricks • AWS Databricks • Azure Databricks • Databricks Lakehouse Platform
