
Senior Data Engineer, India


Job description

About the Role 
We're looking for a Senior Data Engineer to design, build, and operate the reliable, scalable data pipelines powering analytics, AI/ML, and operational workloads.  
This is engineering that matters: production-grade ETL with real observability, rigorous testing discipline, and architectural decisions that scale. You'll work across the full stack of modern data systems, applying strong design principles to build pipelines that don't just run, but run reliably at scale.
If you're passionate about data engineering done right, where monitoring isn't an afterthought, tests are non-negotiable, and system design is foundational, this is the opportunity for you! 
Location: Remote in India
What you'll be doing
  • Design, build, and maintain scalable ETL/ELT pipelines across batch and streaming workloads.  
  • Implement and operate pipelines following a tiered data model (e.g., Bronze/Silver/Gold) to ensure clear data contracts, quality boundaries, and reusability.  
  • Build pipelines that are observable by default, with strong metrics, logging, tracing, and alerting.  
  • Implement data quality checks, validations, and automated tests at each data tier to ensure correctness, freshness, and reliability.  
  • Apply strong system design principles to build fault-tolerant, scalable, and maintainable data systems.  
  • Optimise pipeline performance, cost, and reliability through profiling, monitoring, and tuning.  
  • Collaborate with platform, analytics, and ML teams to design well-modelled datasets for downstream consumers.  
  • Participate in architecture and design reviews, contributing to data modelling, ingestion, and observability standards.  
  • Troubleshoot production issues across pipelines and storage layers using logs, metrics, and traces.  
  • Ensure data pipelines comply with security, governance, and compliance requirements. 
  • Other duties as needed.  

About you

Must have:

  • Strong experience building ETL/ELT pipelines for large-scale data platforms.  
  • Good understanding of tiered data architectures (e.g., Bronze/Silver/Gold, medallion model) and how to apply them in production.  
  • Hands-on experience with pipeline observability (metrics, logs, alerts, SLAs/SLOs).  
  • Solid understanding of distributed systems and system design fundamentals.  
  • Experience testing data pipelines, including data quality checks, regression testing, and failure scenarios.  
  • Proficiency in one or more programming languages (e.g., Python, Java, Scala).  
  • Experience with cloud platforms (AWS, Azure, or GCP).  
  • Strong problem-solving and production debugging skills.  

Not necessary but highly regarded: 

  • Experience with streaming platforms (Kafka, Pulsar, Kinesis).  
  • Familiarity with data lakes, lakehouse architectures, or OLAP systems.  
  • Experience with CI/CD for data pipelines and infrastructure-as-code.  
  • Exposure to regulated or high-availability environments. 
About Us  
 
NinjaOne unifies IT to simplify work for more than 35,000 customers in 140+ countries.    
 
The NinjaOne Unified IT Operations Platform delivers endpoint management, autonomous patching, backup, and remote access in a single console to improve efficiency, increase resilience, and reduce spend. By automating IT and managing all endpoints, organizations give employees a great technology experience at work.    
 
NinjaOne is obsessed with customer success and has retained a 98% customer satisfaction score for more than 5 years.    
  
What You’ll Love  
  • Competitive compensation
  • Pension scheme
  • Employee's Provident Fund
  • Private healthcare
  • Paid maternity and paternity leave
  • 12 days of paid sick leave
  • 18 days of annual leave
  • India public holidays based on your location
  • Other leave benefits, such as wedding leave
Additional Information 
This position is NOT eligible for visa sponsorship.
  
All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, age, disability, genetic information, marital status, veteran status, or any other status protected by applicable law. We are committed to providing an inclusive and diverse work environment. 
  
