DevOps Engineer

Work set-up: Full Remote
Contract: Full-time
Experience: Mid to Senior (5–7 years)
Work from: Faridabad / Remote (WFH)

Offer summary

Qualifications:

  • 5–7 years of experience in DevOps or cloud infrastructure roles.
  • Hands-on experience with Apache Airflow in production environments.
  • Proficiency in cloud platforms such as AWS, GCP, or Azure.
  • Strong scripting skills in Python and Bash, with knowledge of Infrastructure as Code tools like Terraform.

Key responsibilities:

  • Design and implement Airflow-based data workflows for orchestration and monitoring.
  • Manage Airflow DAGs across multiple cloud environments, ensuring reliability.
  • Deploy and scale Airflow on Docker, GKE, EKS, AKS, or VM-based cloud instances.
  • Automate infrastructure provisioning and design scalable, secure cloud architectures.

BDX - Builders Digital Experience, LLC · Real Estate Management & Development · SME · https://thebdx.com/
51 - 200 Employees

Job description

DevOps Engineer – Airflow & Cloud Implementation Specialist (5–7 Years Experience)

Location: Faridabad / Remote (WFH)
Job Type: Full-time
Experience Level: Mid to Senior

At Zonda Home, we are redefining data-driven customer experiences through automation, scalability, and cloud-native architecture. We are seeking a skilled DevOps Engineer with a deep focus on Apache Airflow orchestration and cloud infrastructure implementation. In this role, you will design, deploy, and optimize data pipelines and CI/CD processes across cloud environments, supporting scalable data integration and high-performing systems.

You’ll play a key role in bridging DevOps and Data Engineering, enabling robust data orchestration, reliable infrastructure automation, and cloud-first delivery models.

Key Responsibilities

  • Design and implement Airflow-based ETL/ELT workflows for data pipeline orchestration, scheduling, and monitoring (a minimal DAG sketch follows this list).
  • Manage Airflow DAGs across multiple environments, ensuring modularity, reliability, and maintainability.
  • Deploy and scale Airflow on Docker (GKE/EKS/AKS) or VM-based cloud instances.
  • Automate infrastructure provisioning using Terraform or Cloud Deployment Manager.
  • Design and manage scalable, secure architectures on GCP, AWS, or Azure (preferred: AWS).
  • Optimize data integration between cloud-native storage (e.g., GCS, S3) and cloud data warehouses such as BigQuery, Snowflake, or Redshift.
  • Implement CI/CD pipelines using tools like GitHub Actions, GitLab CI/CD, or Jenkins, including DAG testing and deployments.
  • Ensure observability using tools like Prometheus, Grafana, and Stackdriver, and integrate alerts into collaboration platforms (e.g., Slack, Teams).
  • Automate and manage secrets, configurations, and security policies via Vault, KMS, or Secrets Manager (a retrieval sketch also follows this list).
  • Enable cost-efficient cloud usage through performance tuning, autoscaling, and budget monitoring.
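
For illustration, here is a minimal sketch of the kind of DAG this role would own, assuming Airflow 2.4+; the DAG ID, task, and the "slack_webhook_url" Airflow Variable are hypothetical placeholders, not part of this posting:

    from datetime import datetime

    import requests
    from airflow import DAG
    from airflow.models import Variable
    from airflow.operators.python import PythonOperator


    def notify_slack(context):
        # Failure callback: post the failing DAG's ID to a Slack incoming webhook.
        # "slack_webhook_url" is a hypothetical Airflow Variable, not from the posting.
        webhook = Variable.get("slack_webhook_url")
        requests.post(webhook, json={"text": f"Airflow DAG failed: {context['dag'].dag_id}"})


    def extract():
        # Placeholder extract step; a real task might pull files from S3 or GCS.
        print("extracting...")


    with DAG(
        dag_id="example_etl",              # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # daily batch run
        catchup=False,
        on_failure_callback=notify_slack,  # alerting hook per the observability bullet
    ):
        PythonOperator(task_id="extract", python_callable=extract)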
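
And a sketch of runtime secret retrieval, assuming AWS Secrets Manager via boto3 (the secret name is a hypothetical placeholder):

    import json

    import boto3


    def get_db_credentials(secret_id: str = "prod/airflow/db") -> dict:
        # Fetch a JSON secret at runtime so credentials never live in DAG code.
        client = boto3.client("secretsmanager")
        response = client.get_secret_value(SecretId=secret_id)
        return json.loads(response["SecretString"])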

Required Skills & Experience

  • 5–7 years of experience in DevOps or cloud infrastructure roles.
  • 5+ years of hands-on experience with Apache Airflow (authoring, deploying, and managing DAGs in production).
  • Proficiency in cloud platforms (AWS preferred; GCP/Azure also acceptable).
  • Strong scripting skills in Python and Bash; experience with Airflow custom operators is a plus.
  • Solid experience with Infrastructure as Code (IaC) tools like Terraform.
  • Strong understanding of data engineering principles, batch scheduling, and data reliability.
  • Hands-on with Docker and Kubernetes for orchestrating microservices and Airflow deployments.
  • Knowledge of CI/CD pipelines, GitOps, and automated release processes (a DAG-testing sketch follows this list).
  • Experience integrating Airflow with BigQuery, Cloud SQL, S3/GCS, APIs, and messaging systems like Pub/Sub or Kafka.
  • Familiarity with monitoring and alerting best practices.
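
As a concrete example of the DAG-testing step, here is a minimal import-integrity check that could run under pytest in any of the CI tools above; the dags/ folder path is an assumption about repository layout:

    from airflow.models import DagBag


    def test_dags_import_cleanly():
        # DagBag parses every file under dags/; import_errors collects any
        # DAG that fails to load (syntax errors, missing imports, cycles).
        dag_bag = DagBag(dag_folder="dags/", include_examples=False)
        assert dag_bag.import_errors == {}, f"Broken DAGs: {dag_bag.import_errors}"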

🎓 Preferred Qualifications

  • Bachelor’s degree in Computer Science, Data Engineering, or a related technical field.
  • Certifications in GCP, AWS, or DevOps tools (e.g., GCP Professional DevOps Engineer, CKA).
  • Exposure to AI and ML pipeline orchestration, data quality frameworks, or metadata management tools (e.g., Great Expectations, OpenLineage) is a bonus.

Why Join Zonda Home

  • Own and lead the Airflow and cloud architecture strategy in a high-impact role.
  • Work with a forward-thinking team using modern DevOps, data, and cloud-native tooling.
  • Be part of a growing engineering culture that values automation, transparency, and innovation.
  • Enjoy flexible remote work, autonomy, and the opportunity to influence enterprise-scale delivery.

 

Required profile

Experience

Level of experience: Mid to Senior (5–7 years)
Industry: Real Estate Management & Development
Spoken language(s): English

Other Skills

  • Collaboration
  • Problem Solving
