DataOps Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Strong experience with AWS cloud services, particularly EKS and EMR.
  • Proficient in Apache Spark and distributed data processing.
  • Hands-on experience with infrastructure-as-code tools like Terraform or CloudFormation.
  • Familiarity with CI/CD pipelines and monitoring systems.

Key responsibilities:

  • Build and maintain Spark jobs for data cleansing and transformation.
  • Manage infrastructure as code and deploy using CI/CD best practices.
  • Set up monitoring and alerting systems for data infrastructure.
  • Handle access and security management in Snowflake.

Lumenalta (formerly Clevertech) | SME | https://lumenalta.com/ | 501-1000 Employees

Job description

About the Role

We're looking for a Data DevOps Engineer with deep expertise across both data engineering and DevOps disciplines to join our growing team. This role is critical to our mission of building scalable, secure, and high-performing data infrastructure on AWS. You’ll work with cutting-edge tools and a modern cloud stack, ensuring data flows seamlessly from raw ingestion to business-ready insights.


What You’ll Do

Data Transformation & Delivery

  • Build and maintain Spark jobs running on EKS/EMR to cleanse and transform raw data into current, consumable layers
  • Leverage DBT for data modeling and transformation pipelines (DBT experience is a plus, not a must)
  • Manage and optimize consumption layers using Trino
  • Implement data quality checks and performance tuning across pipelines


DevOps & DataOps

  • Manage infrastructure as code and deploy using CI/CD best practices
  • Set up and maintain monitoring and alerting (CloudWatch, etc.)
  • Secure infrastructure using AWS Secrets Manager and Cognito
  • Handle access and security management in Snowflake
  • Maintain and enhance scheduling and orchestration using Airflow


What You Bring

  • Strong experience with AWS cloud services (especially EKS, EMR, CloudWatch, Secrets Manager, Cognito)
  • Proficient in Apache Spark and distributed data processing
  • Experience with Trino (or Presto) and building efficient consumption layers
  • Familiarity with DBT, or an interest in learning it
  • Hands-on with infrastructure-as-code (e.g., Terraform or CloudFormation)
  • Deep understanding of CI/CD pipelines, monitoring, and alerting systems
  • Proven experience managing secure and compliant data systems
  • Proficiency with Airflow or similar workflow orchestration tools
  • Strong ownership mindset and ability to operate across the stack


Bonus Points

  • Experience with Snowflake security/access controls
  • DBT experience
  • Previous experience in high-scale, data-intensive environments


Why Join Us?

  • Work with a modern data stack and a team that values full ownership and cross-functional skillsets
  • Remote-first team with flexible work culture
  • Opportunity to shape and scale critical infrastructure from the ground up


Required profile

Spoken language(s):
English
