DataOps Engineer

Work set-up: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Over 3 years of experience in DataOps, DevOps, or Data Engineering roles.
  • Proficiency in scripting languages like Python and Bash.
  • Strong knowledge of orchestration tools such as Apache Airflow, Prefect, or Dagster.

Key responsibilities:

  • Design and manage CI/CD pipelines for data applications and models.
  • Develop and maintain infrastructure-as-code for data platform components.
  • Automate data quality checks, validation, and monitoring processes.
  • Collaborate with data teams to optimize data ingestion and transformation pipelines.

Intellectsoft (Computer Software / SaaS, SME, 51-200 employees)
https://www.intellectsoft.net/

Job description

Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights—powering decisions across marketing, operations, gaming, and more.

You’ll work closely with Data Architects, Data Engineers, Business Analysts, and DevOps Engineers to design and implement scalable data solutions.

Requirements

  • Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in DataOps, DevOps, or Data Engineering roles.
  • Proficiency in scripting languages (Python, Bash, etc.).
  • Strong experience with orchestration tools (e.g., Apache Airflow, Prefect, or Dagster).
  • Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and cloud-native data tools.
  • Familiarity with CI/CD tools (e.g., GitLab CI, Jenkins, CircleCI).
  • Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
  • Experience with infrastructure-as-code tools (Terraform, CloudFormation); see the sketch after this list.
  • Strong understanding of data privacy, security, and compliance practices.
  • Experience with modern data warehouses (e.g., Snowflake, Redshift, Yellowbrick) and ETL/ELT tools.
  • Understanding of data governance, metadata management, and data cataloging tools.
  • Experience collaborating in Agile/Scrum teams and working with version-controlled data models (e.g., via Git).
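
Terraform and CloudFormation are declarative tools with their own formats, so the infrastructure-as-code requirement above is illustrated here with Pulumi, a Python-native IaC library, purely to keep the example in the posting's own scripting language. This is a minimal sketch of what "data platform components as code" can look like; the bucket and its settings are hypothetical.

    """Minimal IaC sketch (hypothetical): a versioned landing bucket for raw data."""
    import pulumi
    import pulumi_aws as aws

    # A hypothetical landing bucket for raw ingested files. Declaring storage
    # as code makes the data platform reproducible and reviewable in Git.
    raw_bucket = aws.s3.Bucket(
        "raw-data-landing",
        versioning=aws.s3.BucketVersioningArgs(enabled=True),  # keep a history of loads
        force_destroy=False,  # guard production data against accidental teardown
    )

    # Export the bucket name so pipelines (e.g., Airflow DAGs) can look it up.
    pulumi.export("raw_bucket_name", raw_bucket.id)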

Nice-to-have skills

  • Experience with real-time data processing (e.g., Kafka, Spark Streaming).
  • Familiarity with data observability platforms (e.g., Monte Carlo, Datadog, Great Expectations).
  • Experience working in regulated industries (e.g., gaming, finance, hospitality).

Responsibilities

  • Design, build, and manage CI/CD pipelines for data applications, models, and data workflows.
  • Develop and maintain infrastructure-as-code (IaC) for data platform components.
  • Automate data quality checks, validation, and monitoring processes.
  • Collaborate with data engineers and analysts to optimize data ingestion and transformation pipelines.
  • Implement robust logging, alerting, and observability tools for data pipelines.
  • Manage orchestration frameworks (e.g., Airflow) and ensure timely execution of workflows; see the sketch after this list.
  • Maintain compliance with data governance, privacy, and security policies.
  • Support and troubleshoot production data issues and infrastructure outages.
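
To give a concrete flavor of the orchestration and data-quality work above, here is a minimal sketch of a daily Airflow DAG with an automated quality gate. It assumes Airflow 2.4+ with the TaskFlow API plus pandas; the pipeline name, file path, and checks are hypothetical, not part of the actual platform.

    """Hypothetical daily DAG: extract, gate on data quality, then load."""
    from datetime import datetime

    import pandas as pd
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_sales_pipeline():
        @task
        def extract() -> str:
            # Stand-in extract step: real code would pull from an API or object
            # store and stage the raw file, returning its path.
            path = "/tmp/sales_raw.csv"
            pd.DataFrame({"order_id": [1, 2], "amount": [10.0, 20.0]}).to_csv(path, index=False)
            return path

        @task
        def validate(path: str) -> str:
            # Automated quality gate: fail the run loudly rather than load bad
            # data downstream; Airflow surfaces the failure for alerting.
            df = pd.read_csv(path)
            if df["order_id"].duplicated().any():
                raise ValueError("duplicate order_id values in extract")
            if (df["amount"] < 0).any():
                raise ValueError("negative amounts in extract")
            return path

        @task
        def load(path: str) -> None:
            # Stand-in load step: real code would COPY the file into the warehouse.
            print(f"loading {path} into the warehouse")

        load(validate(extract()))

    daily_sales_pipeline()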

Benefits

  • 35 absence days per year for work-life balance
  • Udemy courses of your choice
  • English courses with a native speaker
  • Regular soft-skills training
  • Excellence Centers meetups
  • Online and offline team-building events
  • Business trips

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Hospitality
  • Collaboration
  • Problem Solving
