Senior DataOps Engineer

extra holidays - extra parental leave - work from anywhere - fully flexible
Work set-up: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from:

Offer summary

Qualifications:

  • At least 4 years of experience in Data Engineering and cloud platforms.
  • Proficiency in Python, Scala, SQL, and Spark for building data pipelines.
  • Expertise in designing distributed systems and data integration tools like Fivetran, DBT, and Segment.
  • Strong understanding of cloud infrastructure (AWS/Azure) and automation with Terraform.

Key responsibilities:

  • Build, scale, and optimize the data platform using tools like Databricks and Spark.
  • Design and maintain robust data pipelines with Airflow, Fivetran, and DBT.
  • Develop infrastructure and automation to support MLOps practices, including model deployment and monitoring.
  • Collaborate with cross-functional teams to ensure data security, efficiency, and continuous improvement.

MURAL Internet SME https://www.mural.co
501 - 1000 Employees

Job description

ABOUT THE TEAM

The Data Engineering team at Mural builds scalable, high-performance systems that transform complex data into actionable insights. We power internal analytics and user-facing features, supporting critical products powered by AI/Machine Learning, such as Customer Insights, Personalization, Reporting APIs, and Audit Logs. Our work integrates diverse data sources, optimizes core data models, and ensures seamless access across the organization. Success is measured not only by system reliability and scalability, but also by the business impact we deliver—enabling users to make informed decisions and driving overall customer success. 

YOUR MISSION

As a Senior Data Engineer, you will be instrumental in shaping the future of our data platform. You will work with a global, remote-first team to build, scale, and optimize our data infrastructure using cutting-edge tools like Databricks, Spark, Terraform, and cloud platforms (AWS/Azure). You will collaborate closely with cross-functional teams to ensure the platform is secure, efficient, and continuously evolving. Your work will empower teams to access the data they need, drive insights, and optimize business outcomes. 

WHAT YOU'LL DO
  • Build, scale, and optimize the data platform using Databricks, Spark, and various data integration tools to ensure high performance, reliability, and cost efficiency.

  • Design and maintain robust data pipelines with Airflow, Fivetran, DBT, and Segment, and automate infrastructure using Terraform to enhance operational readiness, security, and governance (see the pipeline sketch after this list).

  • Develop and maintain the infrastructure and automation needed to support MLOps practices, including model deployment pipelines, monitoring frameworks, and retraining automation.

  • Use AI tools as a fundamental part of your everyday workflows.

  • Prototype, implement, and maintain team projects and features, serving as a technical expert, mentor, and leader.

  • Build flexible and maintainable solutions while being accountable for quality, performance, and reliability.

  • Elevate the team’s skills and knowledge by participating in technical designs and talks, and by reviewing and helping improve your own and your colleagues’ code.

  • Contribute to continuously improving the team’s processes and best practices.
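
As a rough illustration of the pipeline work described above (a minimal sketch only): the hypothetical Airflow DAG below runs and then tests DBT models on a daily schedule. It assumes Airflow 2.4+ and the dbt CLI available on the worker; the DAG ID, schedule, and project path are made-up placeholders, not Mural's actual setup.

```python
# Minimal, illustrative Airflow DAG (assumes Airflow 2.4+ and the dbt CLI on the worker).
# All names and paths here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the DBT models once upstream loads (e.g., Fivetran syncs) have landed.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",  # hypothetical project dir
    )

    # Run DBT tests so bad data fails loudly before reaching downstream consumers.
    test_dbt_models = BashOperator(
        task_id="test_dbt_models",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )

    run_dbt_models >> test_dbt_models
```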

WHAT YOU'LL BRING
  • 4+ years of hands-on experience in Data Engineering, with a strong focus on building and optimizing data platforms on cloud infrastructure.

  • Expertise in designing and building distributed systems and data pipelines for data integration, transformation, and processing, using tools like Astronomer, Fivetran, DBT, and Segment, across both structured and unstructured data.

  • Proficiency in Python, Scala, SQL, and Spark to develop scalable data pipelines and automate data workflows (see the Spark sketch after this list).

  • Solid understanding of cloud infrastructure (AWS/Azure) and data platforms (Databricks, Snowflake, Redshift), with hands-on experience using Terraform to automate cloud resource management (e.g., VPCs, subnets).

  • Passion for performance tuning, cost optimization, and ensuring efficient resource usage across the data platform.

  • An outcome-oriented, highly experimental interest in AI-driven development practices.

  • Experience with data governance, security practices, and compliance (e.g., SOC 2) to maintain data integrity and privacy.

  • Experience learning new technologies, platforms, and stacks, and coming up to speed quickly on large codebases.

  • Emotional intelligence, with collaboration and listening skills that encourage innovative solutions and diverse perspectives.

  • Experience working in a rapid-growth or startup environment.

  • Excellent communication skills and the ability to collaborate with distributed teams across time zones, ensuring alignment and success in a fast-paced environment.
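
As a rough illustration of the Python/Spark proficiency listed above (a sketch under assumed names, not Mural's actual code): the snippet below aggregates a hypothetical raw events table into a daily-active-users model with PySpark. The table and column names are invented, and it assumes an existing Spark environment such as Databricks.

```python
# Illustrative PySpark aggregation; all table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_active_users").getOrCreate()

# Read raw product events (hypothetical source table).
events = spark.read.table("raw.product_events")

# Aggregate to daily active users per workspace, a typical core data model.
daily_active_users = (
    events
    .where(F.col("event_type") == "session_start")
    .groupBy(F.to_date("event_ts").alias("event_date"), "workspace_id")
    .agg(F.countDistinct("user_id").alias("active_users"))
)

# Write as a partitioned table for downstream reporting (hypothetical target).
(
    daily_active_users.write
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.daily_active_users")
)
```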

Equal Opportunity 

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry: Internet
Spoken language(s): English

Other Skills

  • Emotional Intelligence
  • Collaboration
  • Communication
