Senior Data Engineer I

Extra holidays - fully flexible
Work set-up: Full Remote
Contract:
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Proven experience with modern data stack technologies such as Snowflake, Airflow, and DBT.
  • Strong skills in Python, SQL, and data pipeline development.
  • Knowledge of SDLC, DataOps, and DevOps best practices.
  • Experience with cloud platforms, especially AWS, and data governance.

Key responsibilities:

  • Build and maintain data orchestration and transformation architectures.
  • Ensure reliable delivery of accurate data for analytics and sharing.
  • Collaborate on automating deployments and establishing best practices.
  • Support platform evolution and promote data governance and DataOps.

Elsevier - https://www.elsevier.com
5001 - 10000 Employees

Job description

About the Team:

The Academic Information Systems (AIS) DataOps team is a shared technology group responsible for building, administering, governing, and setting standards for a growing number of strategic data platforms and services. Our capabilities enable data to be extracted, centralized, transformed, transmitted, and analyzed across a range of products in the AIS space. Due to our footprint across the enterprise, we are relied upon to ensure our systems are trusted, reliable, and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Astronomer Airflow, Kubernetes, DBT, Tableau, Sisense, Collibra, and Kafka/Debezium. Our mission is to enable frictionless experiences for our AIS colleagues and customers so that they can openly and securely consume trustworthy data, enhancing everyday interactions and decisions.

About the Role:

As a Senior Data Engineer I, you will help create a data infrastructure that is secure, scalable, well-connected, and thoughtfully architected, while also building deep knowledge of our business domain. This team is responsible for the complex flow of data across teams, data centers, and organizational boundaries around the world. This data is the backbone of successful storytelling for AIS colleagues and customers, and it must be curated through several reliable yet cost-effective approaches.

Responsibilities:

  • Build and maintain a robust, modern data orchestration and transformation architecture to support both batch and streaming processes.

  • Ensure reliable delivery of clean, accurate data for analytical platforms and data sharing services.

  • Contribute to the development and enforcement of technical and coding standards to mature SDLC practices.

  • Collaborate with DevOps to automate deployments and implement Infrastructure as Code (IaC) for consistent, repeatable environments across regions.

  • Develop modularized components and reusable frameworks, establishing common patterns for easy contribution and reliable deployment.

  • Document and promote best practices by establishing guidelines with stakeholders and sharing knowledge across engineering and product teams.

  • Drive operational efficiency, reliability, and scalability through improvements in logging, monitoring, and observability.

  • Support platform evolution and data governance by identifying capability gaps, implementing necessary tooling and processes, and promoting DataOps through leadership and user feedback initiatives.

Requirements:

  • Deploy and govern modern data stack technologies (e.g., Snowflake, Airflow, DBT, Fivetran, Airbyte, Tableau, Sisense, AWS, GitHub, Terraform, Docker) at enterprise scale for data engineering workloads.

  • Develop deployable, reusable ETL/ELT solutions using Python, advanced SQL, and Jinja for data pipelines and stored procedures.

  • Demonstrate applied understanding of SDLC best practices and contribute to the maturity of SDLC, DataOps, and DevOps processes.

  • Participate actively in Agile delivery, including ceremonies, requirements refinement, and fostering a culture of iterative improvement.

  • Provide thought leadership in the data platform landscape by building well-researched proposals and driving adoption of change.

  • Design comprehensive technical solutions, producing architecture and infrastructure documentation for scalable, secure, and efficient data platforms.

  • Exhibit deep expertise in AWS data and analytics services, with experience in production-grade cloud solutions and cost optimization.

  • Apply strong data and technology governance, ensuring compliance with data management, privacy, and security practices, while collaborating cross-functionally and adapting to evolving priorities.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
  • Problem Solving
