Analytics Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 2+ years of experience in SQL, including complex joins and optimisation methods
  • Strong understanding of data modelling and warehouse design, particularly Kimball-style dimensional modelling
  • Experience using dbt in production environments, with a focus on testing and documentation
  • Familiarity with version control systems such as GitHub

Key responsibilities:

  • Build and maintain dbt models to transform raw data into clean and accessible datasets.
  • Translate business and analytics requirements into scalable data models and design data warehouse schemas.
  • Implement and maintain dbt tests to ensure data quality and model accuracy, and document data models for cross-functional use.
  • Optimize dbt models and SQL queries for performance, working with Snowflake and ensuring integration with data catalogs.

Multiverse (Scaleup) · https://www.multiverse.io/
501 - 1000 Employees

Job description

We’re on a mission to provide equitable access to economic opportunity, for everyone.

We close critical skill gaps in the workforce through a new kind of apprenticeship that combines work and learning. We begin by recognizing high-potential individuals both inside and outside of a company's current workforce and then we create applied, guided and equitable learning programs, with measurable impact. Because we believe the world needs a better way to match its potential.

We work with over 1,500 leading companies including the likes of Microsoft, Citi and Just Eat to help solve their business-critical problems, and we’ve trained over 16,000 professional apprentices in the tech and data skills of the future. This is made possible by our global team who are driven to achieve a mission that matters, together.

Join Multiverse and help us set a new course for work.

What we need

We’re looking for an Analytics Engineer to help build and maintain the data models that power analytics and data science across the business. You’ll focus on developing robust and scalable dbt pipelines and contributing to the evolution of our data platform, ensuring that data is accessible, trusted, and well-structured.

This role is hands-on and ideal for someone with a strong technical foundation who enjoys solving data problems, writing clean and efficient SQL, and collaborating with analysts, engineers, and product teams.

This role sits within the Data & Insight team, reporting to the Head of Analytics Engineering. We’re looking for someone who’s detail-oriented, solution-driven, and pragmatic: someone who takes ownership of their work and is excited to build product-focused data models.

What you’ll work on

Data Modelling & Transformation

  • Build and maintain dbt models to transform raw data into clean, documented, and accessible data sets

  • Translate business and analytics requirements into scalable data models

  • Design and implement data warehouse schemas using dimensional modelling techniques (fact and dimension tables, slowly changing dimensions, etc.)

  • Participate in design and code reviews to improve model design and query performance
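As a purely illustrative sketch of this kind of modelling work, a minimal Kimball-style dimension in dbt might look like the following; the model and column names (`stg_customers`, `dim_customer`) are hypothetical, not taken from any actual Multiverse project:

```sql
-- models/marts/dim_customer.sql (hypothetical example)
-- A simple dimension table built from a staging model; fact tables
-- would join to it via the surrogate key.
with customers as (
    select * from {{ ref('stg_customers') }}
)

select
    {{ dbt_utils.generate_surrogate_key(['customer_id']) }} as customer_key,
    customer_id,
    customer_name,
    country,
    created_at
from customers
```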

Testing, Documentation, and CI/CD

  • Implement and maintain dbt tests to ensure data quality and model accuracy

  • Document data models clearly to support cross-functional use

  • Use GitHub and CI/CD pipelines to manage code and deploy changes safely and efficiently
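dbt supports both generic tests declared in YAML and singular tests written as plain SELECT statements. As a hedged sketch of the latter, assuming a hypothetical `fct_orders` model, dbt treats any rows returned by the query as test failures:

```sql
-- tests/assert_no_negative_order_totals.sql (hypothetical singular test)
-- Any row this query returns is reported as a failure by `dbt test`.
select
    order_id,
    order_total
from {{ ref('fct_orders') }}
where order_total < 0
```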

Performance & Architecture

  • Optimise dbt models and SQL queries for performance and maintainability

  • Work with Snowflake, developing on top of a data lake architecture

  • Ensure dbt models are well-integrated with data catalogs and accessible for downstream use
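One common lever for the performance work above is dbt's incremental materialisation, which processes only new rows on each run rather than rebuilding the whole table. A minimal sketch with hypothetical names (`fct_page_views`, `stg_page_views`):

```sql
-- models/marts/fct_page_views.sql (hypothetical incremental model)
{{ config(materialized='incremental', unique_key='page_view_id') }}

select
    page_view_id,
    user_id,
    viewed_at
from {{ ref('stg_page_views') }}

{% if is_incremental() %}
  -- On incremental runs, only pick up rows newer than what's already loaded.
  where viewed_at > (select max(viewed_at) from {{ this }})
{% endif %}
```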

What we’re looking for

Required Skills & Experience

  • 2+ years of experience building and optimising complex SQL (including joins, window functions and query optimisation methods)

  • Strong understanding of data modelling and warehouse design (e.g., Kimball-style dimensional modelling)

  • Experience using dbt in production environments, including testing and documentation

  • Familiarity with version control (e.g., GitHub)

  • Experience tuning dbt models and SQL queries for performance

  • Able to independently transform business logic into technical implementation

  • Comfortable participating in and contributing to code reviews
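To give a flavour of the window-function work listed above, a typical deduplication query (keeping only the most recent event per user) might look like this; the table and column names (`raw_events`, `event_at`) are illustrative only:

```sql
-- Hypothetical example: keep only the latest event per user.
select user_id, event_type, event_at
from (
    select
        user_id,
        event_type,
        event_at,
        row_number() over (
            partition by user_id
            order by event_at desc
        ) as rn
    from raw_events
) ranked
where rn = 1
```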

Desirable (but not required)

  • Experience with Snowflake

  • Experience with CI/CD for data workflows

  • Familiarity with Python/Airflow for data transformation or orchestration tasks

  • Experience with data visualisation tools (e.g., Tableau, Looker)

  • Working knowledge of infrastructure-as-code tools like Terraform

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Solutions Focused
  • Detail Oriented
