
Senior Data Engineer

Requirements

  • 7–9 years of experience as a Data Engineer, with recent hands-on production experience (within the last 48 months).
  • Strong Python for data pipelines, scripting, and automation; strong SQL for data transformation and modelling.
  • Experience with Microsoft Azure and Microsoft Fabric, ELT pipeline development, and Git; metadata management and data catalogue onboarding.
  • Agile/Scrum experience, data product design, data quality assessment, and RESTful API development.

Roles & Responsibilities

  • Design, build, and maintain scalable data pipelines and ELT workflows, with orchestration and automation using Python.
  • Capture and onboard metadata into enterprise data catalogues; implement data quality checks and governance.
  • Develop and maintain RESTful APIs, automation scripts (Python, Bash, PowerShell), and CI/CD pipelines for automated deployment.
  • Create dashboards and reporting solutions using Microsoft Fabric and Power BI; design data models for analytics and collaborate with stakeholders.

Job description

This is a remote position.

Job Title: Senior Data Engineer
Experience: 7–9 Years
Location: Remote
Notice Period: Immediate Joiners Only

About the Engagement

We are looking for experienced Senior Data Engineers to join a large-scale, multi-year Data Mapping and DataOps Platform modernisation programme for a major government ministry.

This strategic initiative focuses on transforming how data is managed, governed, and leveraged as a core enterprise asset.

Key Focus Areas

  • Data Engineering: Designing, building, and optimising scalable data pipelines to enhance stability, traceability, and refresh frequency
  • Data Mapping & Metadata Capture: Onboarding critical financial systems into a centralised enterprise data catalogue and supporting governance initiatives

Work will be delivered in an Agile/Scrum model, collaborating closely with data product managers and cross-functional teams.

Platform Overview

The platform is built on Microsoft Azure and Microsoft Fabric, with an existing in-house Finance Data Catalogue (FDC).

This role will contribute to scaling the platform into a production-grade DataOps ecosystem.

Key Responsibilities

  • Design, build, optimise, and maintain scalable data pipelines
  • Develop and manage ELT pipelines, orchestration, and automation using Python
  • Capture and onboard metadata into enterprise data catalogues
  • Perform data analysis to identify patterns, anomalies, and quality improvements
  • Contribute to data product design, including data models and structures
  • Design and implement data models for analytics and reporting
  • Build and maintain CI/CD pipelines for automated deployment
  • Develop automation scripts using Python, Bash, and PowerShell
  • Develop and maintain RESTful APIs for integration and interoperability
  • Implement data quality checks and validation frameworks
  • Set up monitoring, alerting, and performance tracking
  • Develop dashboards and reporting solutions using Microsoft Fabric & Power BI
  • Collaborate with stakeholders to translate business needs into technical solutions
  • Ensure compliance with data governance, privacy, and security standards
  • Maintain technical documentation and communicate with both technical and business teams

Mandatory Requirements

Candidates must have recent (last 48 months) hands-on production experience in:

  • Python (data pipelines, scripting, automation)
  • SQL (data transformation, modelling)
  • Microsoft Azure & Microsoft Fabric
  • ELT Pipeline Development
  • Git (version control & collaboration)
  • Metadata Management & Data Catalogue onboarding
  • Data Product Design
  • Agile/Scrum methodologies
  • Data Analysis & Data Quality Assessment

Technical Skills Required

Cloud & Platform

  • Microsoft Azure
  • Microsoft Fabric

Data Engineering

  • Python, SQL
  • Bash, PowerShell

Pipelines & DevOps

  • ELT pipelines
  • CI/CD pipelines

Data Modelling

  • Schema design
  • Metadata management
  • Data product design

API Development

  • RESTful APIs

Analytics & Reporting

  • Power BI
  • Microsoft Fabric Reporting

Version Control

  • Git, GitHub

Methodology

  • Agile / Scrum

Nice to Have

  • Advanced Power BI (DAX, dashboarding, modelling)
  • Data visualisation beyond standard reporting
  • Experience with REST API integrations
  • Exposure to:
    • Azure Synapse Analytics
    • Azure Data Lake
    • Azure DevOps
  • Experience in government/public sector or regulated environments
  • Familiarity with data governance frameworks
  • Experience in financial systems modernisation programmes

 


