Azure Data Engineer (ETL Pipelines & Middleware Integration)

Work set-up: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 6+ years of experience in data engineering, focusing on ETL pipelines and Azure technologies.
  • Proficiency in Azure Data Factory or Airflow, SQL, and Python for data transformations.
  • Familiarity with middleware synchronization tools and writeback queue design.
  • Experience working with scheduling, payroll, estimating, or financial datasets is preferred.

Key responsibilities:

  • Develop and maintain automated ETL pipelines for critical data workflows.
  • Clean and validate structured files into SQL staging layers.
  • Transform data into system-ready formats and manage outbound sync queues.
  • Collaborate with developers and architects to ensure transformation logic aligns with workflows.

Full Scale
201 - 500 Employees

Job description

This is a remote position.

Join one of the Philippines’ fastest-growing tech companies!

Company Overview:

Full Scale is a tech services company that helps businesses build dedicated teams of skilled software engineers. We make finding and retaining experienced software talent easy and affordable.

Job Description:

We’re looking for a Senior Data Engineer to build and maintain automated ETL pipelines that support business-critical data workflows. You’ll work with structured files, transform and validate data, and ensure seamless integration across systems.


Key Responsibilities:

  • Develop and maintain automated pipelines for extracting, transforming, and loading business-critical data across systems.

  • Clean and validate structured files (e.g., time logs, estimates, receipts) into normalized SQL staging layers.

  • Convert transformed records into system-ready formats and feed outbound sync queues.

  • Ensure auditability and retry handling for all data transfer jobs.

  • Collaborate with developers and architects to align transformation logic with end-user workflows.



Requirements
  • 6+ years in data engineering, with a focus on ETL pipelines and Azure technologies

  • Proficient in Azure Data Factory or Airflow, SQL, and Python for data transformations

  • Familiar with middleware sync tools and writeback queue design

  • Experience working with scheduling, payroll, estimating, or financial datasets preferred

  • Strong understanding of Agile delivery cycles, and experience managing work via Jira

  • Ability to work iteratively, validate outputs in cycles, and support backlog-driven delivery



Benefits
  • Permanent Work-from-home or work-anywhere setup

  • Work-from-home allowance

  • Health Insurance from day 1 of employment, with three (3) free dependents

  • Group Term Life Insurance

  • A laptop and other equipment

  • Other top benefits


This role is open to Philippine-based candidates only.


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Scheduling
  • Collaboration
