Senior Data Engineer – AWS & Big Data

Remote: Full Remote

Offer summary

Qualifications:

  • Strong expertise in AWS and Big Data environments.
  • Advanced skills in Python for Data Engineering and SQL.
  • Intermediate to advanced knowledge of Hadoop and Spark (PySpark, Scala).
  • Familiarity with DevOps/CI-CD tools and Git workflows.

Key responsibilities:

  • Design and implement Data Warehouse and Data Lake architectures.
  • Act as a technical leader in Data Analytics initiatives.
  • Build and maintain efficient data pipelines for data processing.
  • Support the migration of legacy systems to cloud-based solutions.

Meta IT North America (https://www.metait.ca/)
1001 - 5000 Employees

Job description

What We’re Looking For

We are seeking a Senior Data Engineer with strong expertise in AWS and Big Data environments to lead the design and implementation of scalable Data Warehouse and Data Lake solutions. This role requires a hands-on technical leader who will serve as a reference in Data Analytics initiatives, ensuring performance, quality, and security throughout the entire data lifecycle.

Key Responsibilities

  • Design and implement robust Data Warehouse and Data Lake architectures;
  • Act as a technical leader in Data Analytics solutions;
  • Define and develop data repository models tailored to various business needs;
  • Build and maintain efficient data pipelines for ingestion, transformation, and storage;
  • Support and influence architectural decisions across the Big Data ecosystem;
  • Ensure secure, reliable, and performant data processes for multiple consumers;
  • Identify technical risks and provide recommendations based on support team insights;
  • Support performance testing and optimization initiatives;
  • Contribute to the migration of legacy systems to modern cloud-based solutions.
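The pipeline duties above (ingestion, transformation, and storage) can be sketched in miniature. This is a hedged illustration only, using the Python standard library with sqlite3 standing in for a real warehouse; the feed, table, and column names are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would arrive from S3, Kafka, etc.
RAW_CSV = """user_id,event,amount
1,purchase,19.99
2,refund,-5.00
1,purchase,4.50
"""

def ingest(text):
    """Ingestion: parse the raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transformation: cast types and keep only purchase events."""
    return [
        (int(r["user_id"]), float(r["amount"]))
        for r in rows
        if r["event"] == "purchase"
    ]

def store(rows, conn):
    """Storage: load the cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS purchases (user_id INTEGER, amount REAL)"
    )
    conn.executemany("INSERT INTO purchases VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
store(transform(ingest(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM purchases").fetchone()[0]
```

In a production setting each stage would typically be a Spark job or an orchestrated task rather than a function call, but the ingest/transform/store separation is the same.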

Must-Haves

Required Skills & Experience

  • Hadoop, Spark (PySpark, Scala): Intermediate to advanced
  • Python for Data Engineering: Advanced
  • SQL: Advanced
  • Git / GitFlow workflows: Intermediate
  • AWS (CloudFormation, IAM, Lambda, SQS, Athena, EMR), Unix: Intermediate
  • DevOps/CI-CD tools (JIRA, GitLab, GitLab CI, NEXUS, SonarQube): Basic

Nice To Have

  • Experience with Apache Airflow and Docker

Required profile

Experience

Spoken language(s):
English
