Sr Data Ingestion Engineer - LATAM

Remote: Full Remote
Contract: 
Experience: Mid-level (2-5 years)
Work from: South Africa

Offer summary

Qualifications:

  • Over 3 years of experience in data engineering
  • Bachelor's degree in a related field
  • Hands-on experience with Spark/Scala and SQL
  • Experience with Databricks and Airflow
  • Familiarity with CI/CD pipelines and Agile methodology

Key responsibilities:

  • Ingest data from various source systems
  • Manage ETL/ELT pipelines on Databricks
  • Optimize data pipelines for speed and scalability
  • Document ingestion flows and data catalogs
  • Ensure compliance with security guidelines

Job description

In this role, you will be responsible for data ingestion and platform management on the Databricks platform. The position requires a deep understanding of:

  • Data Lake ingestion processes and best practices
  • ETL/ELT implementation
  • CI/CD
  • System integration tools
  • Data pipeline management

Responsibilities:

  • Ingest data from a variety of source systems and adapt ingestion approaches based on the system.
  • Manage, maintain, and monitor ETL/ELT pipelines on the Databricks platform.
  • Optimize data pipelines for scalability and speed.
  • Document ingestion and integration flows, as well as the pipelines themselves.
  • Use Airflow to schedule and automate ingestion jobs.
  • Manage metadata and master data in a technical data catalog.
  • Ensure that ELT/ETL designs comply with required security and compliance guidelines, and manage PII, tagging, and risk assessment during ingestion.
  • Maintain ETL/ELT pipeline infrastructure and implement automated monitoring strategies.
  • Ensure compliance with SDLC best practices.

Profile:

  • Over 3 years of experience in data engineering, ingestion pipelining, and ETL/ELT.
  • Bachelor's degree in Computer Science, Engineering, Statistics, or a related field.
  • Hands-on experience and understanding of the following:
    • Spark/Scala
    • SQL
    • Python/PySpark or a similar programming language
    • Databricks
    • Unity Catalog
    • ETL/ELT development, monitoring, and pipelining using tools like Apache Airflow
    • Ingestion tools such as Dell Boomi
    • Data quality guidelines
    • CI/CD pipelines
    • Agile methodology
    • Git and version control

What our client offers:

  • Remote job, USD payment, local holidays, and career growth

Required profile
Spoken language(s): English
