Senior Data Engineer (Remote)

Remote: Full Remote

Offer summary

Qualifications:

  • 6+ years of hands-on experience in data engineering.
  • Proficient in Python, dbt, and Airflow or Dagster.
  • Strong knowledge of RDBMS such as PostgreSQL and in-memory databases such as Redis.
  • Familiarity with cloud ecosystems such as AWS, Azure, or GCP (preferred).

Key responsibilities:

  • Transform legacy Talend ETL pipelines into scalable workflows using Python, dbt, and Airflow/Dagster.
  • Maintain existing Talend pipelines and support ad-hoc and change requests.
  • Design and manage ingestion and streaming pipelines using Kafka and NiFi.
  • Implement modular, testable, and well-monitored ETL pipelines, and collaborate in an Agile/Scrum environment.

Job description

Key Responsibilities
  • ETL Migration:
    Transform legacy Talend ETL pipelines into scalable workflows using Python, dbt, and Airflow/Dagster (a minimal Airflow sketch follows this list).

  • Pipeline Support & Maintenance:
    Maintain existing Talend pipelines and support ad-hoc and change requests.

  • Data Flow Development:
    Design and manage ingestion and streaming pipelines using Kafka and NiFi (a Kafka consumer sketch follows this list).

  • Data Storage & Processing:
    Work with formats like Parquet, Delta Lake, and Iceberg. Leverage RDBMS and in-memory databases for processing and analytics.

  • Best Practices:
    Implement modular, testable, and well-monitored ETL pipelines. Contribute to code reviews and CI/CD practices.

  • Agile Collaboration:
    Participate in Scrum ceremonies and collaborate with architects, analysts, and platform engineers.
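
As a rough illustration of the migration work above: a minimal sketch of a legacy Talend job re-implemented as an Airflow DAG that orchestrates dbt. The DAG name, schedule, and project path are hypothetical placeholders, and the sketch assumes Airflow 2.4+.

```python
# Minimal Airflow DAG standing in for a migrated Talend job:
# run dbt models, then dbt tests, as two ordered tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_dbt_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/sales",  # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/sales",
    )
    dbt_run >> dbt_test  # tests only run after a successful build
```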

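Similarly, a minimal sketch of the streaming ingestion side using the kafka-python client; the topic name, broker address, and consumer group are hypothetical placeholders.

```python
# Minimal Kafka consumer sketch for a streaming ingestion pipeline.
import json

from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "orders.raw",                          # hypothetical topic
    bootstrap_servers="localhost:9092",    # hypothetical broker
    group_id="ingest-demo",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # A real pipeline would validate, enrich, and land the record
    # (e.g., append to a Parquet/Delta table); here we just print it.
    print(message.topic, message.offset, record)
```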

Required Qualifications
  • 6+ years of hands-on experience in data engineering.

  • Proficient in:

    • Python (pandas, pyarrow, SQLAlchemy)

    • dbt (data modeling, testing, documentation)

    • Airflow or Dagster (DAG scheduling and orchestration)

    • Kafka & Apache NiFi (streaming data ingestion)

  • Experience with Parquet, Delta Lake, and Apache Iceberg.

  • Strong knowledge of RDBMS (e.g., PostgreSQL) and in-memory databases (e.g., Redis, MemSQL).

  • Familiarity with Git, CI/CD pipelines, logging, and error handling.

  • Understanding of Agile/Scrum processes and data governance.
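
To tie several of the listed tools together, a minimal sketch: pandas (with pyarrow as the Parquet engine) reads a Parquet file, and SQLAlchemy loads it into PostgreSQL. The file path, table name, and connection string are hypothetical.

```python
# Read a Parquet file with pandas/pyarrow and append it to a Postgres
# staging table via SQLAlchemy. Paths, credentials, and table name are
# hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# pandas delegates Parquet I/O to pyarrow when it is installed.
df = pd.read_parquet("events.parquet", engine="pyarrow")

engine = create_engine(
    "postgresql+psycopg2://user:password@localhost:5432/analytics"
)
df.to_sql("events_staging", engine, if_exists="append", index=False)
```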


Preferred Qualifications
  • Experience with cloud ecosystems (AWS, Azure, or GCP).

  • Familiarity with Docker, Kubernetes.

  • Exposure to DataOps/MLOps principles.

Required profile

Spoken language(s): English

Other Skills

  • Collaboration
  • Problem Solving
