Data Engineer - Mexico

Remote: Full Remote
Contract: 12-month contract
Work from: Mexico

Offer summary

Qualifications:

  • 8+ years of experience in data engineering and building scalable data pipelines.
  • Strong expertise in Python for data transformation and ETL development.
  • Hands-on experience with Informatica or other ETL tools and familiarity with Big Data technologies.
  • Experience with GCP and modern database technologies like BigQuery and Redshift.

Key responsibilities:

  • Design, develop, and maintain data pipelines for large-scale data processing.
  • Architect modern data warehousing solutions and implement ETL solutions.
  • Collaborate with offshore teams and create technical documentation for data workflows.
  • Perform BI and Data Analysis to ensure data quality and accuracy.

Saviance Technologies Pvt. Ltd. (SME), https://saviance.com/
51 - 200 Employees

Job description

Job Title: Data Engineer

Location: Remote (Offshore)

Duration: 12-month contract

Job Description:

We are seeking a highly skilled Data Engineer with 8+ years of experience in developing and managing data pipelines and ETL/ELT solutions. The ideal candidate will have a strong background in modern data warehousing architectures, Big Data, and Cloud platforms.

Key Responsibilities:
  • Design, develop, and maintain data pipelines to support large-scale data processing.

  • Architect modern data warehousing solutions using technologies like Big Data, Cloud, and Kafka.

  • Work with cloud platforms (preferably GCP – BigQuery, Dataflow, Pub/Sub, Data Fusion) for data migration from on-premise to the cloud.

  • Develop Python-based data extraction and transformation (ETL/ELT) processes.

  • Implement ETL solutions using Informatica or similar tools.

  • Work within an Agile development environment (Scrum, Kanban).

  • Implement CI/CD pipelines for efficient data deployment.

  • Create and maintain technical documentation for data workflows.

  • Work with modern database concepts (e.g., BigQuery, Redshift).

  • Develop and manage Airflow DAGs for data orchestration.

  • Perform BI and Data Analysis to ensure the quality and accuracy of data.

  • Identify and resolve potential issues before they impact business operations.

  • Collaborate with offshore teams in an onsite-offshore model.

Required Skills & Qualifications:
  • 8+ years of experience in data engineering and building scalable data pipelines.

  • 5+ years of experience in data warehouse architecture and modern data platforms.

  • Strong expertise in Python for data transformation and ETL development.

  • Hands-on experience with Informatica or other ETL tools.

  • Strong understanding of Big Data technologies and Kafka.

  • Experience with GCP (Google Cloud Platform) and data migration from on-prem to cloud.

  • Proficiency in CI/CD concepts and Agile development methodologies.

  • Familiarity with modern database technologies like BigQuery and Redshift.

  • Experience with Apache Airflow and DAG development.

  • Strong problem-solving skills and the ability to resolve issues proactively.

  • Prior experience in coordinating offshore teams and working in a global delivery model.

Preferred Qualifications:
  • Experience with AWS/Azure cloud environments.

  • Knowledge of streaming data frameworks.

  • Experience in data governance and security best practices.

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Collaboration
  • Problem Solving
