Location: Remote (Offshore)
Duration: 12-month contract
We are seeking a highly skilled Data Engineer with 8+ years of experience in developing and managing data pipelines and ETL/ELT solutions. The ideal candidate will have a strong background in modern data warehousing architectures, Big Data, and cloud platforms.

Responsibilities:
Design, develop, and maintain data pipelines to support large-scale data processing.
Architect modern data warehousing solutions using Big Data, cloud, and streaming technologies such as Kafka.
Work with cloud platforms (preferably GCP: BigQuery, Dataflow, Pub/Sub, Data Fusion) to migrate data from on-premises systems to the cloud.
Develop Python-based data extraction and transformation (ETL/ELT) processes.
Implement ETL solutions using Informatica or similar tools.
Work within an Agile development environment (Scrum, Kanban).
Implement CI/CD pipelines for efficient data deployment.
Create and maintain technical documentation for data workflows.
Work with modern cloud data warehouses (e.g., BigQuery, Redshift).
Develop and manage Airflow DAGs for data orchestration.
Perform BI and data analysis to ensure data quality and accuracy.
Identify and resolve potential issues before they impact business operations.
Collaborate with offshore teams in an onsite-offshore model.
Requirements:
8+ years of experience in data engineering and building scalable data pipelines.
5+ years of experience in data warehouse architecture and modern data platforms.
Strong expertise in Python for data transformation and ETL development.
Hands-on experience with Informatica or other ETL tools.
Strong understanding of Big Data technologies and Kafka.
Experience with GCP (Google Cloud Platform) and migrating data from on-premises environments to the cloud.
Proficiency in CI/CD concepts and Agile development methodologies.
Familiarity with modern cloud data warehouses such as BigQuery and Redshift.
Experience with Apache Airflow and DAG development.
Strong problem-solving skills and the ability to resolve issues proactively.
Prior experience in coordinating offshore teams and working in a global delivery model.
Preferred Qualifications:
Experience with AWS/Azure cloud environments.
Knowledge of streaming data frameworks.
Experience in data governance and security best practices.