Offer summary
Qualifications:
- Minimum of 5 years of experience in data engineering
- Expertise in cloud technologies (GCP, AWS, Azure)
- Proficiency in SQL/NoSQL databases, BigQuery, Python, and PySpark
- Experience building ETL solutions within data warehouses (DWH) and in DWH/BI software engineering
Key responsibilities:
- Design, develop, and maintain data infrastructure and pipelines
- Lead data integration, optimization, and performance enhancement
- Evaluate and implement new technologies for enhanced capabilities
- Collaborate with cross-functional teams to meet data needs
- Coach and lead an international DevOps team of engineers