Offer summary
Qualifications:
- Degree in a relevant field or equivalent education
- Several years of experience in Data Engineering
- Proficient in Python and SQL
- Experience with Cloud platforms (preferably AWS, GCP) and/or Big Data frameworks such as Hadoop, Spark, Kafka
- Fluent in German and English
Key responsibilities:
- Design and implement scalable data platforms on AWS and/or GCP
- Integrate data sources and implement data pipelines
- Engage in team collaboration and standards development
- Engineer container-based analytics platforms and CI/CD pipelines
- Prepare data for dashboards and AI use cases