Offer summary
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
- Proficiency in Python/PySpark, Java/Scala, big data technologies (Hadoop, Spark, Kafka), and SQL.
- Experience with database technologies (PostgreSQL, MySQL, NoSQL) and cloud platforms (AWS, Azure, GCP).
- Experience building end-to-end data pipelines, working with CI/CD and orchestration tools (Jenkins, Airflow), and operating in Agile environments.
Key responsibilities:
- Lead data architecture/design, data pipeline development/optimization, and database management.
- Ensure data quality/governance, provide mentorship/leadership, collaborate with stakeholders.
- Monitor and optimize performance, contribute to the delivery of software applications, conduct code reviews.
- Actively contribute to ongoing tool refinement, drive adoption of new practices within the team.
- Take ownership of issues, driving multiple features/projects independently while leading the technology effort.