Offer summary
Qualifications:
- Bachelor’s degree in Computer Science or a similar field
- 8+ years of experience in Data Engineering, plus several years in the Analytics space
- Proficiency in Scala, Apache Spark, Kafka, ADF, PySpark, SQL, and Python
- Experience with the Azure stack, Delta Lake, ETL processing, building data pipelines, and harmonizing data
Key responsibilities:
- Combine data from various sources to align data systems with business objectives
- Build data pipelines for real-time streaming using Kafka, ADF, and API integration (see the sketch after this list)
- Wrangle and transform raw data into user-friendly formats using Azure Databricks
- Develop ingestion pipelines to handle structured and unstructured data at scale
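
For illustration only, a minimal PySpark sketch of the kind of streaming pipeline described above: read events from Kafka, parse them, and land them in a Delta table. The broker address, topic name, schema fields, and storage paths are placeholder assumptions, not details taken from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

# Hypothetical event schema; the real payload depends on the upstream producer.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

# Read a real-time stream from Kafka (broker and topic are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "events")
       .load())

# Kafka delivers bytes; cast the value to a string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land the parsed stream in a Delta table for downstream analytics.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/events")
         .outputMode("append")
         .start("/mnt/delta/events"))
```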