Offer summary
Qualifications:
- 10-15+ years of experience in data engineering
- Strong Python and PySpark skills
- Expertise in Delta Lake architecture
- Experience with Jupyter Notebooks
- Familiarity with AWS, Azure, or GCP

Key responsibilities:
- Design and maintain data pipelines using Python and Spark.
- Implement Delta Lake architecture for data integrity.
- Collaborate on creating reusable Jupyter Notebooks.
- Optimize data storage for high performance.
- Monitor and improve data processing performance.