Offer summary
Qualifications:
- 3+ years of experience in software development or Big Data
- Excellent knowledge of Python and PySpark
- Experience with Data Lakes or Data Warehouses
- Good knowledge of AWS and Infrastructure-as-Code

Key responsibilities:
- Build, optimize, and maintain ETL pipelines
- Analyze requirements and handle coding, testing, and deployment