Offer summary
Qualifications:
- 3+ years of experience in software development or Big Data
- Excellent knowledge of Python
- Proficiency in PySpark and Spark
- Experience with data lakes or data warehouses, preferably Snowflake

Key responsibilities:
- Build, optimize, and maintain ETL pipelines
- Analyze requirements and handle coding, testing, debugging, deployment, and maintenance