Offer summary
Qualifications:
- 8+ years of experience building data processing pipelines
- Strong proficiency in AWS services
- Experience with Databricks and PySpark
- Solid understanding of database design principles
- Familiarity with version control systems and CI/CD
Key responsibilities:
- Design, develop, and deploy data pipelines on AWS
- Implement data processing workflows using Databricks and SQL
- Build orchestration workflows with Apache Airflow
- Collaborate with cross-functional teams to understand data needs
- Optimize data pipelines for performance and cost-effectiveness