Offer summary
Qualifications:
- Bachelor's degree or equivalent experience
- 5+ years in a Data Engineering or ETL role
- Strong experience with PySpark and Python
- Knowledge of Hadoop ecosystems, SQL, and Linux/Unix
- Experience with AWS cloud technology preferred
Key responsibilities:
- Create and support complex systems and processes
- Develop applications, reports, and business rules
- Estimate work efforts and respond to vendor RFPs
- Fulfill end-user requests and provide guidance
- Lead task delivery against deadlines within cross-functional teams