Offer summary
Qualifications:
- Bachelor's degree in Computer Science or Engineering; Master's degree preferred
- 5+ years of experience with data pipelines (Hadoop/Spark/Pig)
- Strong SQL proficiency
- Scala/Java coding skills
- Knowledge of systems such as Hadoop, Spark, Presto, and Airflow
- Experience with AWS cloud is a plus
Key responsibilities:
- Conceptualize data architecture for large-scale projects
- Ensure data quality and reliability
- Optimize data pipelines and systems
- Conduct SQL data investigations and optimizations
- Mentor team members, facilitate code reviews