Offer summary
Qualifications:
- Master's degree in Big Data or Computer Engineering with a strong interest in data
- Proficiency in SQL and NoSQL databases and related concepts
- Experience with collaborative development tools such as Git and Jupyter Notebooks
- Familiarity with the Big Data stack (Airflow, Spark, Hadoop)
- Knowledge of AWS services (Lambda, EMR, S3) preferred
Key responsibilities:
- Optimizing data lake maintenance and updating data flows
- Developing data pipelines for analysis in collaboration with BI and Data Science teams
- Leading end-to-end project management for clients: data collection, preprocessing, modeling, deployment
- Proposing new solutions, participating in technical qualification, and enhancing the data infrastructure
- Building and deploying reporting with tools such as Power BI, and managing metadata and documentation