Offer summary
Qualifications:
- Knowledge of Big Data architecture (AWS, Cloudera, Databricks)
- Experience in cloud projects, preferably AWS
- Development experience with Spark micro-batch and Spark Streaming
- Proficiency in Scala, Python, and Java
- Advanced knowledge of ETL tools and relational modeling

Key responsibilities:
- Participate in global projects
- Provide data replication solutions
- Develop data analysis technologies
- Contribute to Machine Learning projects
- Work on Data Lake implementations