Offer summary
Qualifications:
- Degree in an IT field or equivalent
- 2 years of programming experience with PySpark
- Proficiency in SQL and Python
- Experience with Apache Spark clusters
- Ability to handle large datasets

Key responsibilities:
- Develop predictive risk models
- Process data in real-time and batches
- Manipulate and analyze large databases
- Optimize code for analytical processes
- Work on innovative technology projects