Offer summary
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or equivalent
- 4+ years of IT experience
- Hands-on experience with Python and PySpark
- Experience with version control tools like Git
- Familiarity with AWS services like EMR, Lambda, EC2, S3, etc.
Key responsibilities:
- Building PySpark applications using Spark DataFrames
- Optimizing Spark jobs that process large data volumes
- Creating and maintaining bash/shell scripts
- Working with fixed-width, delimited, and multi-record file formats
- Utilizing tools like Jenkins for building, testing, and deploying applications
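As an illustration of the fixed-width file handling mentioned above, here is a minimal Python sketch that slices one record into named fields. The field layout (names and column offsets) is hypothetical; in practice it would come from the file's spec or copybook:

```python
# Hypothetical field layout: (field name, start offset, end offset).
LAYOUT = [("id", 0, 5), ("name", 5, 15), ("amount", 15, 23)]

def parse_fixed_width(line, layout=LAYOUT):
    """Slice one fixed-width record into a dict of stripped field values."""
    return {name: line[start:end].strip() for name, start, end in layout}

record = "00042Jane Doe  00012.50"
parsed = parse_fixed_width(record)
# parsed == {"id": "00042", "name": "Jane Doe", "amount": "00012.50"}
```

The same slicing logic maps naturally onto Spark DataFrames, e.g. by reading the file as a single text column and applying `substring` per field.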