Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
At least 4 years of overall IT experience.
Hands-on experience with Python and PySpark.
Experience with version control tools such as Git and cloud services such as AWS.
Key responsibilities:
Develop and optimize PySpark applications using Spark DataFrames.
Work with big data processing and cloud-based analytics services like Amazon EMR and Lambda.
Collaborate in automated build, test, and deployment pipelines using tools like Jenkins.
Debug and troubleshoot data processing jobs to ensure performance and reliability.
Coders Brain is a global leader in IT services, digital and business solutions that partners with its clients to simplify, strengthen and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise and a global network of innovation and delivery centers.
Our success stems from how seamlessly we integrate with our clients.