
Data Engineer (Python, PySpark, AWS) - Stuti Tripathi

Remote: Full Remote

Offer summary

Qualifications:

  • 6+ years of experience
  • Proficiency in Python 3.9.1 and PySpark 3.x
  • Experience with AWS services (EMR, S3, EC2)
  • Familiarity with Postgres v13
  • Working knowledge of tech stacks built on Python and AWS

Key responsibilities:

  • Develop and deploy Spark applications on AWS EMR
  • Build workflows on AWS using Step Functions state machines
  • Work with Jenkins, Bitbucket/Git, Apache Airflow
  • Configure Spark clusters and leverage the AWS Glue service
  • Provide limited ML support as needed
CodersBrain Management Consulting SME https://www.codersbrain.com/
201 - 500 Employees

Job description

Hello Candidate,

We are hiring for a Data Engineer.

Location: Remote
Experience: 6+ years
Notice period: Immediate to 30 days
Skills:

• Experience with Python 3.9.1
• Solid experience with PySpark 3.x
• Build workflows in AWS using Step Functions state machines
• Deploy and run Spark applications on AWS EMR; experience with S3 and EC2 (a minimal sketch follows this list)
• Hands-on experience with Postgres v13
• Solid experience across tech stacks including Python, AWS, and Apache Airflow
• Experience with Jenkins and Bitbucket/Git
• Spark cluster configuration experience required, beyond leveraging the AWS Glue service
• Little to no ML required; LDI needs someone who can help in that area (does not have to be an expert)
• Understanding of multi-threading/multi-processing and shallow vs. deep copying
• Additional skills: Core Java is a bonus, but not mandatory
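
The EMR and Step Functions items above boil down to a small amount of orchestration code. Below is a minimal sketch, assuming boto3, an already-running EMR cluster, and an existing state machine; the cluster ID, state machine ARN, region, and S3 script path are hypothetical placeholders, not values from this posting.

    import json

    import boto3

    # Hypothetical identifiers -- replace with values for your own environment.
    CLUSTER_ID = "j-XXXXXXXXXXXXX"  # an already-running EMR cluster
    STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:etl-workflow"
    JOB_SCRIPT = "s3://example-bucket/jobs/etl_job.py"  # PySpark application stored on S3

    # Submit the PySpark application to the EMR cluster as a spark-submit step.
    emr = boto3.client("emr", region_name="us-east-1")
    emr.add_job_flow_steps(
        JobFlowId=CLUSTER_ID,
        Steps=[
            {
                "Name": "etl-job",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",
                    "Args": ["spark-submit", "--deploy-mode", "cluster", JOB_SCRIPT],
                },
            }
        ],
    )

    # Or trigger a Step Functions state machine that wraps the same job.
    sfn = boto3.client("stepfunctions", region_name="us-east-1")
    sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"job_script": JOB_SCRIPT}),
    )

In practice the spark-submit step is usually wrapped in a Step Functions task state (or an Airflow DAG), so retries and failure handling live in the workflow rather than in ad-hoc scripts.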

Interested candidates can send their resume to rini.adhya@codersbrain.com

Required profile

Experience

Industry: Management Consulting
Spoken language(s): English
