Python PySpark

Benefits: extra holidays, extra parental leave
Work set-up: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Proficiency in Python and PySpark development.
  • Experience with Hadoop and big data technologies.
  • Knowledge of Java, Scala, and Hive programming.
  • Familiarity with AWS services like EC2 and S3.

Key responsibilities:

  • Design and develop Hadoop applications.
  • Create and optimize PySpark jobs using Python or Scala.
  • Manage source code with Git repositories.
  • Collaborate in an agile environment to deliver software solutions.

Overture Rede TPE https://www.overturerede.com/
11 - 50 Employees

Job description

This is a remote position.

Primary Skills
● Design and develop Hadoop applications
● Hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala
● Experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts
● Experience with source code management using Git repositories

Secondary Skills
● Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services
● Basic SQL programming
● Knowledge of agile methodology for delivering software solutions
● Build scripting with Maven/Gradle; exposure to Jenkins


Salary:

2000000

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English
