Big Data Developer

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

6-8 years of AWS experience; proficiency in Spark, Scala, and ETL.

Key responsibilities:

  • Design and implement ETL processes
  • Orchestrate big data transformations

Job description

"Must have at least 6 to 8 years' experience handling AWS Services, ETL Proficiency in Spark and Scala for big data processing Data engineering experience across AWS cloud platform Knowledge on Cloud based Data Lakes exposure such as EMR Databricks Redshift Should have experience working in Data engineer platform Data mining - standardize the interfaces dealing with huge data. They should be able to orchestrate the Big data transformations (CICD / Strong pipeline in Cloud preferable Aws)GitHub, Jenkins CI/CD, JUnit and Docker Create, test and execute SQL language code Design and implement ETL process to extract, transform and load data into data warehouses and data lakes Hands on experience with dealing global customers to gather requirements and translate them into solutions using the necessary skills "

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English
