Data Engineer Big Data Triangle

Work set-up: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:
Offer summary

Qualifications:

  • Proven experience of 4-5 years in Big Data technologies.
  • Strong skills in Java, Hadoop, Apache Spark, and Apache Kafka.
  • Knowledge of databases such as MongoDB, PostgreSQL, and Delta Lake.
  • Bachelor's degree or higher in Computer Science or related field.

Key responsibilities:

  • Develop and maintain Big Data solutions using Java, Hadoop, Spark, and Kafka.
  • Collaborate with teams to design scalable data pipelines.
  • Ensure data quality and optimize data processing workflows.
  • Work on-site at IBM location from day one.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

F2F required at IBM location; the resource has to work from the office from day one.
Total years of experience: 5 to 7
Relevant years of experience: 4 to 5
Mandatory skills: Java, Hadoop, Big Data, Apache Spark, Apache Kafka
Alternate skills: MongoDB, PostgreSQL, Delta Lake
Good to have (not mandatory):
Detailed job description: Java, Hadoop, Big Data, Apache Spark, Apache Kafka, MongoDB, PostgreSQL, Delta Lake
Location: Any location
RTH: Y
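
For orientation only (not part of the posting), the mandatory stack above typically comes together in streaming ingestion code. The sketch below is a minimal, hypothetical example of a Spark Structured Streaming job in Java that reads from a Kafka topic and appends to a Delta Lake table; the topic name, broker address, and paths are placeholders, and it assumes the spark-sql-kafka and delta-spark dependencies are on the classpath.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToDeltaPipeline {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("KafkaToDeltaPipeline")
                .getOrCreate();

        // Read a stream of events from a Kafka topic
        // (broker address and topic name are placeholders).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "events")
                .load();

        // Kafka delivers key/value as binary; cast the payload to strings
        // before any downstream transformation.
        Dataset<Row> decoded = events.selectExpr(
                "CAST(key AS STRING)",
                "CAST(value AS STRING)",
                "timestamp");

        // Append the decoded stream to a Delta Lake table
        // (checkpoint and table paths are placeholders).
        StreamingQuery query = decoded.writeStream()
                .format("delta")
                .option("checkpointLocation", "/tmp/checkpoints/events")
                .outputMode("append")
                .start("/tmp/delta/events");

        query.awaitTermination();
    }
}
```

The checkpoint location is what lets the job restart after failure without reprocessing or losing records, which is the usual reason this Kafka-to-Delta pattern appears in pipelines like those described in this role.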

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English
