Data Engineer

Remote: Full Remote
Contract:
Work from: Israel

Offer summary

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • At least 3 years of experience in data engineering.
  • Expertise in Elasticsearch, cloud technologies (AWS, Azure, GCP), Kafka, and Databricks.
  • Proficiency in programming languages such as Python, Java, or Scala.

Key responsibilities:

  • Design and develop data platforms using Elasticsearch, Databricks, and Kafka.
  • Build and maintain scalable, reliable data pipelines.
  • Collaborate with teams to identify data requirements and design solutions.
  • Implement data quality checks and troubleshoot pipeline issues.

KPMG Israel · Financial Services · 1001 - 5000 employees · https://kpmg.co.il/

Job description

Description

We are KPMG’s technology arm in Israel. KPMG delves headfirst into the power of emerging technologies and scientific breakthroughs to craft solutions, projects, and products for companies facing complex business challenges in today’s continuously changing world. By uniting groundbreaking technology with industry expertise, we are able to harness the potential of cloud, AI, ML, digital, and cyber to design and implement top-of-the-line tailored solutions.

We are seeking a skilled and motivated Data Engineer with expertise in Elasticsearch, cloud technologies, and Kafka. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable, efficient data pipelines that support our organization's data processing needs.

The role will entail:

  • Design and develop data platforms based on Elasticsearch, Databricks, and Kafka
  • Build and maintain data pipelines that are efficient, reliable, and scalable
  • Collaborate with cross-functional teams to identify data requirements and design solutions that meet those requirements
  • Write efficient and optimized code that can handle large volumes of data
  • Implement data quality checks to ensure accuracy and completeness of the data
  • Troubleshoot and resolve data pipeline issues in a timely manner

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 3+ years of experience in data engineering
  • Expertise in Elasticsearch, cloud technologies (such as AWS, Azure, or GCP), Kafka, and Databricks
  • Proficiency in programming languages such as Python, Java, or Scala
  • Experience with distributed systems, data warehousing and ETL processes
  • Experience with container environments such as AKS, EKS, or OpenShift is a plus
  • High security clearance is a plus

The position is open for all genders as well as people with disabilities.


Required profile

Experience

Industry :
Financial Services
Spoken language(s):
English

Other Skills

  • Collaboration
  • Problem Solving
