GN - Data Engineer (Databricks) - 111

Remote: Full Remote

Offer summary

Qualifications:

  • Minimum of 2 years of hands-on experience with Databricks and AWS.
  • Strong proficiency in Python programming and experience with Apache Spark.
  • Solid understanding of AWS technologies such as EC2, S3, and SQS.
  • Proficiency in SQL and experience with DevOps processes.

Key responsibilities:

  • Design, develop, and maintain scalable data pipelines using Databricks.
  • Write and deploy Python code for ETL processes and production environments.
  • Collaborate with data scientists to meet their data requirements.
  • Monitor and troubleshoot data pipelines and system performance.

Thaloz · Computer Software / SaaS Scaleup · https://thaloz.com/ · 51 - 200 Employees

Job description

We're seeking a talented and motivated Databricks Engineer to join our growing data engineering team. In this role, you will be responsible for designing, developing, and deploying data pipelines and solutions using Databricks, Python, and AWS services. You will play a key role in building and maintaining our data infrastructure to support our business needs.

Responsibilities:

  • Design, develop, and maintain scalable and reliable data pipelines using Databricks.
  • Write and deploy Python code for data processing and ETL (Extract, Transform, Load) processes.
  • Package and deploy Python code effectively for production environments.
  • Utilize AWS foundation technologies such as EC2, S3, and SQS to build robust data solutions.
  • Implement and adhere to DevOps processes and utilize relevant tools for continuous integration and continuous deployment (CI/CD).
  • Write and execute SQL queries for data analysis and manipulation.
  • Collaborate with data scientists and analysts to understand their data requirements and provide necessary data infrastructure.
  • Monitor and troubleshoot data pipelines and system performance.
  • Stay up-to-date with the latest technologies and best practices in data engineering and cloud computing.

Requirements

  • Experience using Python with Apache Spark within the Databricks environment.
  • Experience working with Databricks hosted on AWS.
  • Minimum of 2 years of hands-on experience working with Databricks.
  • Strong proficiency in the Python programming language.
  • Experience with packaging and deploying Python code.
  • Solid understanding of AWS foundation technologies (EC2, S3, SQS, etc.).
  • Experience with DevOps processes and tools.
  • Proficiency in SQL.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.

Required profile

Experience

Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Analytical Thinking
  • Collaboration
  • Communication
  • Problem Solving
