DataOps Engineer

Remote: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from:

Offer summary

Qualifications:

3-6 years of experience in data engineering or a related field; proficient in AWS services and DevOps tools, with expertise in Python/Java and SQL.

Key responsibilities:

  • Design, develop, and maintain data pipelines for various sources
  • Collaborate with cross-functional teams to transform and analyze data
futureproof consulting (Startup) · https://fproof.eu/
2-10 Employees

Job description

Your missions

This is a remote position.

We are seeking a skilled Data Engineer with strong experience in the security field to join our pharmaceutical client's team.
  • Design, develop, and maintain data pipelines to extract data from various sources, populating data lakes and data warehouses.
  • Develop and implement data transformation rules.
  • Collaborate with Product Analysts, Data Scientists, and Engineers to identify and transform data to make it more understandable and actionable.
  • Work with the data governance team to implement data quality checks and maintain data catalogs.
  • Utilize orchestration, logging, and monitoring tools to build resilient data pipelines.
  • Apply test-driven development methodologies when building ELT/ETL pipelines.
  • Analyze and interpret data to support business decision-making.
  • Use Git for version control and implement various branching strategies.
  • Contribute as an active member of an agile development team.
  • Create and maintain comprehensive technical documentation.


Requirements
  • 3-6 years of relevant experience in data engineering or a related field.
  • Solid experience with AWS services: S3, IAM, Redshift, SageMaker, Glue, Lambda, Step Functions, CloudWatch.
  • Experience with data platforms such as Databricks and Dataiku.
  • Proficiency in Python and/or Java.
  • Strong knowledge of SQL, preferably with experience in Redshift.
  • Experience with DevOps tools: Jenkins, CloudFormation, Terraform, Git, Docker.
  • 2-3 years of experience with Spark, especially PySpark.
  • Ability to work effectively in cross-functional teams and agile environments.


Benefits
  • Location: Czechia
  • Freelance
  • Working for a global pharmaceutical leader.
  • Long-term project.


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
See the job description for which languages are mandatory.

Soft Skills

  • collaboration