
Data Engineer - Remote

Remote: Full Remote
Contract: Consulting assignment (October to March)
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Experience in data query languages (SQL)
  • Proficient with BigQuery and data formats (Parquet, Avro)
  • Skilled in design and maintenance of large-scale data infrastructure
  • Familiarity with CI/CD and cloud infrastructure on GCP
  • Understanding of software architecture and design patterns

Key responsibilities:

  • Develop data products for a data transformation project
  • Implement data-intensive solutions for the organization
  • Collaborate within cross-functional agile teams
  • Ensure data quality and accessibility across brands
  • Maintain cloud infrastructure and pipelines efficiently
Agile Search https://www.agilesearch.io/
11 - 50 Employees

Job description

For a client, we are looking for a Data Engineer. This is a consulting assignment starting in October and ending in March, and you can work 100% remotely.

Assignment Description

You will be involved in one of the client's biggest data transformation projects. As a data engineer, you will build data products within the Data Mesh concept, based on a defined target vision and requirements.
We welcome a wide range of technical backgrounds, and we believe you will enjoy working here if you are passionate about data.

In this role, you will implement data-intensive solutions for a data-driven organisation.
You will join the Data Engineering Competence area within the AI (Artificial Intelligence), Analytics & Data Domain and be an individual contributor in one of the data product teams. The area supports all of the client's brands globally in creating, structuring and guarding data, and ensures that data is available, understandable and of high quality.

Your Profile
  • Experience in data query languages (SQL or similar), BigQuery, and different data formats such as Parquet and Avro (see the sketch after this list)
  • Take end-to-end responsibility for designing, developing and maintaining the large-scale data infrastructure required for machine learning projects
  • Apply a DevOps mindset and principles to manage CI/CD pipelines, Terraform and cloud infrastructure; in our context this is GCP (Google Cloud Platform)
  • Leverage your understanding of software architecture and software design patterns to write scalable, maintainable, well-designed and future-proof code
  • Work in a cross-functional agile team of highly skilled engineers, data scientists and business stakeholders to build the AI ecosystem within our client's organisation
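
To make the stack above concrete, here is a minimal, hypothetical sketch in Python of the kind of work the profile describes: loading Parquet data from Cloud Storage into BigQuery and querying it with SQL. The project, table and bucket names are placeholders for illustration, not details from the assignment.

```python
from google.cloud import bigquery

# Hypothetical identifiers -- replace with real project/dataset/table names.
PROJECT_ID = "my-gcp-project"
TABLE_ID = f"{PROJECT_ID}.analytics.orders"

client = bigquery.Client(project=PROJECT_ID)

# Load Parquet files from Cloud Storage into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders/*.parquet",  # hypothetical GCS path
    TABLE_ID,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # wait for the load job to finish

# Run a simple aggregation over the loaded data.
query = f"""
    SELECT brand, COUNT(*) AS order_count
    FROM `{TABLE_ID}`
    GROUP BY brand
    ORDER BY order_count DESC
"""
for row in client.query(query).result():
    print(row.brand, row.order_count)
```

In the assignment's context, resources such as the bucket, dataset and table would typically be declared in Terraform and the job deployed through CI/CD, as noted in the profile above.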

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English

Other Skills

  • Problem Solving
  • Collaboration
