Data Engineer with GCP

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Strong understanding of cloud computing concepts
  • Experience with GCP, Python, and SQL
  • Knowledge of data modeling, specifically BigQuery
  • Experience with Cloud Storage, Dataflow, and Spark
  • Knowledge of Infrastructure as Code using Terraform

Key responsibilities:

  • Design, build, and deploy distributed systems
  • Maintain architecture patterns for data processing
  • Establish automated processes for data analysis
  • Collaborate with analysts and data scientists
  • Monitor performance and advise on infrastructure changes
Devire Human Resources, Staffing & Recruiting SME https://www.devire.pl/
201 - 500 Employees

Job description


Your missions

Your future company

Devire Outsourcing IT is a partnership model dedicated to self-employed IT specialists who execute projects for our Clients - leading IT companies bringing innovations and the newest solutions to market.

For our Client, a leading Polish software house and IT solutions provider, we are looking for a Data Engineer who will join an international team developing photolithography systems for semiconductor manufacturing.

Requirements

  • Strong understanding of cloud computing concepts and experience with GCP.
  • Proficiency in Python and SQL.
  • Knowledge of data modelling, including experience with BigQuery.
  • Analytical mindset.
  • Experience with Cloud Storage, Cloud Dataflow.
  • Experience with data pipeline design and implementation on Spark.
  • Experience with maintaining data quality and governance.
  • Knowledge of Infrastructure as Code (IaC) using Terraform.
  • Experience with CI/CD for data pipelines.

Responsibilities

  • Designing, building, and deploying at-scale infrastructure with a focus on distributed systems.
  • Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies.
  • Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation.
  • Working closely with analysts and data scientists to understand the impact on downstream data models.
  • Monitoring performance and advising on any necessary infrastructure changes.
  • Developing data analytics models (Python, Spark).

The offer

  • Salary based on B2B contract
  • 100% remote work
  • Working for a leading corporation with a stable market position
  • Benefits package
  • Cooperation with an international team

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Human Resources, Staffing & Recruiting

Soft Skills

  • Analytical Thinking
