
Tech Specialist MAHIN-JOB-37180

Remote: Full Remote
Contract: Full Time
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Proven experience with GCP services
  • Proficiency in SQL and Python
  • Understanding of data warehousing concepts

Key responsibilities:

  • Develop data pipelines using GCP services
  • Implement and manage data warehouse solutions
  • Ensure data quality and integrity
  • Provision infrastructure with Terraform
  • Optimize data storage and retrieval processes
Keylent Inc Information Technology & Services SME https://www.keylent.com/
201 - 500 Employees

Job description


Tech Specialist MAHIN-JOB-37180
Position: GCP Data Engineer
Location: Remote
Full Time

We are seeking a highly skilled GCP Data Engineer to join our team. The ideal candidate will have extensive experience with Google Cloud Platform (GCP) services, particularly Dataflow, BigQuery, Pub/Sub, and related services. Additionally, the candidate should be proficient in provisioning infrastructure on GCP using Infrastructure as Code (IaC) with Terraform. This role will be critical in building and managing data pipelines, ensuring data quality, and optimizing data storage and retrieval processes.

Responsibilities:
• Design, develop, and maintain data pipelines using GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Implement and manage data warehouse solutions using BigQuery, ensuring data is stored securely and efficiently.
• Use Pub/Sub for real-time data ingestion and streaming analytics.
• Provision and manage GCP infrastructure using Terraform, ensuring best practices in IaC are followed.
• Optimize data storage and retrieval processes to enhance performance and reduce costs.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and reliability of data services.
• Ensure data quality and integrity through robust testing and validation processes.
• Stay updated with the latest GCP features and best practices, integrating them into existing workflows.
• Document data workflows, infrastructure setups, and processes for future reference and knowledge sharing.
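As an illustration of the "ensure data quality and integrity" responsibility above, here is a minimal, hypothetical sketch of the kind of validation step a pipeline might apply before loading records into BigQuery. The field names and function names (`REQUIRED_FIELDS`, `validate_record`, `split_valid_invalid`) are invented for this example and are not part of the posting.

```python
# Hypothetical validation step of the sort a data pipeline might run
# before loading rows into a warehouse; invalid rows are split off so
# they can be routed to a dead-letter sink instead of being dropped.
from datetime import datetime

# Illustrative schema: the fields every incoming event must carry.
REQUIRED_FIELDS = {"event_id", "user_id", "timestamp"}

def validate_record(record: dict) -> bool:
    """Return True if the record has all required fields and a parseable ISO timestamp."""
    if not REQUIRED_FIELDS <= record.keys():
        return False
    try:
        datetime.fromisoformat(record["timestamp"])
    except (TypeError, ValueError):
        return False
    return True

def split_valid_invalid(records):
    """Partition records into (valid, invalid) lists for downstream routing."""
    valid, invalid = [], []
    for r in records:
        (valid if validate_record(r) else invalid).append(r)
    return valid, invalid
```

In a real Dataflow job this partitioning would typically be expressed as a transform with multiple outputs; the sketch only shows the validation logic itself.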

Requirements:
• Proven experience as a Data Engineer with a focus on GCP services.
• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation and analysis.
• Solid understanding of data warehousing concepts and ETL processes.
• Experience with real-time data processing and streaming analytics.
• Strong problem-solving skills and attention to detail.
• Excellent communication and collaboration skills.
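To give a concrete flavor of the "proficiency in SQL and Python" requirement, the sketch below shows a standard event-deduplication pattern (`ROW_NUMBER() OVER (PARTITION BY ...)`) that is common in BigQuery SQL, demonstrated here with the stdlib sqlite3 module so it runs anywhere. The table and column names (`events`, `event_id`, `ingested_at`) are invented for the example.

```python
# Deduplicate events by keeping only the latest copy of each event_id.
# The same ROW_NUMBER window pattern is standard in BigQuery SQL;
# sqlite3 (Python stdlib) stands in for BigQuery in this sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT, user_id TEXT, ingested_at TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("e1", "u1", "2024-01-01T00:00:00"),
        ("e1", "u1", "2024-01-01T00:05:00"),  # duplicate delivery of e1
        ("e2", "u2", "2024-01-01T00:01:00"),
    ],
)

DEDUP_SQL = """
SELECT event_id, user_id, ingested_at FROM (
    SELECT *, ROW_NUMBER() OVER (
        PARTITION BY event_id ORDER BY ingested_at DESC
    ) AS rn
    FROM events
) WHERE rn = 1
ORDER BY event_id
"""
rows = conn.execute(DEDUP_SQL).fetchall()
```

Duplicate deliveries are routine with Pub/Sub's at-least-once semantics, which is why dedup queries like this show up so often in warehouse work.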

Preferred Qualifications:
• Google Cloud Professional Data Engineer certification.
• Experience with other GCP services like Cloud Storage, Cloud Functions, and Cloud Composer.
• Familiarity with other IaC tools such as Ansible or Cloud Deployment Manager.
• Experience with containerization technologies like Docker and Kubernetes.
• Knowledge of data security and governance practices.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Information Technology & Services

Soft Skills

  • Problem Solving
  • Verbal Communication Skills
  • Organizational Skills
  • Governance
