
Data Engineer - TMD


Job description

Our client represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and Society to Rise™.

They are a USD 6 billion company with 163,000+ professionals across 90 countries, serving 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies, including 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, and Artificial Intelligence, to enable end-to-end digital transformation for global customers.

Our client is one of the fastest-growing brands and among the top 7 IT service providers globally. They have consistently emerged as a leader in sustainability and were recognized in Corporate Knights' 2021 Global 100 Most Sustainable Corporations in the World ranking.

We are currently searching for a Data Engineer.

Responsibilities

  • Infrastructure as Code (IaC): Architect and maintain GCP data infrastructure using Terraform to ensure consistency across environments (Dev, Stage, Prod).
  • Lakehouse Architecture: Design enterprise-grade storage and compute layers using BigQuery, GCS, and Dataproc.
  • Automated Provisioning: Develop CI/CD pipelines for automated deployment of data resources (Pub/Sub topics, Dataflow jobs, Cloud Composer environments).
  • Data Modeling: Create canonical and domain-specific data models to support AI/ML and operational products.
  • Cross-Cloud Connectivity: Implement BigQuery Omni to enable federated queries across different cloud providers without moving data.
  • Self-Service Enablement: Build federated data layers that allow downstream consumers to access data products autonomously.
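
As a rough illustration of the Infrastructure-as-Code responsibility above: Terraform also accepts JSON-formatted configuration, so data resources such as a BigQuery dataset can be rendered programmatically and checked into CI/CD. The sketch below is a minimal, hypothetical example (the project, dataset, and location names are invented and not part of any actual client stack):

```python
import json

def bigquery_dataset_tf(name: str, project: str, location: str = "US") -> dict:
    """Render a Terraform JSON-syntax block for a google_bigquery_dataset resource."""
    return {
        "resource": {
            "google_bigquery_dataset": {
                name: {
                    "dataset_id": name,
                    "project": project,
                    "location": location,
                }
            }
        }
    }

# Example: emit a main.tf.json that `terraform plan` can consume directly.
config = bigquery_dataset_tf("curated_sales", "my-demo-project")
print(json.dumps(config, indent=2))
```

In a real pipeline, a file like this would be generated (or hand-written in HCL), validated, and promoted through Dev, Stage, and Prod workspaces so all three environments stay consistent.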

Requirements

  • GCP Data Engineer with Terraform and Infrastructure as Code (IaC) experience.
  • Architect and design an enterprise-grade, GCP-based data lakehouse leveraging BigQuery, GCS, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and BigQuery Omni.
  • Define data ingestion, hydration, curation, processing, and enrichment strategies for large-scale structured, semi-structured, and unstructured datasets.
  • Create data domain models, canonical models, and consumption-ready datasets for analytics, AI/ML, and operational data products.
  • Design federated data layers and self-service data products for downstream consumers.

Must

  • Terraform (Mandatory)

Languages

  • Advanced Oral English.
  • Native Spanish.


If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings on the Sequoia Careers page: https://www.sequoia-connect.com/careers/.



