Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
  • 7+ years of data engineering experience, including 5+ years on GCP.
  • Proven expertise in GCP services such as BigQuery, Dataflow, Cloud Composer, and Cloud Functions.
  • Strong skills in Python, SQL, Apache Spark, and Apache Airflow.

Key responsibilities:

  • Design and optimize scalable ELT/ETL pipelines for structured and unstructured data.
  • Build cloud-native data workflows using GCP services.
  • Develop high-throughput Spark workloads and parameterized DAGs in Airflow.
  • Collaborate with cross-functional teams and communicate technical solutions.

Dentsu Media · Marketing & Advertising · 10,001+ employees · https://www.dentsu.com/

Job description

Job Title: Lead GCP Data Engineer (Senior Level)

Reports to: SVP, Head of Data, Technology & Analytics
Location: Remote – Global (must be available through 2 p.m. U.S. Eastern Time)
Employment Type: Full-time • Long-term Contract (Annual Renewal)

Key Responsibilities

Data Engineering & Development

  • Design, build, and optimize scalable ELT/ETL pipelines to process structured and unstructured data across batch and streaming systems.
  • Architect and deploy cloud-native data workflows using GCP services including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
  • Build high-throughput Apache Spark workloads in Python and SQL, with performance tuning for scale and cost.
  • Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring (see the sketch after this list).
  • Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
  • Translate business and product requirements into scalable, efficient data systems that are reliable and secure.
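
For illustration only, below is a minimal sketch of a parameterized Airflow DAG with retries, SLA enforcement, and an alerting callback, assuming Airflow 2.4+; the DAG id, schedule, source table, and callback bodies are hypothetical placeholders rather than part of any existing pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify(*args, **kwargs):
    # Hypothetical alerting hook; in practice this might post to Slack or PagerDuty.
    print("Airflow alert:", args, kwargs)


def extract_partition(ds: str, source_table: str) -> None:
    # Hypothetical extraction step for a single daily partition.
    print(f"Extracting partition {ds} from {source_table}")


default_args = {
    "owner": "data-engineering",
    "retries": 3,                           # retry logic
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=2),              # per-task SLA
    "on_failure_callback": notify,          # alerting on task failure
}

with DAG(
    dag_id="daily_ingest_example",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",
    catchup=False,
    default_args=default_args,
    sla_miss_callback=notify,               # alerting on SLA misses
    params={"source_table": "raw.events"},  # runtime-overridable parameter
) as dag:
    PythonOperator(
        task_id="extract_partition",
        python_callable=extract_partition,
        op_kwargs={
            "ds": "{{ ds }}",
            "source_table": "{{ params.source_table }}",
        },
    )
```

Keeping retries, the SLA, and the failure callback in default_args means every task added to the DAG inherits the same guardrails, while the DAG-level params let the same workflow be re-run against a different source table.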

Cloud Infrastructure & Security

  • Deploy and manage GCP infrastructure with IAM- and VPC-based security controls for secure data operations.
  • Ensure robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
  • Apply automation through CI/CD pipelines using tools like Git, Jenkins, or Bitbucket.

Data Quality, Governance & Optimization

  • Design and implement data quality frameworks for monitoring, validation, and anomaly detection (a brief sketch follows this list).
  • Build observability dashboards to ensure pipeline health and proactively address issues.
  • Ensure compliance with data governance policies, privacy regulations, and security standards.
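
As a small illustration of the validation and anomaly-detection checks described above, the snippet below runs a row-count and null-rate check against a BigQuery partition using the google-cloud-bigquery client; the project, table, column names, and thresholds are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

TABLE = "my_project.analytics.daily_events"  # hypothetical fully qualified table

QUERY = f"""
SELECT
  COUNT(*) AS row_count,
  COUNTIF(user_id IS NULL) AS null_user_ids
FROM `{TABLE}`
WHERE event_date = @run_date
"""

job = client.query(
    QUERY,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
        ]
    ),
)
row = list(job.result())[0]

# Fail loudly if the partition is empty or the null rate breaches the threshold.
assert row.row_count > 0, "No rows loaded for this partition"
assert row.null_user_ids / row.row_count < 0.01, "NULL user_id rate above 1%"
```

In a production framework, checks like these would typically live in a reusable task and feed the observability dashboards mentioned above rather than being raised as bare assertions.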

Collaboration & Project Delivery

  • Work closely with cross-functional stakeholders including data scientists, analysts, DevOps, product managers, and business teams.
  • Effectively communicate technical solutions to non-technical stakeholders.
  • Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
  • Collaborate within a globally distributed team with real-time engagement through 2 p.m. U.S. Eastern Time.

Qualifications & Certifications

Education

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.

Experience

  • 7+ years in data engineering, including 5+ years of hands-on experience on GCP.
  • Proven track record with tools and services like BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
  • Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
  • Advanced Airflow skills including custom operators, dynamic DAGs, and workflow performance tuning.

Certifications

  • Google Cloud Professional Data Engineer certification preferred.

Key Skills

Mandatory Technical Skills

  • Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines (see the PySpark sketch after this list).
  • Strong SQL with experience in window functions, CTEs, partitioning, and optimization.
  • Proficiency in GCP services including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
  • Hands-on with Apache Airflow, including dynamic DAGs, retries, and SLA enforcement.
  • Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
  • Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
  • Experience with infrastructure security and governance using IAM and VPC.
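
For context, here is a brief PySpark sketch of the windowing and deduplication pattern implied by the Python and SQL skills above, assuming Spark 3.x; the input path and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window_example").getOrCreate()

# Hypothetical input: raw events landed in Cloud Storage as Parquet.
events = spark.read.parquet("gs://example-bucket/events/")

# Rank each user's events by timestamp and keep only the most recent one,
# the same logic a SQL ROW_NUMBER() window function would express.
w = Window.partitionBy("user_id").orderBy(F.col("event_ts").desc())

latest = (
    events
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

latest.write.mode("overwrite").parquet("gs://example-bucket/events_latest/")
```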

Nice-to-Have Skills

  • Experience with Terraform or Kubernetes (GKE).
  • Familiarity with data visualization tools such as Looker or Tableau.
  • Exposure to MarTech/AdTech data sources and campaign analytics.
  • Knowledge of machine learning workflows and their integration with data pipelines.
  • Experience with other cloud platforms like AWS or Azure.

Soft Skills

  • Strong problem-solving and critical-thinking abilities.
  • Excellent verbal and written communication skills to engage technical and non-technical stakeholders.
  • Proactive and adaptable, with a continuous learning mindset.
  • Ability to work independently as well as within a collaborative, distributed team.

Working Hours

  • Must be available for real-time collaboration with U.S. stakeholders every business day through 2 p.m. U.S. Eastern Time (minimum 4-hour overlap).

Location: DGS India - Bengaluru - Manyata H2 block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Required profile

Industry: Marketing & Advertising
Spoken language(s): English

