Sr Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Studies in computer science, engineering, or a related field.
  • 5+ years of hands-on experience in data engineering, with at least 2 years working with BigQuery or Snowflake.
  • Strong programming skills in Python for data processing and automation.
  • Advanced proficiency in SQL for querying and transforming large datasets.

Key responsibilities:

  • Contribute to the design and architecture of a lakehouse solution using technologies like Iceberg, Snowflake, and BigQuery.
  • Build and maintain robust ETL/ELT workflows using Python and SQL for structured and semi-structured data.
  • Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data.
  • Monitor, troubleshoot, and improve the performance of data systems and pipelines.

Blend360 · Professional Services · Scaleup · https://www.blend360.com/
501 - 1000 Employees

Job description

Company Description

Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com.

Job Description

We are looking for an experienced Senior Data Engineer with a strong foundation in Python, SQL, and Snowflake, and hands-on expertise in BigQuery and Databricks. In this role, you will build and maintain scalable data pipelines and architecture to support analytics, data science, and business intelligence initiatives. You’ll work closely with cross-functional teams to drive data reliability, quality, and performance.
Responsibilities:

  • Contribute to the design and architecture of a lakehouse solution, potentially leveraging technologies such as Iceberg, Snowflake, and BigQuery.
  • Build and maintain robust ETL/ELT workflows using Python and SQL to handle structured and semi-structured data (a short illustrative sketch follows this list).
  • Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data.
  • Ensure data quality, governance, security, and compliance across pipelines and data stores.
  • Monitor, troubleshoot, and improve the performance of data systems and pipelines.
  • Participate in code reviews and help establish engineering best practices.
  • Mentor junior data engineers and support their technical development.
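
By way of illustration, the ETL/ELT work described above typically pairs a Python loading step with a SQL transform. The sketch below is a minimal, hypothetical example using the google-cloud-bigquery client; the project, bucket, dataset, and table names are placeholders and not part of this role.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project

    # Load semi-structured, newline-delimited JSON from GCS into a raw table.
    load_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    client.load_table_from_uri(
        "gs://example-bucket/events/*.json",   # hypothetical source path
        "example_dataset.raw_events",          # hypothetical raw table
        job_config=load_config,
    ).result()  # block until the load job finishes

    # SQL transform: deduplicate events into a curated table for analysts.
    client.query("""
        CREATE OR REPLACE TABLE example_dataset.curated_events AS
        SELECT event_id, event_ts, user_id
        FROM example_dataset.raw_events
        WHERE event_id IS NOT NULL
        QUALIFY ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_ts DESC) = 1
    """).result()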

Qualifications
  • Studies in computer science, engineering, or a related field.
  • 5+ years of hands-on experience in data engineering, with at least 2 years working with BigQuery or Snowflake.
  • Strong programming skills in Python for data processing and automation.
  • Advanced proficiency in SQL for querying and transforming large datasets.
  • Solid understanding of data modelling, warehousing, and performance optimization techniques.
  • Proven experience in data cataloging and inventorying large-scale datasets.
  • Hands-on experience implementing and working with Medallion architecture in data lakehouse environments.
  • Experience with Iceberg, Data Mesh, or dbt is a plus.

Required profile

Experience

Industry: Professional Services
Spoken language(s): English

Other Skills

  • Mentorship
