Data Engineer

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Expertise in Google Cloud Platform (GCP)
  • Deep experience with data warehousing and modeling
  • Hands-on experience with DBT and Python
  • Advanced proficiency in SQL for data manipulation
  • Experience with analytics tools such as Looker and Tableau

Key responsibilities:

  • Design, build, and maintain scalable data pipelines using GCP services
  • Develop and optimize SQL queries for ETL processes
  • Manage ETL workflows ensuring data integrity with DBT
  • Perform data modeling and create robust solutions for large datasets
  • Collaborate with teams to translate business requirements into technical solutions
Remote Choice
2 - 10 Employees

Job description

This is a remote position.

Job Description:
We are seeking a highly skilled and motivated Data Engineer with expertise in Google Cloud Platform (GCP), BigQuery, and data engineering best practices. The ideal candidate will have deep experience with data warehousing, data modeling, ETL processes, and hands-on experience with DBT and Python. This role will play a pivotal part in architecting and building data solutions that enable data-driven decision-making for the business.


Requirements

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and architectures using GCP services, including BigQuery, Airflow (Composer), and other tools.
  • Develop and optimize SQL queries and scripts for data extraction, transformation, and loading (ETL).
  • Implement and manage ETL workflows with DBT, ensuring data integrity and efficiency.
  • Perform data modeling and warehousing tasks, creating robust solutions to handle large datasets.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Work with analytics and visualization tools (Looker, Tableau, ThoughtSpot) to ensure accurate data representation for business users.
  • Write clean, maintainable, and efficient Python code for data processing tasks.
  • Troubleshoot, debug, and optimize data pipelines and queries to ensure high performance.
  • Ensure data quality, governance, and security across the organization’s datasets.

Required Skills and Experience:

  • Extensive experience with Google Cloud Platform (GCP), particularly BigQuery and Composer (Airflow).
  • Advanced proficiency in SQL for data manipulation and transformation.
  • Proven experience in data engineering, data modeling, and data warehousing.
  • Strong expertise in Python for data-related tasks.
  • Hands-on experience with DBT for building ETL workflows.
  • Experience with analytics and front-end tools such as Looker, Tableau, ThoughtSpot, or similar.
  • Strong problem-solving skills and the ability to work independently.
  • Excellent communication skills, with the ability to convey complex technical solutions to non-technical stakeholders.


Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Problem Solving
  • Verbal Communication Skills
