Data Engineer

Benefits: extra holidays, extra parental leave
Work set-up: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 3-5+ years of experience in data engineering and data architecture.
  • Proficiency in SQL, dbt, and Python.
  • Experience with building and optimizing data pipelines and models.
  • Knowledge of data modeling, automation, and monitoring best practices.

Key responsibilities:

  • Design and build data pipelines from multiple sources.
  • Develop data models for product analytics and revenue tracking.
  • Automate data workflows and establish observability.
  • Communicate technical solutions to non-technical stakeholders.

Inclusion Cloud · Information Technology & Services · SME · https://inclusioncloud.com/
201 - 500 Employees

Job description

In this role, you will:

  • Perform Product data engineering in partnership with Software Engineering to build new Product features and enhancements.
  • Build Product and Customer Analytic Data Architecture: Build the data infrastructure necessary to analyze & support Product strategy.
  • Build and Optimize Data Pipelines: Design, build, and deploy data pipelines from multiple source systems; refactor existing pipelines for performance.
  • Perform Data Modeling: Implement data modeling best practices in dbt.
  • Automate and Monitor Data Processes: Document data workflows, automate pipelines, and establish observability and monitoring to ensure availability of analytic & Product data (a minimal sketch follows this list).
  • Present and Communicate Technical Solutions: Clearly communicate technical topics and data solutions to non-technical stakeholders to achieve buy-in.
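
As a rough illustration of the automation and observability responsibility above, a pipeline step can log row counts and run duration so that volume drops and slowdowns become visible. The sketch below is a minimal, hypothetical Python example; the function names and sources are placeholders, not part of this role's actual stack.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("product_events_pipeline")


def extract_product_events() -> list[dict]:
    # Placeholder: in practice this would read from a source system
    # (e.g. application telemetry or a change data capture feed).
    return [{"event": "page_view", "user_id": 1}]


def load_to_warehouse(rows: list[dict]) -> None:
    # Placeholder: in practice this would write to the warehouse
    # (e.g. Postgres or Snowflake) ahead of dbt transformations.
    pass


def run_pipeline() -> None:
    started = time.monotonic()
    rows = extract_product_events()
    load_to_warehouse(rows)
    elapsed = time.monotonic() - started
    # Emitting row counts and duration is a basic observability signal;
    # alerts can fire when row_count drops unexpectedly or duration spikes.
    log.info("product_events load complete: row_count=%d duration_s=%.1f",
             len(rows), elapsed)


if __name__ == "__main__":
    run_pipeline()
```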

Required Skills

  • 3-5+ years of experience in data engineering and data architecture.
  • Experience working with Software Engineering and DevOps as part of the Product development lifecycle.
  • Demonstrated experience building data models for capturing software application telemetry and website traffic for Product usage & site visitor analytics.
  • Demonstrated experience building data models for capturing billing and subscriber transactions to support revenue and retention analytics.
  • Expertise in SQL, dbt, Python, batch & streaming processing, change data capture, orchestration, scheduling, and data migration best practices; experience with Postgres, Snowflake, Airflow, and Dagster preferred (see the orchestration sketch after this list).
  • Hands-on expertise designing data quality and pipeline observability, monitoring, and alerting best practices.
  • Knowledge of or experience with advertising/adtech business models preferred.
  • Ability to integrate analytic data pipelines into Business Intelligence tools - Looker preferred.
  • Knowledge of or experience with GitHub for code management.
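
For the orchestration and scheduling item above, a minimal daily Airflow DAG might look like the sketch below (assuming Airflow 2.x). The DAG id, task ids, and callables are hypothetical placeholders rather than anything specified in this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_billing_events(**context) -> None:
    # Placeholder: pull billing/subscriber transactions from the source system.
    pass


def run_dbt_models(**context) -> None:
    # Placeholder: trigger dbt to rebuild revenue and retention models.
    pass


with DAG(
    dag_id="billing_revenue_daily",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_billing_events",
        python_callable=extract_billing_events,
    )
    transform = PythonOperator(
        task_id="run_dbt_models",
        python_callable=run_dbt_models,
    )
    extract >> transform  # transforms run only after extraction succeeds
```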

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Communication
