
Senior Data Engineer (Remote)

Perks: extra holidays, extra parental leave
Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in data engineering
  • Expertise in SQL and Python for data processing
  • Proficiency with SQL-based warehouses like BigQuery or Snowflake
  • Experience with data processing frameworks such as Apache Spark

Key responsibilities:

  • Owning projects from inception to completion
  • Orchestrating data pipelines using tools like Apache Airflow
  • Collaborating with stakeholders to convert requirements into technical specifications
  • Participating in an on-call support rotation.

AllTrails | Information Technology & Services (SME) | https://www.alltrails.com/
51 - 200 Employees

Job description

About AllTrails

AllTrails is the most trusted and used outdoors platform in the world. We help people explore the outdoors with hand-curated trail maps along with photos, reviews, and user recordings crowdsourced from our community of millions of registered hikers, mountain bikers, and trail runners in 150 countries. AllTrails is frequently ranked as a top-5 Health and Fitness app and has been downloaded by over 75 million people worldwide.

Every day, we solve incredibly hard problems so that we can get more people outside having healthy, authentic experiences and a deeper appreciation of the outdoors. Join us!  

This is a U.S.-based remote position. San Francisco Bay Area employees are highly encouraged to come into the office one day a week.

Requirements:
  • 5+ years of experience working in data engineering
  • Proficiency in working with other stakeholders and converting requirements into detailed technical specifications; owning projects from inception to completion
  • Expertise in using both SQL and Python for data cleansing, transformation, modeling, pipelining, etc.
  • Proficiency in working with high volume datasets in SQL-based warehouses such as BigQuery, Redshift, Snowflake, or others, preferably using ELT tools like Dataform or dbt
  • Experience with parallelized data processing frameworks such as Google Dataflow, Apache Spark, etc.
  • Deep understanding of data modeling, access, storage, caching, replication, and optimization techniques
  • Ability to orchestrate data pipelines through tools such as Apache Airflow
  • Experienced in container and pod orchestration (e.g. Docker, Kubernetes)
  • Understanding of the software development lifecycle and CI/CD
  • Monitoring and metrics-gathering (e.g. Datadog, New Relic, CloudWatch)
  • Willingness to participate in an on-call support rotation (currently monthly)
  • Proficiency with git and working collaboratively in a shared codebase
  • Excellent documentation skills
  • Self-motivation and a deep sense of pride in your work
  • Passion for the outdoors
  • Comfort with ambiguity, and an instinct for moving quickly
  • Humility, empathy and open-mindedness - no egos
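The SQL-plus-Python cleansing and transformation work the requirements describe can be sketched minimally. The example below uses Python's built-in sqlite3 as a stand-in for a SQL warehouse such as BigQuery or Snowflake; the `raw_events` table and its columns are purely illustrative, not part of the role:

```python
import sqlite3

# In-memory SQLite stands in for a SQL-based warehouse (BigQuery, Snowflake, etc.);
# the events table and its columns are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, trail TEXT, duration_min REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [
        (1, "Eagle Peak", 42.0),
        (1, "Eagle Peak", 42.0),   # duplicate row, removed during cleansing
        (2, None, 30.0),           # missing trail name, filtered out below
        (3, "Mist Trail", 95.5),
    ],
)

# Cleansing + transformation in SQL: drop nulls, deduplicate, aggregate per trail.
rows = conn.execute(
    """
    SELECT trail, COUNT(DISTINCT user_id) AS hikers, AVG(duration_min) AS avg_min
    FROM (SELECT DISTINCT user_id, trail, duration_min
          FROM raw_events
          WHERE trail IS NOT NULL)
    GROUP BY trail
    ORDER BY trail
    """
).fetchall()
print(rows)  # [('Eagle Peak', 1, 42.0), ('Mist Trail', 1, 95.5)]
```

The dedupe-then-aggregate pattern shown here is the same shape that ELT tools like dbt or Dataform encode as staged models against a production warehouse.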

Bonus Points:
  • Experience working in a multi-cloud environment
  • Experience working with a data stack in Google Cloud Platform
  • Experience with Amplitude
  • Experience with infrastructure-as-code, such as Terraform
  • Experience with machine learning frameworks and platforms such as Vertex AI, SageMaker, or MLflow

What We Offer:
  • A competitive and equitable compensation plan. This is a full-time, salaried position that includes equity
  • Physical & mental well-being including health, dental and vision benefits
  • Trail Days: No meetings first Friday of each month to go test the app and explore new trails!
  • Unlimited PTO
  • Flexible parental leave 
  • Remote employee equipment stipend to create a great remote work environment
  • Annual continuing education stipend
  • Discounts on subscription and merchandise for you and your friends & family
  • An authentic investment in you as a human being and your career as a professional

Nature celebrates you just the way you are and so do we! At AllTrails we’re passionate about nurturing an inclusive workplace that values diversity. It’s no secret that companies that are diverse in background, age, gender identity, race, sexual orientation, physical or mental ability, ethnicity, and perspective are proven to be more successful. We’re focused on creating an environment where everyone can do their best work and thrive.

    AllTrails participates in the E-Verify program for all remote locations.
    By submitting my application, I acknowledge and agree to AllTrails' Job Applicant Privacy Notice.

    Required profile

    Experience

    Industry:
    Information Technology & Services
    Spoken language(s):
    English

    Other Skills

    • Humility
    • Communication
    • Open Mindset
    • Empathy
    • Self-Motivation
