Extensive experience in data development with expert-level SQL and programming skills, preferably in Python.
Demonstrated proficiency with modern data tools such as Snowflake, DBT, and workflow automation tools like Airflow.
Experience in implementing structured data models and collaborating with data analysts and scientists.
Ability to navigate ambiguous problems and shape the role as the first Data Engineer in the organization.
Key responsibilities:
Own and maintain the Snowflake data warehouse and DBT models, ensuring access to clean and accurate data.
Build scalable data pipelines and models to support various teams in making data-driven decisions.
Collaborate with cross-functional teams to translate complex needs into actionable data architectures.
Establish best practices for data quality and reliability, driving innovation in data solutions.
Honeycomb provides full-stack observability, designed for high-cardinality data and collaborative problem solving, enabling engineers to deeply understand and debug production software together. Founded on the experience of debugging problems at the scale of millions of apps serving tens of millions of users, we empower every engineer to instrument and query the behavior of their system.
Honeycomb is the observability platform for teams who manage software that matters. Send any data to our one-of-a-kind data store, solve problems with all the relevant context, and fix issues before your customers find them. Honeycomb is the unified, fast, and collaborative choice for engineering teams who care about customer experience to get the answers they need, quickly. We are passionate about consumer-quality developer tools and excited to build technology that raises our industry’s expectations of what our tools can do for us. We’re working with well-known companies such as HelloFresh, Slack, LaunchDarkly, and Vanguard across a range of industries. This is an exciting time in our trajectory, as we’ve closed Series D funding, scaled past the 200-person mark, and were named to Forbes’ America’s Best Startups of 2022 and 2023! If you want to see what we’ve been up to, please check out these blog posts and Honeycomb.io press releases.
Who We Are
We come for the impact, and stay for the culture! We’re a talented, opinionated, passionate, fiercely inclusive, and responsible group of bees. We have conviction and we strive to live our values every day. We want our people to do what they truly love amongst a team of highly talented (but humble) peers.
How We Work
We are a remote-first company, which means we believe it is not where you sit, but how you deliver, that matters most. We invest in our people and care about how you orient to our culture and processes. At the same time, we extend a great deal of trust, autonomy, and accountability from Day 1. #LI-Remote
The Role
As our very first Senior Data Engineer, you’ll have a unique opportunity to lay the foundation for Honeycomb’s data-driven future. Partnering directly with the Head of Data, you will architect and build a modern, scalable data platform that not only powers our business-critical insights but also sets the standard for data quality and reliability across the organization.
What You’ll Do in the Role:
Own the Data Platform: Take full ownership of our Snowflake data warehouse, DBT models, and diverse ingestion platform. You’ll design and maintain end-to-end solutions that enable access to clean, accurate and well-annotated data.
Build Scalable Systems: Leverage modern technologies to create robust, production-grade data pipelines and models. Your work will enable rapid iteration and empower teams from R&D to Sales, Marketing, Finance, and beyond to make informed, data-driven decisions and have ownership over their data.
Collaborate Across Functions: Work hand-in-hand with engineering, product, sales, marketing, and business stakeholders to translate complex needs into aligned data architectures and actionable insights. Your collaborative spirit will help bridge gaps and foster a culture of shared success.
Drive Innovation and Quality: Establish best practices for data quality and reliability by setting meaningful SLO metrics and continuously refining our systems (a minimal example of such a check is sketched after this list). You’ll have the autonomy to experiment with new technologies and approaches, driving innovation in a fast-paced, evolving environment.
Lead with Impact: From planning and deployment to long-term maintenance, you’ll lead critical projects with a keen sense of ownership and strategic vision. Your ability to balance technical excellence with business value will be key to our next phase of growth.
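To make the idea of an SLO-style data quality check concrete, here is a minimal, purely illustrative sketch in Python. It is not Honeycomb’s actual tooling: the table, column, and threshold names are hypothetical, and the cursor can be any DB-API 2.0 cursor (for example, one obtained from the Snowflake Python connector).

    from datetime import datetime, timedelta, timezone

    def check_freshness(cursor, table: str, ts_column: str, max_lag: timedelta) -> bool:
        """Return True if the newest row in `table` arrived within `max_lag` of now."""
        # MAX() over a load-timestamp column tells us when the table last received data.
        cursor.execute(f"SELECT MAX({ts_column}) FROM {table}")
        (latest,) = cursor.fetchone()
        if latest is None:
            return False  # An empty table counts as a freshness violation.
        if latest.tzinfo is None:
            # Assume timestamps are stored in UTC if the driver returns naive datetimes.
            latest = latest.replace(tzinfo=timezone.utc)
        return datetime.now(timezone.utc) - latest <= max_lag

    # Hypothetical usage: fail loudly if orders data is more than six hours stale.
    # if not check_freshness(cur, "analytics.fct_orders", "loaded_at", timedelta(hours=6)):
    #     raise RuntimeError("Freshness SLO violated for analytics.fct_orders")

In practice a check like this would typically live in DBT tests or an Airflow task rather than a standalone script; the sketch is only meant to illustrate the shape of the metric.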
If you are a seasoned data professional with a passion for creating scalable, robust data solutions and enjoy solving complex problems through innovative thinking, we’d love to have you help shape the future of Honeycomb. Join us, and be at the forefront of transforming our data capabilities while making a lasting impact across the entire organization.
What You'll Bring:
Extensive data development experience, including expert-level SQL and programming experience in a scripting language (preferably Python)
Demonstrated experience with modern data tooling, including MPP data warehouses (e.g. Redshift, or preferably Snowflake), DBT, and workflow automation tools (e.g. Airflow, Dagster, Prefect)
Experience implementing structured data models, architectures, and marts (e.g. Inmon, Kimball)
Experience collaborating with data analysts, data scientists, and business users with varying levels of data savvy
Comfortable working through ambiguous problems; this is our first data engineering hire, so there will be a fair amount of role shaping
Bonus / Preferred experience:
Experience with any of the following: Spark, Scala, Terraform, AWS/K8s, Debezium/Flink
Experience managing production-grade data pipelines powering customer-facing applications
Exposure to MLOps and supporting the data requirements of ML/AI teams
Experience working with CRM, Martech, and other GTM datasets and systems
What You Get When You Join the Hive
Base pay range of $170,000 - $200,000 USD (CAD $233,504 - $274,710)
A stake in our success - generous equity with an employee-friendly stock program
It’s not about how strong of a negotiator you are - our pay is based on transparent levels relative to experience
Time to recharge - Unlimited PTO and paid sabbatical
A remote-first mindset and culture (really!)
Home office, co-working, and internet stipend
100% benefits coverage for employees and 75% for dependents
Up to 16 weeks of paid parental leave, regardless of path to parenthood
Annual development allowance
And much more...
Please note we are unable to sponsor visas or process visa transfers at this time.
Diversity & Accommodations:
We're building a diverse and inclusive workplace where we learn from each other, and we welcome nontraditional candidates and people of all backgrounds, experiences, abilities, and perspectives. You don't need to be a millennial to join us; all generations are welcome! Further, we (of course) follow federal and state disability laws and are happy to provide reasonable accommodations during the application phase, interview process, and employment. Please email Talent@honeycomb.io to discuss accessible formats or accommodations. As an equal opportunity employer, we design our hiring process to put you at ease and help you show your best work; if we can do better, we want to know!