About Dr. Berg Nutritionals
Dr. Berg Nutritionals is one of the largest health education and supplement companies in the world, built around Dr. Eric Berg's YouTube channel (approximately 15 million subscribers, 7,000+ videos, and 111 million weekly views). The business generates $173M+ in annual revenue across Amazon, Shopify, Walmart, and TikTok Shop, with a rapidly growing subscription base.
We are a founder-led, Clearwater-based company with a lean engineering culture. Most of our custom infrastructure runs on Azure (App Services, PostgreSQL, Blob Storage) and is written in C# .NET, including our production AI agents built on Microsoft Agent Framework (Semantic Kernel and AutoGen). We use Microsoft 365 (Teams, SharePoint, Planner, Forms, Power Automate) for collaboration and Trello for some workflows.
Role Summary
We are building a unified data warehouse and AI intelligence system that will give our CEO and leadership team a single, trustworthy view of the business — financial performance by SKU and channel, customer lifetime value by acquisition source, video-to-commerce attribution across 7,000+ pieces of content, and the operational nervous system of the company.
The Senior Data Engineer is the foundational hire for this system. You will own the pipelines that move data from every source (Amazon SP-API, Shopify, NetSuite, Klaviyo, Recharge, YouTube, GA4, and others) into our warehouse reliably, accurately, and on time. Downstream of your work sit our analytics models, our AI strategic analyzer, and ultimately the weekly CEO Strategic Brief that drives decision-making across the company.
This is not a greenfield research role. We have a live business with real revenue that depends on data being right. If a pipeline breaks at 2am on Thursday, the CEO's Friday morning brief is wrong, and the decisions made from it are wrong. We are looking for someone who takes that responsibility seriously and has the craft to do it well.
What You'll Do
In Your First 90 Days
Ongoing Core Responsibilities
Build and maintain ingestion pipelines. You will own the end-to-end pipelines from source systems into our warehouse. This includes Amazon Selling Partner API, Shopify Admin API, NetSuite (SuiteAnalytics Connect), Klaviyo, Recharge, YouTube Data and Analytics APIs, GA4 (via BigQuery export), Google Ads, Meta Ads, Triple Whale, and approximately 15 additional sources across our Layer 1–5 data model. For each pipeline you will design the ingestion approach, build it with proper error handling and idempotency, establish incremental-load patterns where appropriate, and monitor it in production.
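To make the idempotency and incremental-load expectations concrete, here is a minimal Python sketch (our production stack is C# .NET, so treat this as illustrative only; the source API, field names, and in-memory "warehouse" are all hypothetical). Idempotency comes from upserting on a stable key, so replaying the same window never duplicates rows, and the watermark advances only after a successful load:

```python
from datetime import datetime, timezone

# Hypothetical in-memory "warehouse" table keyed by order_id; a real pipeline
# would MERGE into the warehouse instead. Upserting on the key makes the load
# idempotent: re-running the same window never duplicates rows.
warehouse: dict[str, dict] = {}

def fetch_orders(updated_since: datetime) -> list[dict]:
    # Stand-in for a paginated source-API call (e.g. a Shopify-style
    # "updated_at_min" filter); returns records changed after the watermark.
    sample = [
        {"order_id": "1001", "total": 49.0,
         "updated_at": datetime(2024, 5, 1, tzinfo=timezone.utc)},
        {"order_id": "1002", "total": 19.0,
         "updated_at": datetime(2024, 5, 2, tzinfo=timezone.utc)},
    ]
    return [r for r in sample if r["updated_at"] > updated_since]

def run_incremental_load(watermark: datetime) -> datetime:
    """Load all records changed since `watermark`; return the new watermark."""
    records = fetch_orders(watermark)
    for rec in records:
        warehouse[rec["order_id"]] = rec  # upsert: safe to replay
    # Advance the watermark only after the load succeeds.
    return max((r["updated_at"] for r in records), default=watermark)

wm = datetime(2024, 4, 30, tzinfo=timezone.utc)
wm = run_incremental_load(wm)  # loads both orders
wm = run_incremental_load(wm)  # replay is a no-op: no duplicates
```

The same shape applies whether the sink is PostgreSQL, Blob Storage, or a staging table feeding downstream models.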
Own orchestration and scheduling. You decide what runs when, in what order, and with what dependencies. Financial data needs to be fresh before finance’s morning reconciliation. YouTube analytics need to respect daily API quotas across 7,000+ videos. Klaviyo events need to stream continuously. This is your call to make — and your responsibility to get right.
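Dependency-ordered scheduling of this kind boils down to a topological sort over the pipeline graph. A small sketch using Python's standard library (task names here are hypothetical, not our actual job names):

```python
from graphlib import TopologicalSorter

# Illustrative dependency graph: finance models cannot run until both
# NetSuite and Shopify ingestion land, and the weekly brief depends on
# everything upstream of it.
deps = {
    "netsuite_ingest": set(),
    "shopify_ingest": set(),
    "youtube_ingest": set(),
    "finance_models": {"netsuite_ingest", "shopify_ingest"},
    "ceo_brief": {"finance_models", "youtube_ingest"},
}

# static_order() yields tasks so that every task appears after all of
# its dependencies; an orchestrator runs them in this order.
order = list(TopologicalSorter(deps).static_order())
assert order.index("finance_models") > order.index("netsuite_ingest")
```

Production orchestrators (Airflow, Dagster, Azure Data Factory, or a custom scheduler) add retries, quotas, and timing on top, but the dependency contract is the core decision this role owns.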
Monitoring, alerting, and on-call. Every pipeline you build needs health checks: row counts within expected ranges, schema validation, freshness SLAs, and data quality gates. You will configure Azure Monitor alerts, decide what pages someone overnight versus what can wait, and lead post-incident reviews. You will take part in a one-in-four weekly on-call rotation once the team is fully staffed.
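A minimal sketch of the health checks described above (thresholds and function names are illustrative, not our production gates): a check function returns the list of failed gates, and an empty list means the pipeline run is healthy enough to publish downstream.

```python
from datetime import datetime, timedelta, timezone

def check_pipeline_health(row_count: int, expected_range: tuple[int, int],
                          last_loaded_at: datetime,
                          freshness_sla: timedelta) -> list[str]:
    """Return a list of failed checks; an empty list means healthy."""
    failures = []
    lo, hi = expected_range
    # Gate 1: row counts within the expected range for this source.
    if not (lo <= row_count <= hi):
        failures.append(f"row_count {row_count} outside [{lo}, {hi}]")
    # Gate 2: freshness SLA — how stale is the latest successful load?
    age = datetime.now(timezone.utc) - last_loaded_at
    if age > freshness_sla:
        failures.append(f"data is {age} old, SLA is {freshness_sla}")
    return failures

# A load that finished 30 minutes ago with a typical row count passes...
fresh = datetime.now(timezone.utc) - timedelta(minutes=30)
assert check_pipeline_health(5_000, (1_000, 10_000), fresh,
                             timedelta(hours=6)) == []
# ...while a stale, suspiciously small load trips both gates.
stale = datetime.now(timezone.utc) - timedelta(hours=12)
assert len(check_pipeline_health(50, (1_000, 10_000), stale,
                                 timedelta(hours=6))) == 2
```

In practice the failure list would feed an Azure Monitor alert rule, with severity deciding what pages overnight versus what waits for morning.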
Performance and cost optimization. Our data volumes are substantial — YouTube analytics alone is 7,000+ videos × daily metrics × multiple channels. You will own partitioning strategy, query tuning, incremental processing patterns, and monthly cost reviews. At our scale, this work directly saves tens of thousands of dollars per year in warehouse compute.
Source system and vendor API management. When Shopify deprecates an endpoint, when Amazon changes reporting structure, when NetSuite releases a new ODBC driver — you're the person who reads the release notes, tests the change, and adapts the pipelines. You will own API keys, service accounts, rate-limit tracking, and vendor support escalations for data-source APIs.
Enforce data contracts. You define and enforce the contracts between source systems and downstream consumers — what fields exist, what's never null, what ranges are valid. When a source system violates its contract, your pipelines stop and alert rather than passing bad data downstream to our AI analyzer. This is what structurally prevents hallucinations.
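The fail-closed behavior matters more than the validation library. A toy sketch of the idea (the contract fields and exception name are hypothetical; production contracts would live alongside the pipeline code and cover every consumed field): when a record violates the contract, the pipeline raises and stops rather than writing bad data downstream.

```python
# Hypothetical contract for an orders feed: which fields must exist,
# which may never be null, and what ranges are valid.
CONTRACT = {
    "order_id": {"nullable": False},
    "total":    {"nullable": False, "min": 0.0},
}

class ContractViolation(Exception):
    """Raised to halt the pipeline instead of passing bad data downstream."""

def enforce_contract(record: dict) -> dict:
    for field, rules in CONTRACT.items():
        if field not in record:
            raise ContractViolation(f"missing field: {field}")
        value = record[field]
        if value is None and not rules["nullable"]:
            raise ContractViolation(f"{field} is null")
        if value is not None and "min" in rules and value < rules["min"]:
            raise ContractViolation(f"{field}={value} below {rules['min']}")
    return record

enforce_contract({"order_id": "1001", "total": 49.0})  # passes through
try:
    enforce_contract({"order_id": "1002", "total": -5.0})
except ContractViolation as e:
    caught = str(e)  # pipeline stops and alerts here
```

Because the AI analyzer only ever sees records that passed the contract, garbage inputs are stopped at the boundary instead of surfacing as confident-sounding wrong answers in the brief.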
Infrastructure-as-code and CI/CD. Pipelines are defined as code (Bicep, Terraform, or ARM templates) and deployed through peer review. You will own this practice, along with the dev/staging/production environment separation that lets us move fast without breaking the weekly brief.
What You Won't Do
It's worth being explicit about role boundaries so you know where your work ends and where teammates take over.
Qualifications, Experience & Skills
Required
Preferred
How We'll Know You're the Right Fit
Traits we're specifically looking for, beyond technical ability.
Work from Home Requirements
Compensation & Benefits
We are intentional about creating a work environment that supports both high-quality work and the people doing it.
Pay Range: Competitive, based on experience.
For employee (W-2) engagements: We offer a competitive benefits package and performance-based bonuses tied to outcomes.
Hours of Work: Must be available for communication during business hours, Monday – Friday, 9am – 6pm EST
Location: Fully remote; no travel to the Clearwater office is expected.
Engagement Type: Employee (W-2) full-time salaried, exempt OR Independent Contractor.
Dr. Berg Nutritionals is an equal-opportunity employer. We welcome applicants from all backgrounds and do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other protected characteristic.
