
Principal Data Engineer

Requirements

  • 8+ years of experience in data engineering, data warehousing, or cloud data architecture.
  • 3+ years hands-on Snowflake experience including architecture, performance tuning, and security.
  • Experience building secure patient/prescription data marts with HIPAA and Safe Harbor compliance.
  • Proven experience building ETL/ELT pipelines using Azure Data Factory, Airflow/Dagster, Airbyte, and dbt, with strong SQL and Python skills.

Roles & Responsibilities:

  • Lead the implementation of Snowflake solutions (environments, schema design, warehouses, RBAC, ingestion and transformation pipelines).
  • Build scalable ETL/ELT pipelines using ADF, Airflow/Dagster, Airbyte, and dbt, applying medallion architecture to ensure data quality, lineage, and accessibility.
  • Own end-to-end data lifecycle from ingestion and transformation to reporting and governance; collaborate with product, data science, and analytics to define data models for dashboards, experimentation, and AI/ML.
  • Create and manage Power BI reports/dashboards; drive performance tuning, cost optimization, security enforcement in Snowflake; implement data quality, monitoring, governance, CI/CD, and mentoring.

Job description

Who is Prescryptive?

Prescryptive is the healthcare technology company enabling the direct access marketplace for prescription drugs. Our platform aligns incentives so affordability, choice, and patient access become the natural outcome of a functioning system. Learn more about us by following us on LinkedIn or visiting Prescryptive.com.  

About this role

As a Principal Data Engineer, you will play a key role in developing our modern data infrastructure and a strategic role in building a cutting-edge analytical data and AI platform. You will be responsible for the architecture, design, and implementation of secure, scalable, high-performance ETL/ELT data pipelines and enterprise data models using Snowflake, helping power intelligent, data-driven products and customer experiences.

As a senior technical expert, you will collaborate cross-functionally with engineering, analytics, and product teams to design data systems that are secure, efficient, and cost-effective. You will also mentor junior engineers and help define engineering standards and best practices across our data ecosystem. 

What you will do

  • Lead the implementation of Snowflake solutions, including environment setup, schema design, warehouse configuration, RBAC, and ingestion and transformation pipelines.
  • Build scalable and maintainable ETL/ELT pipelines using tools and frameworks such as Azure Data Factory (ADF), Airflow/Dagster, and dbt, with SQL and Python.
  • Design and implement robust data integration pipelines to ingest, process, and unify structured and unstructured data from a variety of sources, including APIs, cloud storage systems, external websites, and relational/non-relational databases, using a modern medallion architecture (bronze, silver, gold layers) to ensure data quality, lineage, and accessibility across the organization.
  • Lead efforts in migrating legacy data systems to a modern cloud-based stack centered on Snowflake.
  • Own the end-to-end data lifecycle, from ingestion and transformation to reporting and governance.
  • Collaborate with product, data science, and analytics teams to define data models that support analytical dashboards, reporting, experimentation, and AI/ML initiatives.
  • Create and manage reports and dashboards using Power BI to provide actionable insights to the team and leadership.
  • Drive performance tuning, cost optimization, and security enforcement within the Snowflake platform.
  • Utilize SQL and Python for ETL, data manipulation, analysis, and process automation.
  • Implement data quality, monitoring, and governance frameworks across the data pipeline.
  • Ensure data security, privacy, and compliance with relevant regulations (e.g., GDPR, CCPA).
  • Implement and manage Snowflake's security features, such as access controls, encryption, and data masking.
  • Drive initiatives to automate and improve data operations, including testing, CI/CD deployment, and documentation.
  • Provide technical leadership and mentorship to a growing team of data engineers.
  • Participate in long-term strategic planning, providing technical insights and recommendations to support business growth. 

What you will bring

  • 8+ years of experience in data engineering, data warehousing, or cloud data architecture.
  • 3+ years of hands-on experience with Snowflake, including architecture, performance tuning, and security.
  • Expertise in Snowflake performance tuning, warehouse sizing, and cost optimization.
  • Experience in building secure patient and prescription data marts with strong adherence to HIPAA and Safe Harbor compliance requirements.
  • Advanced proficiency in SQL and data modeling (star, snowflake, 3NF, and/or Data Vault).
  • Proven experience building ETL/ELT pipelines with orchestration tools such as Azure Data Factory, Airflow/Dagster, Airbyte, and dbt.
  • Strong understanding of data warehousing concepts, data modeling techniques (Kimball, Data Vault 2.0), and architecture best practices.
  • Experience with cloud platforms such as Azure, AWS or GCP (especially Snowflake integrations, storage and data services).
  • Proficiency in SQL and Python or another scripting language used in data engineering.
  • Understanding of, and hands-on experience with, enabling data privacy, governance, and compliance frameworks (HIPAA, GDPR, etc.).
  • Experience with data orchestration tools.
  • Expertise in setting up CI/CD pipelines and DevOps practices for data projects (e.g., Git, Terraform).
  • Skilled in cost-effective cloud resource allocation and management. 
  • Strong testing capabilities (unit, integration, end-to-end).
  • Strong understanding of OLTP vs. OLAP.
  • 5+ years of experience with Power BI for data visualization and reporting.

Ideally you will also have

  • Experience with machine learning approaches.
  • Snowflake SnowPro Certification.
  • Familiarity with real-time data streaming tools such as Kafka.
  • Experience with Claude Code or similar AI-assisted coding tools.
  • Experience with MongoDB, FHIR data standards, and integrating Salesforce data into a Snowflake data warehouse (a strong plus).

What we have to offer

  • The opportunity to grow alongside an early-stage company shaking up a big, old-fashioned industry
  • Flexible time off, including 12 paid holidays
  • 401k match plus 100% employer paid medical, dental, and vision premiums
  • Company contribution to Health Savings Account
  • Stock options

Prescryptive is committed to fair pay practices. The projected annual salary for this position is $177,000 to $215,000. When preparing an offer, we consider the candidate's resume, experience, interview feedback, internal equity, and location.

Prescryptive is an Equal Opportunity Employer. Prescryptive does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
