At Parexel, we all share the same goal - to improve the world's health. From clinical trials to regulatory, consulting, and market access, every clinical development solution we provide is underpinned by something special - a deep conviction in what we do.
Each of us, no matter what we do at Parexel, contributes to the development of a therapy that ultimately will benefit a patient. We take our work personally, we do it with empathy, and we're committed to making a difference.
Parexel is seeking a highly experienced Senior Software Engineer to architect, develop, and optimize enterprise-grade data pipelines and platforms using Azure, Databricks, Snowflake, Denodo, and Power BI. This role is pivotal in transforming raw data into actionable insights and building a resilient, scalable data ecosystem that supports business-critical functions across clinical and operational domains.
Key Responsibilities:
Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.
Design and maintain virtualization layers using Denodo to ensure seamless data access across varied source systems.
Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.
Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.
Lead cost optimization and performance tuning initiatives for cloud data workflows across Azure and Snowflake.
Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.
Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.
Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
Required Qualifications:
Experience: 7+ years of data engineering experience, including at least 4 years of hands-on experience with Azure, Databricks, and Snowflake; experience with Denodo and Power BI integration is highly desirable.
Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
Skills:
Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.
Strong understanding of data virtualization concepts (Denodo experience preferred).
Proven experience in building BI-ready datasets and performance tuning in Power BI.
Proficient in SQL, Python, and cloud-native architecture.
Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
When our values align, there's no limit to what we can achieve.