Our client is a leading advisory and consulting firm known for delivering innovative technology solutions across insurance, finance, banking, and utilities sectors. With a reputation for excellence, they combine strategic insight with technical expertise to drive impactful digital transformation. This role supports a major initiative within the insurance industry, integrating the Fineos Claims Management system with enterprise-grade data pipelines and APIs.
About the Role
We're seeking a Senior Data Pipeline Engineer to join our engineering team. In this role, you will design, build, and maintain robust data pipelines that integrate with the Fineos Claims Management system. You will play a crucial role in ensuring data is accurate, timely, and accessible for analytics and operational purposes.
The ideal candidate possesses in-depth data engineering expertise, strong problem-solving skills, and a passion for working with complex enterprise systems within the insurance domain. This is a hands-on, high-impact position where you will also mentor junior engineers and establish best practices within the team.
About the Project
You will work alongside cross-functional stakeholders—including business analysts, product managers, and QA testers—to build scalable data pipelines on AWS and lead Fineos API integrations. The project is focused on extracting, transforming, and securing claims data, ensuring enterprise-grade data quality and compliance with PII standards.
Key Responsibilities
Design and develop scalable data pipelines using Python and AWS services (e.g., S3, Lambda, Glue, Redshift).
Lead the integration of the Fineos Claims API to extract and transform claims data.
Collaborate with business analysts and product managers to translate data requirements into technical solutions.
Ensure data quality and integrity by monitoring, testing, and validating data pipelines.
Troubleshoot and resolve issues related to data flow, performance, and API connectivity.
Mentor junior engineers and contribute to data engineering best practices.
Define and document data models and schemas with stakeholders.
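As a rough illustration of the kind of pipeline work involved, here is a minimal sketch of a transform-and-validate step for claim records ahead of loading. All field names (claimId, lossDate, and so on) are hypothetical placeholders, not the actual Fineos schema.

```python
from datetime import date, datetime, timezone

# Fields every claim record must carry before it is loaded downstream.
# These names are illustrative, not the real Fineos data model.
REQUIRED_FIELDS = {"claimId", "status", "lossDate"}

def transform_claim(raw: dict) -> dict:
    """Validate a raw claim record and reshape it for loading.

    Raises ValueError on missing fields so bad records are caught
    at the transform stage rather than in the warehouse.
    """
    missing = REQUIRED_FIELDS - raw.keys()
    if missing:
        raise ValueError(f"claim record missing fields: {sorted(missing)}")
    return {
        "claim_id": str(raw["claimId"]),
        "status": raw["status"].strip().upper(),
        "loss_date": date.fromisoformat(raw["lossDate"]).isoformat(),
        "ingested_at": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    }
```

In a real pipeline a function like this would sit behind an extraction step (for example, a Lambda consuming the claims API) and feed a load step into S3 or Redshift.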
Must-Have Skills and Experience
Proven experience as a Data Pipeline Engineer or in a similar role.
Demonstrated expertise in Fineos Claims API integration and strong knowledge of the Fineos data model.
2+ years of hands-on experience in building and managing AWS-based pipelines.
Strong proficiency in Python for data manipulation and automation.
Practical experience with REST APIs for integration and automation.
Prior experience in the insurance industry, especially in claims systems.
Proven ability to securely handle PII data within automated processes.
Excellent communication skills to explain technical concepts to diverse audiences.
Self-starter with strong ownership and accountability.
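To give a flavour of the PII-handling requirement, the sketch below pseudonymises sensitive fields with a keyed hash, so records can still be joined across datasets without exposing raw values. It is a minimal illustration only: field names are hypothetical, and in practice the secret key would come from a secrets manager rather than application code.

```python
import hashlib
import hmac

def pseudonymise(value: str, secret: bytes) -> str:
    """Return a stable, non-reversible token for a PII value.

    HMAC-SHA256 with a secret key gives deterministic output (so joins
    still work) while preventing dictionary attacks on the raw values.
    """
    return hmac.new(secret, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_record(record: dict, pii_fields: set, secret: bytes) -> dict:
    """Replace PII fields with keyed-hash tokens; leave other fields intact."""
    return {
        k: pseudonymise(str(v), secret) if k in pii_fields else v
        for k, v in record.items()
    }
```

Because the tokens are deterministic for a given key, two datasets masked with the same secret can still be joined on the masked columns.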
Preferred Skills
Experience with other enterprise system integrations such as SAP, CPAY, FileNET, or CoFax.
Familiarity with Agile development methodologies.
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
Strong track record of delivering robust data pipeline solutions for enterprise systems.
Experience mentoring or guiding junior engineers is a plus.
What We Offer
A chance to work on high-impact projects at the intersection of insurance and advanced technology.
Collaboration with a talented engineering and QA team that values innovation and quality.
Opportunities for career growth into architecture or leadership roles.
Competitive salary and benefits package.
A supportive and dynamic environment with room to innovate.
To Apply
Please send your updated resume along with examples of data engineering work, such as Fineos API integrations or AWS pipeline solutions, that highlight your expertise.