Master's or Bachelor's degree in Information Systems, MIS, Statistics, or a related field. 5+ years of experience in data engineering and big data solutions. Strong skills in programming languages such as Python, Go, or Java. Experience with AWS/Google big data platforms.
Key responsibilities:
Build and maintain data lakes for scalable data processing.
Develop tools to support ML and analytical models.
NexTurn is a next-generation engineering services firm specializing in cloud-native solutions. We help clients accelerate their innovation and digital transformation journey by unlocking the full value of the cloud.
We are a team of passionate technologists with strong experience in architecting, developing, deploying, and operating large-scale modern applications and infrastructure in hybrid, multi-cloud environments. Our consultants and architects are high-performing engineers with integrated full-stack competencies and strong technology transformation experience, and they operate with a product-centric mindset.
NexTurn enables clients to navigate the paradigm shift in digital engineering with cloud-first solutions. Our services align with each client's digital transformation journey, covering strong cloud foundation and security practices, cloud-native engineering, and data engineering. Our platform-driven approach across the engineering lifecycle accelerates experimentation, creates new value, and drives intelligent automation.
is a forward-thinking, growth-oriented healthcare services and technology company that provides state-of-the-art pharmacy solutions. Since 2015, we have helped millions of consumers save on their prescription drug costs, and we believe we have only scratched the surface.
We occupy a unique position in the market because we are vertically integrated. We have a PBM platform (RxAgile) that provides enterprise solutions to B2B players in the healthcare space, a direct-to-consumer product (SingleCare) with a mission to make prescription medication more affordable, and an analytics platform (RxIQ) that provides actionable insights in real time.
Primary Duties And Responsibilities
Build and maintain one or more data lakes to support scalable ingestion, manipulation, and reporting of data
Manipulate data to produce and maintain new data elements using repeatable, automated processes
Demonstrate knowledge of industry trends and of our infrastructure, technologies, tools, and systems
Measure and communicate the value of data platforms and tools
Display a sense of ownership over assigned work, requiring minimal direction and driving work to completion in a sometimes fuzzy and uncharted environment
Build, operate and maintain highly scalable and reliable data pipelines
Build data warehouse solutions that provide end-to-end management and traceability of patient data, and that enable and optimize internal processes and product features.
Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
Build and develop tools to support the use of ML and other analytical models to improve understanding of patient behavior, provider prescribing, the patient experience on treatment, treatment patterns and more.
Collaborate with internal stakeholders to develop business domain concepts and data modeling approaches to problems faced by the organization in the analytics arena.
Maintain and optimize existing data platform services and capabilities, identifying potential enhancements, performance improvements, and design improvements.
Write and maintain unit/integration tests and systems documentation.
Desired Skills And Experience
Master's or bachelor's degree in Information Systems, MIS, Statistics, or a related field, or equivalent work experience, required.
5+ years of overall experience in building and sustaining data engineering and big data solutions, preferably in the healthcare industry
Very strong skills, with at least 3+ years of experience, in at least one programming or scripting language (Python, Go, Java)
Has built and deployed large-scale batch and real-time data pipelines into production using Airflow
Deep experience with AWS/Google big data platforms and services (Snowflake, BigQuery, S3, Google Cloud Storage buckets, Parquet/Avro/ORC, Elasticsearch)
Current stack includes Python, Snowflake, BigQuery, AWS S3, Google Cloud Storage, Airflow, Elasticsearch, Redis
Overlap with the current stack is a plus
3+ years of experience building enterprise data solutions using industry-standard guiding principles and practices
2+ years of working knowledge of relational and non-relational databases
1+ years of experience with data engineering workflows that follow DevOps principles and standard CI/CD practices
Organizational Skills Required
Ability to multi-task, prioritize assignments, and work well under deadlines in a changing environment with cross-functional agile teams.
Strong communication skills for working with stakeholders from various backgrounds.
Location: Remote-first role; where the client mandates work from office, the candidate needs to relocate.
Required profile
Experience
Level of experience: Senior (5-10 years)
Spoken language(s):
English