
Senior Data Engineer

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

7+ years in data engineering; expertise in Python and SQL; experience with cloud platforms such as AWS, Azure, or Google Cloud.

Key responsibilities:

  • Design, build & maintain data pipelines using Python/DBT/Airflow
  • Ensure projects meet business outcomes & optimize data systems performance
  • Stay updated on trends in data engineering & HealthTech standards
Medalogix SME https://medalogix.com/
11 - 50 Employees

Job description

We are looking for an experienced Senior Data Engineer to join our growing team of data experts. As a data engineer at Medalogix, you will be responsible for developing, maintaining, and optimizing our data warehouse and data pipelines. The data engineer will support multiple stakeholders, including software developers, database architects, data analysts, and data scientists, to ensure an optimal data delivery architecture. The ideal candidate possesses strong technical ability to solve complex problems with data, is willing to learn new technologies and tools as needed, and is comfortable supporting the data needs of multiple teams, stakeholders, and products.

  • Design, build, and maintain batch or real-time data pipelines in production while adhering to architectural requirements of maintainability and scalability.
  • Build scalable data pipelines (Python/DBT) leveraging the Airflow scheduler/executor framework (a minimal illustrative sketch follows this list).
  • Ensure that large or complex engineering projects are delivered in alignment with the intended business outcomes.
  • Monitor data systems performance and implement optimization strategies.
  • Stay current with emerging trends and technologies in data engineering and HealthTech industry standards.
  • Provide on-call, off-hours support for critical systems.
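
For context only, here is a minimal sketch of how a pipeline on this stack is often wired together: an Airflow DAG (Airflow 2.4+ assumed, where the schedule argument is available) that triggers dbt build and test steps. The DAG name, schedule, and dbt project paths are illustrative placeholders, not details taken from this posting.

    # Minimal sketch, not Medalogix code: an Airflow DAG that runs dbt build and test steps daily.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_warehouse_refresh",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build dbt models in the warehouse; project/profiles paths are placeholders.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
        )
        # Validate the transformed data once the build completes.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
        )
        dbt_run >> dbt_test

In a setup like the one described in the requirements below, tasks of this kind would typically run in Docker containers scheduled onto Kubernetes.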

Requirements

• 7+ years as a Data Engineer or in a related role, with a focus on designing and developing performant data pipelines.

• Intermediate to expert-level knowledge of Kubernetes fundamentals (nodes, pods, services, deployments, etc.) and their interactions with the underlying infrastructure.

• 2+ years of hands-on experience with Docker containerization to package applications for use in distributed systems managed by Kubernetes.

• 3+ years of experience with the Airflow orchestration platform. Experience with Azure Data Factory is a plus but not required.

• Expertise in using DBT and Apache Airflow for orchestration and data transformation.

• Strong programming skills in Python and SQL. Experience with Scala is a plus but not required.

• Strong experience with at least one cloud platform such as AWS, Azure, or Google Cloud.

• Experience working with cloud Data Warehouse solutions (Snowflake).

• Excellent problem-solving, communication, and organizational skills.

• Proven ability to work independently and as part of a team.

• Approachable, personable team player comfortable working in an Agile environment.

• Experience working with large data sets and distributed computing.

• Knowledge of EMR (electronic medical record) systems such as HCHB or MatrixCare.

• Prior experience as a Senior Data Engineer within a healthcare SaaS group.

Benefits

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Paid Time Off (Vacation, Sick & Public Holidays)
  • Family Leave (Maternity, Paternity)

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English
