
Data Integration Engineer


Job description

Overview:

Edifecs/Cotiviti is seeking a Data Integration Engineer to join our software teams. In this position, you will be responsible for onboarding customers to the Risk Adjustment workflow applications, working closely with platform engineering, product, and implementation teams. The ideal candidate has experience building scalable data pipelines that support workflow applications, reporting, and customer confidence. You must have strong hands-on expertise in big data technologies and the ability to communicate with both technical and non-technical audiences.

Responsibilities:
  • Design and develop data flows and extraction processes.
  • Integrate client’s data into the Edifecs/Cotiviti Risk Adjustment Workflows.
  • Work with relational databases and Python jobs.
  • Implement open-source standards for data quality.
  • Develop a new framework for data ELT jobs to scale implementation, monitoring, and quality.
  • Build and scale automation that orchestrates complex workflows.
  • Support existing processes while leading efforts to redefine the data strategy.
Qualifications:
  • 3+ years of hands-on experience with relational database systems (PostgreSQL).
  • 2+ years of hands-on experience developing ELT jobs/data pipelines using technologies such as Argo Workflows, cron, Airflow, dbt, and Great Expectations.
  • 2+ years of experience with Linux systems.
  • Proficiency in shell scripting and automation.
  • Experience building reusable data methods and working with version control systems (Git).
  • 3+ years of hands-on experience with programming languages such as Python.
  • Plus - Experience with container deployment platforms and tools, such as Kubernetes, Docker, Helm, and Terraform.
  • Plus - AWS Cloud experience: EC2, RDS, SQS, IAM, S3.
