
Data Engineer (Azure+SQL)


Job description

Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices.

Allata also empowers clients to unlock data value through analytics and visualization and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.



We are seeking a skilled Data Engineer to join our team and contribute to data-driven initiatives within the healthcare industry. This role focuses on designing, building, and optimizing scalable data solutions that support analytics, reporting, and advanced data use cases in regulated environments.

The ideal candidate will bring strong hands-on experience building cloud-based data solutions within the Azure ecosystem, with the ability to support data ingestion, transformation, modeling, and integration needs across the platform.


Role & Responsibilities:
  • Design, build, and optimize ETL/ELT pipelines in Azure cloud environments.
  • Develop scalable and reliable data workflows using Azure Data Factory (ADF) and other Azure data services.
  • Build and maintain data solutions leveraging Azure SQL and related cloud-native services.
  • Design and implement robust data models to support analytics, reporting, and downstream integrations.
  • Develop and support API-based integrations across internal and external systems.
  • Implement data models following modern Data Lakehouse or layered architecture principles.
  • Ensure data quality, consistency, and performance across ingestion, staging, and curated layers.
  • Collaborate with data architects, analysts, and business stakeholders to translate healthcare data requirements into scalable technical solutions.
  • Develop reusable data transformation logic and modular processing components.
  • Support deployment processes following CI/CD and DevOps best practices.
  • Monitor and optimize data workflows for performance, scalability, and reliability.
  • Contribute to data governance, security, and compliance practices relevant to healthcare environments.

Hard Skills - Must have:
  • Hands-on experience with Azure Data Services.
  • Experience building and maintaining data pipelines using Azure Data Factory (ADF).
  • Strong working knowledge of Azure SQL and relational databases.
  • Experience with API integrations and cloud-based data workflows.
  • Solid understanding of Data Modeling concepts for analytics and reporting.
  • Strong proficiency in SQL.
  • Experience designing and supporting ETL/ELT processes in cloud environments.
  • Familiarity with modern data architecture concepts, such as data warehousing, lakehouse, or layered data models.
  • Ability to work with large datasets and build scalable, reliable data solutions.

Hard Skills - Nice to have / It's a plus:
  • Hands-on experience with Microsoft Fabric (MS Fabric).
  • Experience with Azure Functions or similar serverless services.
  • Experience with Databricks in Azure environments.
  • Working knowledge of Python and/or PySpark.
  • Exposure to batch or streaming data processing patterns.
  • Familiarity with CI/CD and DevOps practices for data solutions.

Soft Skills / Business-Specific Skills:
  • Strong analytical and problem-solving skills.
  • Ability to work effectively in cross-functional and distributed teams.
  • Clear communication skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Proactive mindset with a strong sense of ownership.
  • Commitment to delivering high-quality, reliable, and scalable data solutions.
  • Ability to operate successfully in regulated environments, with attention to compliance and data integrity.