
Data Engineer (Databricks + Python + Azure)

Requirements

  • Proficiency with Databricks (PySpark) and Python for building scalable data pipelines
  • Strong foundation in data architecture, data integration, data warehousing, and ETL/ELT processes
  • Applied experience with SQL, stored procedures, and PySpark across data platforms
  • Experience with cloud/hybrid relational databases (e.g., MS SQL Server, PostgreSQL, Oracle, Azure SQL, AWS RDS)

Roles & Responsibilities

  • Design, develop, and maintain scalable data pipelines using Databricks (PySpark) and Python
  • Build and optimize ETL/ELT processes within Azure cloud environments
  • Implement data models following Data Lakehouse principles (Medallion architecture) and ensure data quality across ingestion, staging, and curated layers
  • Collaborate with data architects, analysts, and stakeholders to translate healthcare data requirements into technical solutions; monitor and optimize data workflows with CI/CD and DevOps best practices

Job description

Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices.

Allata also empowers clients to unlock data value through analytics and visualization, and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

We are seeking a skilled Data Engineer to join our team and contribute to data-driven initiatives within the healthcare industry. This role focuses on designing, building, and optimizing scalable data solutions that support analytics, reporting, and advanced data use cases in regulated environments.
 

Role & Responsibilities:
  • Design, develop, and maintain scalable data pipelines using Databricks (PySpark) and Python.
  • Build and optimize ETL/ELT processes within Azure cloud environments.
  • Implement data models following modern Data Lakehouse principles (e.g., Medallion architecture).
  • Ensure data quality, consistency, and performance across ingestion, staging, and curated layers.
  • Collaborate with data architects, analysts, and business stakeholders to translate healthcare data requirements into technical solutions.
  • Develop reusable data transformation logic and modular processing components.
  • Support deployment processes following CI/CD and DevOps best practices.
  • Monitor and optimize data workflows for performance, scalability, and reliability.
  • Contribute to data governance, security, and compliance practices relevant to healthcare environments.
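The bronze → silver → gold flow implied by the Medallion responsibilities above can be sketched in plain Python. This is a simplified stand-in for a PySpark pipeline, using lists of dictionaries instead of DataFrames; all field names, records, and quality rules here are hypothetical.

```python
# Medallion-style flow: raw (bronze) -> cleaned (silver) -> curated (gold).
# Stdlib-only stand-in for a PySpark pipeline; fields are illustrative.

bronze = [  # raw ingested records, including a duplicate and a malformed row
    {"patient_id": "p1", "visit_date": "2024-01-05", "charge": "120.50"},
    {"patient_id": "p1", "visit_date": "2024-01-05", "charge": "120.50"},
    {"patient_id": "p2", "visit_date": "2024-01-06", "charge": "not-a-number"},
    {"patient_id": "p2", "visit_date": "2024-02-01", "charge": "80.00"},
]

def to_silver(rows):
    """Deduplicate and enforce types -- the data-quality (staging) layer."""
    seen, out = set(), []
    for r in rows:
        key = (r["patient_id"], r["visit_date"], r["charge"])
        if key in seen:
            continue  # drop exact duplicates from re-ingestion
        seen.add(key)
        try:
            out.append({**r, "charge": float(r["charge"])})
        except ValueError:
            pass  # a real pipeline would quarantine malformed rows
    return out

def to_gold(rows):
    """Curated aggregate for reporting: total charges per patient."""
    totals = {}
    for r in rows:
        totals[r["patient_id"]] = totals.get(r["patient_id"], 0.0) + r["charge"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)
```

In an actual Databricks implementation each layer would be a Delta table and the dedup/typing logic would be expressed with DataFrame transformations, but the layering and quality gates follow the same shape.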

  • Hard Skills - Must have:
  • Current knowledge of modern data tools (e.g., Databricks, Fivetran, Data Fabric, and others); core experience with data architecture, data integration, data warehousing, and ETL/ELT processes
  • Applied experience developing and deploying custom .whl packages and/or in-session notebook scripts for custom execution across parallel executor and worker nodes
  • Applied experience with SQL, stored procedures, and PySpark, based on area of data platform specialization
  • Strong knowledge of cloud and hybrid relational database systems, such as MS SQL Server, PostgreSQL, Oracle, Azure SQL, AWS RDS, Aurora, or a comparable engine
  • Strong experience with batch and streaming data processing techniques and file compaction strategies
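The file-compaction strategies mentioned above (e.g., what Delta Lake's OPTIMIZE or auto-compaction does for small Parquet files) amount to bin-packing many small files into fewer files near a target size. A stdlib-only sketch of a greedy compaction plan, with a hypothetical 128 MB target:

```python
# Toy compaction planner: group small files into batches whose combined
# size stays under a target, mimicking small-file compaction in a lakehouse
# table. Sizes are in MB and purely illustrative.

TARGET_MB = 128

def plan_compaction(file_sizes_mb):
    """Greedy plan: largest-first, start a new batch when the target is hit."""
    batches, current, current_size = [], [], 0
    for size in sorted(file_sizes_mb, reverse=True):
        if current and current_size + size > TARGET_MB:
            batches.append(current)   # close the full batch
            current, current_size = [], 0
        current.append(size)
        current_size += size
    if current:
        batches.append(current)
    return batches

small_files = [8, 120, 16, 4, 64, 32, 100]
print(plan_compaction(small_files))
```

Real engines also weigh rewrite cost, partition boundaries, and data clustering (e.g., Z-ordering) when choosing which files to merge, but the core idea is the same: fewer, larger files mean fewer reads at query time.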

  • Hard Skills - Nice to have/It's a plus:
  • Strong hands-on experience with Databricks in Azure environments.
  • Advanced proficiency in Python and PySpark for distributed data processing.
  • Experience building and optimizing data pipelines in Azure (Azure Data Factory, Azure SQL, Data Lake Storage, etc.).
  • Solid understanding of data warehousing, data lakehouse concepts, and ETL/ELT frameworks.
  • Experience working with relational databases such as SQL Server, PostgreSQL, Oracle, or similar.
  • Knowledge of batch and streaming data processing patterns.
  • Experience working with large, complex datasets in cloud-based distributed environments.

  • Soft Skills / Business Specific Skills:
  • Strong analytical and problem-solving skills.
  • Ability to work effectively in cross-functional and distributed teams.
  • Clear communication skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Proactive mindset with a strong sense of ownership.
  • Commitment to delivering high-quality, reliable data solutions.
At Allata, we value differences.

    Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

    Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability or any other legally protected category.

    This policy applies to all terms and conditions of employment, including but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
