
Data Engineer


Job description

About Us:

Intrado is dedicated to saving lives and protecting communities, helping them prepare for, respond to, and recover from critical events. Our cutting-edge company strives to become the most trusted, data-centric emergency services partner by uniting fragmented communications into actionable intelligence for first responders. At Intrado, all of our work truly matters. 

Responsibilities:

We are seeking an exceptional Data Engineer to build the robust data pipelines that will power our company’s internal business analytics. Working under the guidance of the Staff Data Engineer, you will ensure that raw data from multiple systems is consistently ingested, cleaned, and made ready for analysis. By building stable and efficient pipelines, you will directly support the timely generation of visualizations that leadership relies on to make informed decisions.

 

This is a demanding role in a results-oriented environment with high expectations for agency, speed, and ownership.

 

Key Responsibilities

  • Pipeline Execution: Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
  • Silver Layer Transformation: Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation.
  • Reliability: Monitor daily jobs and troubleshoot failures, acting as the first line of defense to keep pipelines stable.
  • Data Quality: Implement automated checks to verify that data arriving in the lake matches the source systems.
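To illustrate the kind of silver-layer transformation described above, here is a minimal sketch. In Databricks this work would typically use PySpark DataFrames; pandas is used here only to keep the example self-contained, and all column names (`record_id`, `amount`, `event_ts`) are hypothetical.

```python
import pandas as pd

def to_silver(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean a raw (bronze) extract into a silver-layer frame:
    deduplicate, cast types, and drop rows that fail validation."""
    df = raw.drop_duplicates(subset=["record_id"])  # deduplication on a hypothetical key
    df = df.assign(
        amount=pd.to_numeric(df["amount"], errors="coerce"),      # type casting: strings -> floats
        event_ts=pd.to_datetime(df["event_ts"], errors="coerce"), # type casting: strings -> timestamps
    )
    # validation: keep only rows with a usable key, amount, and timestamp
    return df.dropna(subset=["record_id", "amount", "event_ts"]).reset_index(drop=True)

raw = pd.DataFrame({
    "record_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "oops", "7"],   # "oops" fails the numeric cast
    "event_ts": ["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-03"],
})
silver = to_silver(raw)  # duplicate of record 1 and the invalid record 2 are dropped
```

The same dedupe-cast-validate sequence carries over directly to PySpark (`dropDuplicates`, `cast`, `filter`) on real bronze tables.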

Required Qualifications

  • Experience: 5+ years of experience in Data Engineering, focused on building and maintaining ETL/ELT pipelines for large-scale operational and financial data in a cloud environment.
  • Pipeline Development: Proficiency in building and optimizing data pipelines using Azure Data Factory and Databricks, or comparable modern data orchestration and distributed processing frameworks.
  • Technical Proficiency (SQL & Python): Strong proficiency in SQL for data analysis and Python for scripting and transformation.
  • Data Quality Assurance: Experience implementing automated data quality checks (e.g., schema validation, null checks), plus a proactive approach to identifying pipeline failures and implementing fixes to prevent recurrence.
  • Platform & Data Familiarity: Experience working with data schemas and APIs from common enterprise platforms such as Microsoft Dynamics 365 F&O, Salesforce, and ServiceNow.
  • LLM Application: Demonstrated experience using LLMs to streamline data engineering workflows and improve development efficiency.
  • Education: Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a closely related technical field.
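The automated data-quality checks mentioned above (schema validation, null checks, and reconciliation against the source system) can be sketched roughly as follows. This is an illustration only: the schema contract and column names are hypothetical, and a production version would run as a scheduled job against lake tables.

```python
import pandas as pd

# Hypothetical schema contract for a silver-layer table
EXPECTED_SCHEMA = {"record_id": "int64", "amount": "float64"}

def quality_report(df: pd.DataFrame, source_row_count: int) -> dict:
    """Run three automated checks and return pass/fail flags:
    schema validation, null checks, and a row-count match against the source."""
    return {
        # schema validation: every contracted column exists with the contracted dtype
        "schema_ok": all(
            col in df.columns and str(df[col].dtype) == dtype
            for col, dtype in EXPECTED_SCHEMA.items()
        ),
        # null check: no missing values in contracted columns
        "null_free": not df[list(EXPECTED_SCHEMA)].isna().any().any(),
        # reconciliation: landed row count equals the source system's count
        "row_count_matches_source": len(df) == source_row_count,
    }

df = pd.DataFrame({
    "record_id": pd.array([1, 2], dtype="int64"),
    "amount": [10.5, 7.0],
})
report = quality_report(df, source_row_count=2)
```

A failing flag in the report would typically page the on-call engineer or block the downstream refresh, which is the "first line of defense" behavior the role describes.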

Preferred Qualifications

  • Prior experience working in a technology company or SaaS environment

Total Rewards:

Want to love where you work? At Intrado, we offer a comprehensive benefits package that includes what you’d expect (medical, dental, vision, life and disability coverage, paid time off, and a 401(k) retirement plan) and several benefits that go above and beyond: paid parental leave, access to a robust library of personal and professional training resources, employee discounts, critical illness and hospital indemnity coverage, access to legal support, pet insurance, identity theft protection, an Employee Assistance Program (EAP) that includes free mental health resources and support, and more. Apply today to join us in work worth doing!

 

The starting salary is anticipated to be between $180,000 and $200,000 and will be commensurate with experience.

 

Intrado is an Equal Opportunity Employer – Veterans/Disabled and Other Protected Categories. Our Company welcomes and encourages applications of individuals with disabilities. Accommodations are available on request for candidates taking part in all aspects of the selection process. Intrado maintains a Drug Free Workplace.
