Senior Data Platform Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in platform engineering, data engineering, or a data-facing role.
  • Bachelor's degree in a quantitative field such as Computer Science, Engineering, or Mathematics/Statistics.
  • Deep knowledge of the data ecosystem and the ability to collaborate cross-functionally.
  • Experience with the Python data stack and cloud-based data pipeline management is preferred.

Key responsibilities:

  • Architect and build robust, scalable data pipelines for various use cases.
  • Develop and maintain high-performance APIs using FastAPI for data services.
  • Collaborate with software engineers, data scientists, and product teams to translate requirements into engineering solutions.
  • Monitor data flows and platform services to ensure health, quality, and reliability.

Apollo.io (501 - 1000 employees)
https://www.apollo.io/demo

Job description

Apollo.io is the leading go-to-market solution for revenue teams, trusted by over 500,000 companies and millions of users globally, from rapidly growing startups to some of the world's largest enterprises. Founded in 2015, the company is one of the fastest-growing companies in SaaS, having raised approximately $250 million to date at a $1.6 billion valuation. Apollo.io gives sales and marketing teams easy access to verified contact data for over 210 million B2B contacts and 35 million companies worldwide, along with tools to engage and convert those contacts in one unified platform. By helping revenue professionals find the most accurate contact information and automating the outreach process, Apollo.io turns prospects into customers. Apollo raised its Series D in 2023 and is backed by top-tier investors, including Sequoia Capital and Bain Capital Ventures, and counts the former President and COO of HubSpot, JD Sherman, among its board members.

As a Senior Data Platform Engineer, you will play a key role in designing and building the foundational data infrastructure and APIs that power our analytics, machine learning, and product features. You’ll be responsible for developing scalable data pipelines, managing cloud-native data platforms, and creating high-performance APIs using FastAPI to enable secure, real-time access to data services. This is a hands-on engineering role with opportunities to influence architecture, tooling, and best practices across our data ecosystem. 

Daily Adventures and Responsibilities
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing

Competencies
  • Excellent communication skills, working with engineering, product, and business owners to define key business questions and build data sets that answer them
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized and diligent, with great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open; loves learning
  • Critical thinking and proven problem-solving skills required
Skills & Relevant Experience

Required:

  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience building data applications
  • Deep knowledge of the data ecosystem and the ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical or Computer Science, Engineering, or Mathematics/Statistics)

Preferred:

  • Experience using the Python data stack
  • Experience deploying and managing data pipelines in the cloud
  • Experience working with technologies such as Airflow, Hadoop, and Spark
  • Understanding of streaming technologies such as Kafka and Spark Streaming



Why You’ll Love Working at Apollo

At Apollo, we’re driven by a shared mission: to help our customers unlock their full revenue potential. That’s why we take extreme ownership of our work, move with focus and urgency, and learn voraciously to stay ahead.

We invest deeply in your growth, ensuring you have the resources, support, and autonomy to own your role and make a real impact. Collaboration is at our core—we’re all for one, meaning you’ll have a team across departments ready to help you succeed. We encourage bold ideas and courageous action, giving you the freedom to experiment, take smart risks, and drive big wins.

If you’re looking for a place where your work matters, where you can push boundaries, and where your career can thrive—Apollo is the place for you.

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Critical Thinking
  • Self-Motivation
  • Communication
  • Problem Solving
