Senior Data Engineer

Work set-up: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Master's degree in a technical field or equivalent experience.
  • At least 3 years of experience building production data pipelines.
  • Proficiency in Python and SQL for data engineering.
  • Experience with Databricks, Delta Lake, and AWS cloud infrastructure.

Key responsibilities:

  • Design and develop scalable data pipelines for ingestion and processing.
  • Integrate external data sources such as CRMs and APIs.
  • Build and maintain ETL workflows using Python and Databricks.
  • Collaborate with teams to ensure data infrastructure supports analytics and machine learning.

PLUM Commercial Real Estate Lending https://plumlending.com/
11 - 50 Employees

Job description

PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm, where each person’s contributions and ideas are critical to the growth of the company. 

This is a fully remote position, open to candidates anywhere in the U.S. with a reliable internet connection. While we gather in person a few times a year, this role is designed to remain remote long-term. You will have autonomy and flexibility in a flat corporate structure where your direct input is realized and put into action. You’ll collaborate with a high-performing team of sales, marketing, and financial services experts who stay connected through Slack, video calls, and regular team and company-wide meetings. We’re a team that knows how to work hard, have fun, and make a meaningful impact, both together and individually.

Job Summary

We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines that ingest and process data from a variety of external client systems. This role is critical in building the data infrastructure that powers PLUM’s next-generation AI-driven products.

You will work with a modern data stack including Python, Databricks, AWS, Delta Lake, and more. As a senior member of the team, you’ll take ownership of architectural decisions, system design, and production readiness—working with team members to ensure data is reliable, accessible, and impactful.

Key Responsibilities
  • Design and architect end-to-end data processing pipelines: ingestion, transformation, and delivery to the Delta Lakehouse.
  • Integrate with external systems (e.g., CRMs, file systems, APIs) to automate ingestion of diverse data sources.
  • Develop robust data workflows using Python and Databricks Workflows.
  • Implement modular, maintainable ETL processes following SDLC best practices and Git-based version control.
  • Contribute to the evolution of our Lakehouse architecture to support downstream analytics and machine learning use cases.
  • Monitor, troubleshoot, and optimize data workflows in production.
  • Collaborate with cross-functional teams to translate data needs into scalable solutions.

Requirements

  • Master’s degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience.
  • 3+ years of experience building and maintaining production-grade data pipelines.
  • Proven expertise in Python and SQL for data engineering tasks.
  • Strong understanding of lakehouse architecture and data modeling concepts.
  • Experience working with Databricks, Delta Lake, and Apache Spark.
  • Hands-on experience with AWS cloud infrastructure.
  • Track record of integrating data from external systems, APIs, and databases.
  • Strong problem-solving skills and ability to lead through ambiguity.
  • Excellent communication and documentation habits.

Preferred Qualifications
  • Experience building data solutions in Fintech, Sales Tech, or Marketing Tech domains.
  • Familiarity with CRM platforms (e.g., Salesforce, HubSpot) and CRM data models.
  • Experience using ETL tools such as Fivetran or Airbyte.
  • Understanding of data governance, security, and compliance best practices.

Benefits

  • A fast-paced, collaborative startup culture with high visibility.
  • Autonomy, flexibility, and a flat corporate structure where your direct input is realized and put into action.
  • Opportunity to make a meaningful impact in building a company and culture. 
  • Equity in a financial technology startup. 
  • Generous health, dental, and vision coverage for employees and family members + 401(k).
  • Eleven paid holidays and unlimited discretionary vacation days.
  • Competitive compensation and bonus potential.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
