
10x Data Engineer at SFR3

Remote: Full Remote
Salary: 10 - 13K yearly

Offer summary

Qualifications:

Strong expertise in Python, SQL, DBT, and Snowflake; experience with ETL tools like Fivetran; hands-on experience with Apache Kafka; knowledge of AWS and cloud-based services.

Key responsibilities:

  • Build end-to-end data and workflow orchestration solutions
  • Develop advanced AI/ML models for actionable insights

SFR3 Fund · Real Estate Management & Development · Scaleup · https://www.sfr3.com/
201 - 500 Employees

Job description

10x Data Engineer / Data Scientist
SFR3 is a boutique real estate investment fund managing $2B+ of affordable single-family homes. The Fund renovates distressed homes, using software-driven operations to run many “tertiary” markets concurrently. Today we own 10,500+ homes in 20 states.
You

You love building software that directly enables your team to crack critical business challenges. You’ve shipped production applications and can strike the right balance between shipping fast for immediate impact and building toward a long-term vision. You enjoy working remotely, along with the independence and responsibility that come with it.

Real estate is the world’s largest asset class, but experience in the domain is not required. What is necessary is the motivation to dive in and understand the work people do every day, so you can build the right tools that give them leverage. The idea of building and running a $150M/yr business that operates like a startup and grows 5-7%/yr (not a startup looking for product-market fit) is exciting to you.

What You’ll Do

Buying, renovating, and managing a portfolio of thousands of homes worth billions of dollars is a big task. Our team is continuously experimenting to improve efficiency and accelerate operations.

You’ll join a pod of seasoned business owners paired with product, and you’ll have complete ownership over a domain to crank out needle-moving features weekly. There’s a lot for a small team to do, so we look for “T-shaped” engineers: comfortable and confident across the stack, with deep expertise in one or more areas where you can be the team expert.

To manage single-family rentals at our scale, we write lots of software that runs our operations. For this dev pod, our users are hundreds of employees and vendors located in 40+ markets who renovate homes, repair issues for residents, and prepare our vacant homes for the next move-in.

In this role, you will:

  • Architect and Build End-to-End Data and Workflow Orchestration Solutions: Design and implement robust data pipelines—from ingestion and storage to transformation and analysis—that handle large, diverse data sets efficiently (see the illustrative sketch after this list).
  • Define, Build, and Maintain Canonical Metrics Stores: Standardize how the company looks at metrics of company and business-unit performance, and develop and implement integrations with third-party data service APIs from companies such as Stripe, Brex, Pipefy, Contentful, NetSuite, and Ramp.
  • Develop Advanced AI & ML Models: Leverage machine learning and deep learning techniques to uncover insights, predict outcomes, and generate actionable recommendations that drive measurable business value.
  • Ensure Scalable and Reliable Infrastructure: Champion best practices for data engineering, including data governance, security, and performance optimization to support rapid iteration and growth.
  • Experimentation & Innovation: Stay on the cutting edge of AI/ML trends, continually researching new algorithms, tools, and data engineering frameworks to push the boundaries of what’s possible.
  • Model Lifecycle Management: Own the end-to-end ML lifecycle, from data exploration and model development to deployment and ongoing monitoring, ensuring models remain accurate and performant.
  • Analyze System Efficiency and Performance: We’re committed to continuous improvement and optimization. You’ll use data to monitor software performance and identify opportunities that drive business results and enhance system reliability.
  • Collaborate on Scoping and Prioritizing the Roadmap: We constantly ship and evaluate new problems and opportunities to improve workflows. You’ll regularly interact with users, ship fast, and help define our long-term tech vision—making trade-offs between effort and impact to focus on the most critical work.
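
Purely as an illustration of the pipeline work described above (not part of the role's requirements), the sketch below shows one way a real-time ingestion step might look in Python, landing Kafka events in a Snowflake table for downstream DBT models. It assumes the confluent-kafka and snowflake-connector-python packages; the topic name, landing table, warehouse, consumer group, and environment variables are hypothetical placeholders.

# Illustrative only: a minimal Kafka -> Snowflake ingestion step.
# Assumes confluent-kafka and snowflake-connector-python; the topic,
# table, warehouse, and credentials below are hypothetical placeholders.
import os

import snowflake.connector
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": os.environ["KAFKA_BROKERS"],
    "group.id": "work-order-loader",       # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["work_orders"])        # hypothetical topic

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",                   # hypothetical warehouse
    database="RAW",                        # hypothetical landing database
    schema="OPERATIONS",
)

def flush(rows):
    # Append a batch of raw event payloads to a landing table;
    # downstream DBT models would parse and model the JSON.
    if not rows:
        return
    cur = conn.cursor()
    cur.executemany("INSERT INTO work_orders_raw (payload) VALUES (%s)", rows)
    cur.close()

batch = []
try:
    while True:
        msg = consumer.poll(1.0)           # wait up to 1s for a message
        if msg is None or msg.error():
            continue
        batch.append((msg.value().decode("utf-8"),))
        if len(batch) >= 500:              # flush in fixed-size batches
            flush(batch)
            batch = []
finally:
    flush(batch)
    consumer.close()
    conn.close()

In a stack like the one described here, Fivetran would typically cover the third-party API sources and DBT the transformations, with custom streaming code reserved for operational events; the exact split is a design choice, not something specified in this posting.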
Your Stats

You'll be working across multiple backend projects. You might be a great fit if these attributes describe you:

  • Experience building end-to-end data solutions that drive real-world processes and results.
  • Strong expertise in Python, SQL, DBT, Snowflake, and ETL tools such as Fivetran.
  • Hands-on experience with Apache Kafka for real-time data streaming.
  • Knowledge of AWS and cloud-based services.
  • Strong product sensibilities to build experiences users love.
  • AI-first sensibilities.
  • Entrepreneurial experience preferred.

Your working timezone should overlap with US Eastern by at least 5 working hours.

Required profile

Experience

Industry: Real Estate Management & Development
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
  • Problem Solving
