Data Engineer at Masterworks

Work set-up: On-site (New York, NY)
Experience: Mid-level (2–5 years)

Offer summary

Qualifications:

  • 2–5 years of hands-on data engineering experience in a production environment
  • Strong expertise with Amazon Redshift, including query optimization and diagnostics
  • Proficiency with ETL orchestration tools such as Luigi and Apache Airflow
  • Expert-level SQL skills with the ability to analyze and optimize queries

Key responsibilities:

  • Design, build, and maintain scalable ETL pipelines using Luigi and Apache Airflow.
  • Monitor and optimize Redshift cluster performance, diagnosing high CPU usage and slow queries.
  • Build data quality alerts and notification systems to ensure pipeline health.
  • Collaborate with analysts and stakeholders to ensure data accuracy, availability, and accessibility.

Masterworks (Financial Services Scaleup) · https://masterworks.com/
201-500 employees

Job description

About Masterworks

Masterworks is a fintech platform that allows anyone to invest in SEC-qualified shares of multi-million-dollar paintings by artists such as Banksy, Basquiat, and Picasso. In just three years, we have built a portfolio of nearly $800 million in world-class artworks, introducing over 800,000 individuals to the $1.7 trillion art market.

Masterworks has been covered by major media publications such as The New York Times, CNBC, The Wall Street Journal, and the Financial Times, and was recently recognized as one of the Top 50 Startups in the US by LinkedIn.

In 2021, Masterworks achieved unicorn status, raising $110M in its Series A round at a valuation exceeding $1 billion.

Our 200+ employees are based out of our offices at 1 World Trade Center in the Financial District of New York City. With an entirely in-office team, there are endless opportunities for collaboration, innovation, and learning.

Why Masterworks?

  • Do you thrive on disruption?
  • Do you want to live at the cutting edge of finance, technology, and art?
  • Are you passionate about democratizing alternative investments?
  • Do you enjoy meaningful work that has a noticeable impact on business performance?

If you answered “Yes” to any of the above, we’d love to hear from you!

Position Overview

We're looking for a skilled and proactive Data Engineer with 2–5 years of experience to help build and maintain robust, high-performance data pipelines and infrastructure. This role is ideal for someone who thrives in a fast-paced environment, has deep technical knowledge of Redshift, and is capable of diagnosing and improving SQL query performance. You’ll work closely with investment advisors, engineering, product, and analytics teams to ensure the reliability, efficiency, and scalability of our data systems.

Key Responsibilities
  • Design, build, and maintain scalable ETL pipelines using Luigi and Apache Airflow (a minimal pipeline sketch follows this list)
  • Monitor and optimize performance of Redshift clusters, particularly:
    • Diagnosing high CPU usage
    • Identifying slow or resource-intensive queries
    • Refactoring SQL for performance improvements
  • Proactively build data quality alerts and notification systems to ensure pipeline health and catch missing/incomplete data early
  • Work closely with analysts and stakeholders to ensure the data is accurate, available, and accessible
  • Respond promptly to issues during working hours (within 5 minutes during core hours)
  • Lead or assist in potential migration projects (e.g., Redshift to Snowflake or other tools), including planning, testing, and execution
  • Collaborate on data modeling and schema design to support analytics and application needs
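
To make the day-to-day concrete, here is a minimal sketch of the kind of DAG described in the first and third bullets above, assuming Airflow 2.x. It is illustrative only: the DAG id, schedule, alert address, and helper logic are hypothetical placeholders, not Masterworks' actual pipeline.

```python
# Illustrative only: a daily extract, a Redshift load, and a data-quality check
# that fails the run (and therefore triggers the failure alert) when data looks
# incomplete. All names below are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,              # route failures to the on-call inbox
    "email": ["data-alerts@example.com"],  # placeholder address
}

def extract_source_data():
    """Pull the upstream extract (e.g. from an API or S3); details omitted."""

def load_to_redshift():
    """COPY the staged files into Redshift; connection handling omitted."""

def check_row_counts():
    """Fail the run (and fire the failure email) if the load looks incomplete."""
    rows_loaded = 0  # placeholder: in practice, COUNT(*) today's partition in Redshift
    if rows_loaded == 0:
        raise ValueError("Data-quality check failed: no rows loaded today")

with DAG(
    dag_id="daily_positions_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # daily at 06:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
    load = PythonOperator(task_id="load", python_callable=load_to_redshift)
    quality = PythonOperator(task_id="quality_check", python_callable=check_row_counts)

    extract >> load >> quality
```

The same dependency graph can be expressed in Luigi as Task subclasses whose requires() methods declare upstream tasks; the orchestration idea is identical.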

Requirements
  • 2–5 years of hands-on data engineering experience in a production environment
  • Strong experience with Amazon Redshift, including query optimization and system diagnostics
  • Proficiency with ETL orchestration tools such as Luigi and Apache Airflow
  • Expert-level SQL skills; ability to analyze and optimize long-running queries
  • Proven ability to troubleshoot high CPU or slow query issues on Redshift (see the diagnostic sketch below)
  • Familiarity with data alerting and monitoring tools (e.g., CloudWatch, Datadog, custom alert systems)
  • Strong communication skills and a collaborative mindset
  • High responsiveness during working hours; ability to support production data pipelines and address urgent issues quickly
  • Experience with cloud platforms (AWS preferred)

Bonus: Experience leading or supporting data platform migrations
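
As a purely illustrative example of the Redshift troubleshooting called for above, the sketch below pulls the slowest queries of the last 24 hours from the stl_query system table so they can be reviewed and refactored. Connection details are placeholders, and psycopg2 is one common driver choice rather than a required tool.

```python
# Illustrative only: list the slowest recent Redshift queries so they can be
# tuned. Credentials and host names below are placeholders.
import psycopg2

SLOW_QUERY_SQL = """
    SELECT query,
           TRIM(querytxt)                        AS sql_text,
           DATEDIFF(seconds, starttime, endtime) AS duration_s
    FROM stl_query
    WHERE starttime >= DATEADD(day, -1, GETDATE())
    ORDER BY duration_s DESC
    LIMIT 20;
"""

def top_slow_queries(dsn: str):
    """Return the 20 slowest queries from the last 24 hours."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(SLOW_QUERY_SQL)
        return cur.fetchall()

if __name__ == "__main__":
    # Placeholder DSN; real credentials would come from a secrets manager.
    rows = top_slow_queries(
        "host=example.redshift.amazonaws.com port=5439 "
        "dbname=analytics user=etl password=change-me"
    )
    for query_id, sql_text, duration_s in rows:
        print(f"{query_id}\t{duration_s}s\t{sql_text[:80]}")
```

For CPU-bound clusters, the same approach extends to system views such as stl_wlm_query and svl_query_summary.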

Preferred Qualifications
  • Experience with data warehousing concepts and modern data stacks
  • Familiarity with data pipeline logging, testing, and observability best practices
  • Experience with Snowflake or other modern data platforms
  • Interest in or experience with financial data or investment platforms

Additional Requirements:

  • Must be eligible for full-time US work - no exceptions.
  • Must be able to work from our NY office - not a remote role.

Benefits:

  • Free admission to art museums and galleries
  • Health, dental, and vision coverage with FSA options
  • PTO and 401k
  • Discounted Equinox membership


Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Financial Services
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
