
Principal Data Architect - AI

Requirements

  • 12–15+ years of progressive experience in data engineering, data warehousing, and data architecture, with several years at the architect level
  • Demonstrated experience as a Data Architect at a SaaS company in FinTech or financial services software (lending, banking, payments, etc.)
  • Deep, hands-on expertise with Databricks and PySpark on Azure, including Delta Lake, Unity Catalog, and structured streaming, with performance tuning at scale
  • Production experience with Informatica Data Management Cloud (IDMC) or a comparable enterprise integration platform for ingestion, transformation, and metadata-driven pipelines, plus end-to-end data modeling (conceptual, logical, physical)

Roles & Responsibilities

  • Define the enterprise data architecture, owning conceptual, logical, and physical data models for MeridianLink's analytical and operational data platform and ensuring source-aligned, integrated, and consumption-ready layers
  • Build and maintain a meta-model capturing entities, relationships, business definitions, ownership, lineage, sensitivity classifications, and SLAs, wired into tooling
  • Drive the lakehouse strategy by architecting a Delta Lake medallion (bronze/silver/gold) pattern and setting standards for partitioning, schema evolution, slowly changing dimensions, and historical reproducibility
  • Be hands-on: write PySpark/SQL/Delta Lake code, build reference implementations, review pull requests, prototype patterns, and lead data integration design across IDMC and Databricks pipelines in partnership with governance, security, and business teams

Job description

Principal Data Architect

About the Role

Reporting to the Vice President of Data, the Principal Data Architect is the senior technical authority for how data is modeled, integrated, governed, and consumed across MeridianLink. You will design our enterprise data architecture end-to-end — from source-system ingestion through our Azure Databricks lakehouse and into the analytical, operational, and customer-facing data products our business depends on.

This is a hands-on role. You will not only define the meta-models, conceptual models, logical models, and physical schemas that govern our data — you will build them, prove them out in code, partner closely with data engineers as they implement them, and evolve them as the business grows. The right candidate has done this before in a FinTech SaaS environment and understands the trade-offs that come with multi-tenant data, regulated workloads, and customer-facing analytics.

What You Will Do

• Define the enterprise data architecture: Own the conceptual, logical, and physical data models for MeridianLink's analytical and operational data platform, including source-aligned, integrated, and consumption-ready layers.

• Build the meta-model: Design and maintain a meta-model that captures entities, relationships, business definitions, ownership, lineage, sensitivity classifications, and SLAs — and make sure it is wired into our tooling, not stuck in a slide deck.

• Drive the lakehouse strategy: Architect our medallion (bronze / silver / gold) Delta Lake patterns on Databricks; define standards for partitioning, clustering, schema evolution, slowly changing dimensions, and historical reproducibility.
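To make the slowly-changing-dimension standard in this bullet concrete, here is a minimal SCD Type 2 sketch in plain Python. The function name, column set, and row shape are illustrative only; a production version would be a Delta Lake `MERGE` in PySpark rather than in-memory Python.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, business_key, tracked_cols, as_of):
    """Sketch of SCD Type 2: close out changed rows and append new versions.

    dim_rows: list of dicts; rows with end_date == None are current.
    incoming: list of source dicts keyed by business_key.
    Note: current rows in dim_rows are mutated in place when closed out.
    """
    current = {r[business_key]: r for r in dim_rows if r["end_date"] is None}
    out = list(dim_rows)
    for src in incoming:
        key = src[business_key]
        cur = current.get(key)
        changed = cur is None or any(cur[c] != src[c] for c in tracked_cols)
        if not changed:
            continue
        if cur is not None:
            cur["end_date"] = as_of          # close the old version
        new_row = dict(src)
        new_row["start_date"] = as_of        # open the new version
        new_row["end_date"] = None
        out.append(new_row)
    return out
```

The same close-and-append semantics carry over to a Delta `MERGE WHEN MATCHED ... WHEN NOT MATCHED` statement, which is where the partitioning and schema-evolution standards mentioned above apply.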

• Be hands-on: Write PySpark, SQL, and Delta Lake code. Build reference implementations, prototype patterns, review pull requests, and personally model critical domains rather than delegating every detail.

• Lead data integration design: Set patterns for ingestion through Informatica Data Management Cloud (IDMC) and direct Databricks pipelines, including CDC, batch, streaming, and API-based sourcing from our SaaS products and third-party systems.
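The CDC sourcing pattern above can be sketched as a simple change-event applier. The event shape (`op`, `key`, `row`) is an assumption for illustration, not IDMC's or any connector's actual format.

```python
def apply_cdc_events(snapshot, events):
    """Fold insert/update/delete change events into a keyed snapshot.

    snapshot: dict mapping primary key -> row dict.
    events: iterable of {"op": "I"|"U"|"D", "key": ..., "row": ...},
            assumed to be ordered by source commit time.
    """
    state = dict(snapshot)
    for ev in events:
        if ev["op"] in ("I", "U"):
            state[ev["key"]] = ev["row"]   # upsert the latest row image
        elif ev["op"] == "D":
            state.pop(ev["key"], None)     # tolerate deletes for absent keys
        else:
            raise ValueError(f"unknown op: {ev['op']}")
    return state
```

Batch and streaming variants differ mainly in how `events` arrives; the fold itself is the same idea a Delta merge expresses declaratively.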

• Champion data governance and lineage: Partner with data governance, security, and compliance leaders to operationalize cataloging, lineage, classification, masking, and access controls across the platform (Unity Catalog, IDMC, and adjacent tools).
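Column-level masking of the kind Unity Catalog enforces can be illustrated with a trivial sketch. Field names are hypothetical, and in practice masking would live in a catalog policy rather than application code.

```python
def mask_row(row, sensitive_cols, reveal_last=4):
    """Return a copy of `row` with sensitive columns masked,
    keeping only the last `reveal_last` characters visible."""
    masked = dict(row)
    for col in sensitive_cols:
        value = str(masked.get(col, "") or "")
        if len(value) > reveal_last:
            masked[col] = "*" * (len(value) - reveal_last) + value[-reveal_last:]
        else:
            masked[col] = "*" * len(value)
    return masked
```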

• Standardize data modeling practices: Establish the standards, naming conventions, and review processes used by the Data Engineering team. Coach engineers on dimensional modeling, Data Vault, and other techniques where they best fit the use case.
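As a concrete example of one technique mentioned here: Data Vault 2.0 hub keys are conventionally deterministic hashes of normalized business keys, so the same entity always lands on the same key regardless of load order. A minimal sketch (the delimiter and normalization choices are illustrative):

```python
import hashlib

def hub_hash_key(*business_key_parts, delimiter="||"):
    """Deterministic hash key from normalized business-key parts,
    a common Data Vault 2.0 convention (trim, uppercase, delimit, hash)."""
    normalized = delimiter.join(
        str(p).strip().upper() for p in business_key_parts
    )
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()
```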

• Partner across the business: Work closely with Product, Engineering, Analytics, ML, Finance, Risk, and Customer-facing teams to translate business needs into durable data designs.

• Influence the roadmap: Identify gaps in tooling, capability, and skill; propose investments; and drive multi-quarter initiatives that materially improve how MeridianLink uses its data.

Required Qualifications

• 12–15+ years of progressive experience in data engineering, data warehousing, and data architecture roles, including several recent years at the architect level.

• Demonstrated experience as a Data Architect at a SaaS company in the FinTech or financial services software space (lending, banking, payments, capital markets, insurance, or a closely related domain).

• Deep, hands-on expertise with Databricks and PySpark on Azure, including Delta Lake, Unity Catalog, structured streaming, and performance tuning at scale.

• Production experience with Informatica Data Management Cloud (IDMC) — or comparable enterprise integration platforms — for ingestion, transformation, and metadata-driven pipelines.

• Proven track record of designing and implementing detailed meta-models and end-to-end data models (conceptual, logical, and physical) that have shipped to production and held up over time.

• Strong command of dimensional modeling (Kimball), Data Vault 2.0, and modern lakehouse patterns, including the ability to choose the right approach for the right use case.

• Expert SQL skills and strong proficiency in Python/PySpark; comfortable writing the code, not just the diagrams.

• Demonstrated experience implementing data governance, lineage, and metadata management programs (e.g., Unity Catalog, IDMC Data Governance, Collibra, Atlan, or similar).

• Working knowledge of FinTech-relevant regulatory and compliance considerations (e.g., GLBA, SOC 2, PCI, NIST, state lending regulations) and how they shape data design.

• Excellent written and verbal communication skills; able to explain complex data concepts to engineers, executives, customers, and auditors.

Preferred Qualifications

• Prior experience designing data architectures for multi-tenant SaaS platforms with customer-facing analytics or embedded reporting.

• Experience supporting Loan Origination, deposit account opening, or other consumer lending workflows and the underlying data domains (applicants, applications, decisions, funding, servicing, credit data, fraud, KYC/AML).

• Experience building feature stores or curated data products that serve both ML/AI workloads and BI consumers.

• Familiarity with Azure data services (ADLS Gen2, Azure Data Factory, Event Hubs, Synapse, Purview) and their interplay with Databricks.

• Experience with dbt, Great Expectations, or other modern data quality and transformation tooling layered on top of Databricks.
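The declarative checks that tools like Great Expectations run can be sketched generically. This is not the Great Expectations API; the function names and check shapes below are illustrative assumptions.

```python
def run_quality_checks(rows, checks):
    """Evaluate named expectations over a list of row dicts.

    checks: list of (name, predicate); each predicate takes the full
    row list and returns True on pass.
    """
    return {name: bool(predicate(rows)) for name, predicate in checks}

def expect_no_nulls(column):
    """Expectation: no row has a null in `column`."""
    return lambda rows: all(r.get(column) is not None for r in rows)

def expect_unique(column):
    """Expectation: values in `column` are distinct across rows."""
    return lambda rows: len({r.get(column) for r in rows}) == len(rows)
```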

• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent professional experience.

Our Data Stack

• Lakehouse: Azure Databricks, Delta Lake, Unity Catalog, PySpark, SQL

• Integration: Informatica Data Management Cloud (IDMC)

• Cloud: Microsoft Azure (ADLS Gen2, Azure Data Factory, Event Hubs, Key Vault)

• BI & Consumption: Modern BI tooling, embedded analytics, ML feature delivery

• Governance: Unity Catalog, IDMC governance, lineage, and data quality controls
