
Databricks Architect

Job description

Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning, and AI. We are the trusted analytics partner for several Fortune 100 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.

This role is responsible for Databricks architecture and for designing and implementing data best practices.

Requirements

  • Strong experience in data architecture, specifically with Databricks architecture.
  • Expert in Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow), AWS, Snowflake, and Apache Iceberg.
  • Strong MLOps and CI/CD expertise.
  • Proficient in Python/Scala (Spark) for data governance, security, and enterprise data platform design.
  • Databricks certifications and GenAI architecture experience preferred.

Key Responsibilities:

  • Architect and strategize Databricks on AWS/Snowflake integration for secure, scalable data/AI platforms.
  • Architect seamless Snowflake/Databricks data flow via Apache Iceberg (including ML output).
  • Design cost-managed multi-region Databricks MLOps platform and data flows.
  • Implement Unity Catalog for fine-grained access control and secure team coexistence with Workspaces.
  • Develop Databricks-native MLOps (environment parity, Git-driven CI/CD, governed access, MLflow governance, standardized deployment, monitoring).
  • Define enterprise-scalable AI governance for GenAI production deployment.
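As a concrete illustration of the Unity Catalog access-control responsibility above, a minimal Python sketch of generating per-team GRANT statements might look like the following. The catalog, schema, and group names are hypothetical examples, not part of this posting.

```python
# Minimal sketch: build Unity Catalog GRANT statements so each team's
# group receives only the privileges it needs on its own schema.
# Catalog, schema, and group names are hypothetical.

def team_grants(catalog: str, schema: str, group: str,
                privileges=("USE SCHEMA", "SELECT")):
    """Return the SQL GRANT statements scoping one group to one schema."""
    stmts = [f"GRANT USE CATALOG ON CATALOG {catalog} TO `{group}`"]
    for priv in privileges:
        stmts.append(f"GRANT {priv} ON SCHEMA {catalog}.{schema} TO `{group}`")
    return stmts

for stmt in team_grants("prod", "marketing", "marketing-analysts"):
    print(stmt)
```

On Databricks, each generated statement would typically be executed with `spark.sql(stmt)` by an account or catalog administrator; keeping grants at the schema level, as sketched here, is one common way to let multiple teams coexist securely in shared workspaces.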

Benefits

This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.


#LI-remote
