Job description
Job Title: Dataiku Architect (Remote)
Position Type: Full-Time
Location: New York City, NY (candidates must reside within NY or NJ)
We are seeking a highly skilled Data Architect with strong expertise in designing and implementing data pipelines using Dataiku. The ideal candidate will play a key role in defining the data architecture, ensuring scalability, reliability, and performance of our client's data infrastructure, while collaborating with cross-functional teams to deliver business value from data.
Key Responsibilities
Collaborate with data teams to review, challenge, and optimize pipeline design and implementation.
Lead architectural decisions and define best practices for scalable and maintainable Dataiku workflows.
Provide strategic guidance on data wrangling, orchestration, and integration within Dataiku.
Ensure solutions comply with enterprise data governance, security, and compliance requirements.
Act as a technical advisor for the Data Management team, supporting design and delivery.
Document architectural decisions and promote knowledge sharing across teams.
Qualifications
5+ years of experience in data architecture, data engineering, or related roles.
Proven experience as a Data Architect in enterprise environments (financial services preferred).
Strong proficiency in Dataiku DSS, including production-grade data pipelines and workflows.
Solid understanding of ETL/ELT processes, data modeling, and SQL.
Hands-on expertise with the Azure data ecosystem (Data Factory, Synapse, Azure SQL, Data Lake, etc.).
Strong proficiency in Python and SQL scripting; experience with API integrations.
Ability to evaluate multiple technical approaches and recommend optimal solutions.
Exposure to data visualization and data engineering best practices.
Excellent communication and mentorship skills.
Preferred Qualifications
Experience working in financial institutions.
Dataiku certification or advanced expertise in automation, APIs, and deployment.
Experience in data migration or modernization projects.
Familiarity with Snowflake or Databricks (nice to have, not required).