Data Architect #6632

Job description

Client Description

The client is a global organization in the tourism industry, offering river, ocean, and expedition cruises for passengers worldwide and operating a large fleet of vessels.

The company is currently undergoing an extensive cloud data modernization and unification program. We support them across data architecture, BI, migration, and data platform development.

A key focus area is the migration to Databricks Unity Catalog, including:

  • Migrating all data layers (landing, raw, prepared, reporting, services) from Hive Metastore to Unity Catalog
  • Migrating DLT (Delta Live Tables) pipelines and Python/SQL jobs to Unity Catalog
  • Migrating Azure Synapse/ADF pipelines
  • Rebuilding and adapting metadata frameworks
  • Standardizing access, lineage, governance, and overall Lakehouse structure
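The table-migration scope above can be sketched in a minimal, illustrative way: a helper that maps two-level Hive Metastore names to three-level Unity Catalog names and emits `DEEP CLONE` statements, one per data layer. The `main` catalog name and the `bookings` table are assumptions, not part of the posting; a real migration would also lean on Databricks tooling such as UCX or `SYNC` rather than hand-built SQL.

```python
# Sketch: generate Unity Catalog migration statements for Hive Metastore tables.
# Assumptions: the target catalog "main" and the "bookings" entity are
# hypothetical; layer schema names are taken from the posting's data layers.

def to_unity_catalog(hms_table: str, catalog: str = "main") -> str:
    """Map a two-level hive_metastore name to a three-level UC name."""
    schema, table = hms_table.split(".")
    return f"{catalog}.{schema}.{table}"

def deep_clone_sql(hms_table: str, catalog: str = "main") -> str:
    """Emit a CREATE TABLE ... DEEP CLONE statement for a managed Delta table."""
    target = to_unity_catalog(hms_table, catalog)
    return (
        f"CREATE TABLE IF NOT EXISTS {target} "
        f"DEEP CLONE hive_metastore.{hms_table}"
    )

# One statement per data layer listed in the posting.
layers = ["landing.bookings", "raw.bookings", "prepared.bookings",
          "reporting.bookings", "services.bookings"]
for statement in [deep_clone_sql(t) for t in layers]:
    print(statement)
```

Running the generator once per layer keeps the migration reviewable as plain SQL before anything is executed against the workspace.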

The client has very high technical expectations and is looking for top-level specialists capable of leading complex architectural initiatives.

Technical Requirements

  • Advanced knowledge of Microsoft Azure (data infrastructure, networking, authorization, cloud design)
  • Experience with Azure Synapse (especially Synapse Serverless and pipelines)
  • Strong expertise in Databricks (DLT, workflows, workspace administration)
  • Ability to design Data Lakehouse architectures (Medallion Architecture, Metadata-Driven ETL)
  • Strong Python skills, including code optimization
  • Strong SQL skills, including query optimization and SQL Server experience
  • Experience with Apache Spark (data processing workflows)
  • Experience building ETL/ELT processes and data warehouses
  • Experience with CI/CD processes (Azure DevOps, Git, branching strategies)
  • Experience implementing logging, monitoring, and optimization of data processes
  • Familiarity with Power BI and analytics workflows
  • Strong communication skills and documentation ability
  • Experience as a Lead Engineer (technical leadership, decision-making, stakeholder collaboration)

Scope of Responsibilities

  • Design and develop data architecture in Azure and Databricks environments
  • Participate in the Unity Catalog transformation (infrastructure, pipelines, frameworks, standards)
  • Migrate and modernize ETL/ELT processes (DLT, Python/SQL jobs, Synapse/ADF pipelines)
  • Design and implement Data Lakehouse solutions using Medallion architecture
  • Optimize data processing workflows (Python, SQL, Spark)
  • Build CI/CD processes and automation in Azure DevOps
  • Implement standards for logging, monitoring, and data quality
  • Lead the project from a technical perspective (Lead Engineer role)
  • Collaborate with business and technical stakeholders
  • Document solutions and mentor team members
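The Medallion and metadata-driven ETL responsibilities above can be illustrated with a small sketch: pipeline structure declared as data, with a driver that derives the bronze-to-gold table names an entity moves through. The catalog name, the layer-to-schema mapping, and the `bookings` entity are all hypothetical choices for the example, not the client's actual framework.

```python
# Sketch of a metadata-driven Medallion layout: the layer-to-schema mapping
# is declared as data, and a small driver derives each entity's promotion
# path. All names (catalog, schemas, entity) are illustrative assumptions.
from dataclasses import dataclass

LAYER_SCHEMAS = {"bronze": "raw", "silver": "prepared", "gold": "reporting"}

@dataclass(frozen=True)
class TableSpec:
    entity: str   # business entity, e.g. "bookings"
    layer: str    # Medallion layer: "bronze", "silver", or "gold"

    def full_name(self, catalog: str = "main") -> str:
        """Three-level Unity-Catalog-style name for this layer's table."""
        return f"{catalog}.{LAYER_SCHEMAS[self.layer]}.{self.entity}"

def promotion_plan(entity: str) -> list[str]:
    """Ordered tables an entity moves through, bronze to gold."""
    return [TableSpec(entity, layer).full_name()
            for layer in ("bronze", "silver", "gold")]

print(promotion_plan("bookings"))
```

Keeping the layer mapping in one declarative table is what makes the framework "metadata-driven": adding an entity or renaming a schema is a data change, not a code change.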
