
Data Engineer / Azure Fabric Engineer Consultant

Key Facts

Remote From: 
Freelance
English

Other Skills

  • Communication
  • Teamwork
  • Troubleshooting
  • Problem Solving
  • Consulting

Requirements

  • Experience designing and delivering modern analytics platforms (Lakehouse and/or data warehouse patterns) on Azure and/or Microsoft Fabric, including client-facing delivery.
  • Strong SQL skills with experience building transformations and dimensional/analytical models.
  • Hands-on experience with Spark (PySpark/Spark SQL) and/or KQL for data engineering and analytics workloads.
  • AI development experience, including designing agentic solutions (multi-step/tool-using workflows) that leverage enterprise data (e.g., retrieval-augmented generation patterns).

Roles & Responsibilities

  • Lead client discovery to understand business goals, data landscape, constraints, and success metrics, and translate findings into a delivery plan.
  • Design and implement data architectures (data lake, Lakehouse, data warehouse) on Azure and Microsoft Fabric, aligned to client needs, and develop ingestion, transformation, and orchestration pipelines.
  • Build AI-enabled analytics and data products, including agentic workflows, ensuring solutions are scalable, secure, and aligned with responsible AI practices.
  • Establish data quality, governance, security, monitoring, and operational runbooks, and facilitate workshops and stakeholder communications to support adoption and handoff.

Job description


Interlink Cloud Advisors is looking for a Data Engineer / Azure Fabric Engineer Consultant to help clients build modern, scalable data platforms on Microsoft Azure and Microsoft Fabric. In this client-facing role, you’ll lead discovery, design pragmatic architectures, and deliver end-to-end solutions that turn raw data into trusted, analytics-ready datasets for BI, reporting, and AI. You’ll also bring AI development and agentic solution design experience to help clients operationalize intelligent applications on top of governed data. You’ll work across Fabric (OneLake, Lakehouse, Warehouse) and Azure data services to ensure solutions are secure, governed, and operationally ready.


Key Responsibilities

What you’ll do:

  • Lead client discovery to understand business goals, data landscape, constraints, and success metrics, then translate findings into a clear delivery plan.
  • Design and recommend data lake, Lakehouse, data warehouse, and analytical data store architectures on Azure and Microsoft Fabric aligned to client needs.
  • Implement (and/or guide implementation of) ingestion, transformation, and orchestration pipelines using Microsoft Fabric experiences and Azure services (e.g., Data Factory, Synapse, Databricks).
  • Integrate data from operational systems, APIs, and third-party sources, and transform raw data into curated datasets ready for consumption.
  • Build and maintain analytics-ready data models and schemas; implement enrichment, business rules, and reusable patterns for consistency across projects.
  • Enable multiple compute engines (SQL, Spark/PySpark, Spark SQL, KQL) to run against shared datasets, supporting diverse analytics scenarios.
  • Design and build AI-enabled analytics and data products, including agentic workflows (e.g., retrieval-augmented generation patterns), ensuring solutions are scalable, secure, and aligned with responsible AI practices.
  • Establish data quality practices (validation, reconciliation, testing) and monitoring, troubleshoot failures, and continuously improve pipeline resiliency.
  • Optimize transformations, query performance, and pipeline execution to meet client SLAs and cost objectives.
  • Implement logging, metrics, and operational runbooks, and support deployments, cutovers, and hypercare as needed.
  • Design and implement security, governance, and compliance controls (access, least privilege, data protection, lineage) across Azure and Fabric.
  • Facilitate workshops and communicate architecture tradeoffs, partnering with client stakeholders, architects, analysts, and BI developers to deliver trusted datasets.
  • Produce clear consulting deliverables (design docs, diagrams, implementation guides) and enablement materials to support adoption and smooth handoff.

Requirements

Required Qualifications

  • Experience designing and delivering modern analytics platforms (Lakehouse and/or data warehouse patterns) on Azure and/or Microsoft Fabric, including client-facing delivery.
  • Strong SQL skills with experience building transformations and dimensional/analytical models.
  • Hands-on experience with Spark (PySpark/Spark SQL) and/or KQL for data engineering and analytics workloads.
  • AI development experience, including designing agentic solutions (multi-step/tool-using workflows) that leverage enterprise data (e.g., retrieval-augmented generation patterns).
  • Experience building reliable data pipelines, including orchestration, scheduling, error handling, and automated monitoring.
  • Knowledge of data quality practices (validation, reconciliation, testing) and incident triage/root-cause analysis.
  • Working knowledge of Power BI (including semantic modeling concepts) to support end-to-end analytics delivery; Power Apps experience is a plus.
  • Understanding of security and governance concepts for data platforms (access controls, least privilege, compliance, lineage).
  • Strong communication skills, with the ability to present recommendations, align stakeholders, and manage expectations in a client environment.


Preferred Qualifications

  • Direct experience with Microsoft Fabric components (OneLake, Lakehouse, Warehouse, Data Factory in Fabric, Real-Time Analytics/KQL databases), including platform administration, capacity planning, and standards.
  • Experience with Azure services commonly used in data platforms (e.g., Azure Data Factory, Synapse, Databricks, ADLS Gen2, Key Vault).
  • Experience with Microsoft Foundry.
  • Experience with Copilot, Copilot Studio, Azure AI services used to build AI/agentic applications (e.g., Azure OpenAI, Azure AI Search) and related practices like prompt engineering, evaluation, and observability.
  • Experience implementing CI/CD and Infrastructure as Code, including environment strategy and release management for data solutions.
  • Familiarity with semantic modeling and BI enablement patterns (e.g., Power BI datasets, self-service analytics, and governance guardrails).
  • Relevant Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) and experience mentoring/enabling client teams.
