
Azure Data Engineer II - Agentic AI Platform

Requirements

  • Experience designing and building robust batch and streaming data pipelines using Azure Data Factory, Databricks, and Delta Lake
  • Hands-on expertise with Azure data governance features such as Microsoft Purview, Unity Catalog, and RBAC for data quality, lineage, and security
  • Proficiency in CI/CD automation and infrastructure as code using Azure DevOps or GitHub Actions, with Bicep or Terraform
  • Strong collaboration skills with data scientists and product owners to translate requirements into scalable data solutions; willingness to engage in continuous learning and training


Job description

We digitize decisions with data—would you like to get involved?

We shape our customers' data-driven future with scalable, secure, and automated Azure platforms. As a Professional Data Platform Engineer, you will be responsible for complex data pipelines: from zero-ETL replication of operational data (mirroring) to Delta Lake-based lakehouses. Our vision: self-service data access for all departments, supported by DataOps and governance.


Tasks & Responsibilities:

  • Planning, development, and maintenance of robust batch and streaming pipelines with Azure Data Factory, MS Fabric, and Databricks

  • Integration of new Azure features such as mirroring (zero ETL), Delta Live Tables, and Unity Catalog for centralized governance

  • Automation of deployments and testing using CI/CD (Azure DevOps or GitHub Actions) and infrastructure as code (Bicep/Terraform)

  • Ensuring data quality, lineage, and security—using Microsoft Purview and role-based access control

  • Collaborating with data scientists, product owners, and customers to translate requirements into scalable data solutions

  • Evaluating new services such as Lakeflow or Microsoft Fabric for production use
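To give a flavor of the pipeline and data-quality responsibilities above, here is a minimal, generic sketch of a batch quality gate in plain Python. The field names and rules are hypothetical illustrations only; in this role the equivalent logic would run on Databricks/Delta Lake (e.g. as Delta Live Tables expectations) with lineage tracked in Microsoft Purview:

```python
from dataclasses import dataclass

# Hypothetical schema for illustration -- not from the job posting.
REQUIRED_FIELDS = {"order_id", "amount", "currency"}

@dataclass
class QualityResult:
    valid: list
    rejected: list

def validate_batch(records) -> QualityResult:
    """Split a batch into valid rows and rejects, mimicking a
    data-quality gate run before loading into a curated layer."""
    valid, rejected = [], []
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        bad_amount = (
            not isinstance(rec.get("amount"), (int, float))
            or rec.get("amount", 0) < 0
        )
        if missing or bad_amount:
            rejected.append(rec)   # quarantined for inspection
        else:
            valid.append(rec)      # promoted to the curated layer
    return QualityResult(valid=valid, rejected=rejected)

batch = [
    {"order_id": "A1", "amount": 19.9, "currency": "EUR"},
    {"order_id": "A2", "amount": -5.0, "currency": "EUR"},  # negative amount
    {"order_id": "A3", "currency": "EUR"},                  # missing field
]
result = validate_batch(batch)
print(len(result.valid), len(result.rejected))  # → 1 2
```

The same split-and-quarantine pattern is what Delta Live Tables expectations express declaratively; the sketch only shows the underlying idea.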

What we offer you

  • Flexible working: Trust-based working hours, hybrid working, and remote working possible (residence in Germany required)

  • State-of-the-art technology stack: Work with Fabric, Delta Lake, Databricks, Mirroring—ideal for tech-savvy juniors

  • Targeted professional development: dedicated working hours for training, certifications, and mentoring

  • Innovation space: Opportunity to test new tools and frameworks and develop proofs of concept

  • Diversity & inclusion: We welcome all applicants and promote an inclusive environment; your ideas are important to us

  • Real influence: You can help shape technology decisions and contribute your ideas directly to product roadmaps

If you want to take on responsibility, enjoy working with the latest Azure technology, and value an open, learning-oriented team culture, we look forward to receiving your application!
