
Cloud Architect - Data (Azure) – Power BI Solution Expert

Requirements

  • 5+ years of hands-on experience in data engineering or data architecture roles, with senior-level expertise in data platform/lakehouse design and a proven track record transforming traditional data warehouses into Lakehouse/Delta Lake architectures.
  • Deep expertise with Azure Lakehouse technologies (Azure Databricks, Azure Data Factory, Azure Data Lake Storage, Event Hub, Azure Synapse, Azure Logic Apps) and strong skills in Delta Lake, Spark, Spark Structured Streaming, SQL, and Python; experience with data modeling approaches (Kimball, Inmon, Data Vault).
  • Strong architectural ownership mindset, ability to define architecture principles and standards, oversee prototype/reference architectures, ensure production readiness, and enforce security and regulatory compliance in scalable IT platforms.
  • Experience enabling self-service BI and analytics (Power BI) development; collaboration with stakeholders in agile environments; and relevant Azure certifications or intent to pursue (e.g., DP-203, DP-900, AZ-900, AI-900).

Roles & Responsibilities:

  • Design and evolve the enterprise Azure Lakehouse architecture to ensure scalability, performance, and long-term sustainability.
  • Lead the transformation of classic Data Warehouse environments into modern Lakehouse/Delta Lake architectures, owning architecture from concept through production rollout and continuous optimization.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.
  • Design and oversee data ingestion, processing, and streaming pipelines using Azure Databricks, Azure Data Factory, Event Hub, and Azure Logic Apps.
  • Build data processing solutions using Delta Lake, Spark, Structured Streaming, SQL, and Python.
  • Define data modeling strategies across lake and warehouse layers.
  • Collaborate with engineering teams to implement reference architectures, prototypes, and proof-of-concepts.
  • Enable and coach data engineering teams.
  • Ensure platform designs comply with security, data protection, and regulatory requirements.
  • Support advanced analytics and self-service BI.
  • Work with product owners, governance, and business stakeholders in agile delivery.
  • Contribute to continuous improvement of platform architecture and operating models.

Job description

Company Description

Our brand Deutsche Telekom IT Solutions Slovakia entered the life of the Košice region in 2006 under the name T-Systems Slovakia and has been inextricably linked with the region ever since, becoming one of the founding members of Košice IT Valley. We have grown from scratch into the second largest employer in the eastern part of the country, with more than 3,900 employees. Our goal is to proactively find new ways to improve and to continuously transform into a company providing innovative information and communication technology services.

Job Description

Purpose

The Senior Data Analytics & Data Engineering Architect is a key member of the Common Data Intelligence (CDI) Hub, responsible for designing, evolving, and governing T-Systems’ central Azure Lakehouse architecture. The role focuses on transforming traditional data warehouse landscapes into modern, cloud-native data platforms that enable enterprise-scale analytics, advanced data use cases, and governed self-service BI.

 

Team / Project Description

The Common Data Intelligence (CDI) Hub is the central data and analytics organization responsible for defining and operating enterprise-wide data platforms and standards. The team works across business domains and international engineering units to deliver scalable, secure, and future-ready Azure-based analytics solutions.

 

WHAT WILL YOU DO?

  • Design and evolve the enterprise Azure Lakehouse architecture, ensuring scalability, performance, and long-term sustainability.
  • Lead the transformation of classic Data Warehouse environments into modern Lakehouse and Delta Lake-based architectures.
  • Take architectural ownership from concept through implementation, production rollout, and continuous optimization.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.
  • Design and oversee data ingestion, processing, and streaming pipelines using Azure Databricks, Azure Data Factory, Event Hub, and Azure Logic Apps.
  • Build and optimize data processing solutions using Delta Lake, Spark, Spark Structured Streaming, SQL, and Python.
  • Define data modeling strategies across data lake and warehouse layers, applying Kimball, Inmon, and Data Vault methodologies.
  • Collaborate with engineering teams to implement reference architectures, prototypes, and proof-of-concepts.
  • Enable and coach data engineering teams toward autonomous, high-quality solution development.
  • Ensure platform designs comply with enterprise security, data protection, and regulatory requirements.
  • Support advanced analytics, self-service BI, and reporting use cases across the organization.
  • Work closely with product owners, architects, governance, and business stakeholders in agile delivery environments.
  • Contribute to continuous improvement of platform architecture, development processes, and operating models.
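For candidates less familiar with the dimensional modeling named above, a Kimball-style star schema can be sketched in plain Python. All table and column names here are hypothetical illustrations; on the platform itself these would live as Delta tables queried with Spark SQL rather than Python dictionaries:

```python
# Star schema sketch: dimension tables keyed by surrogate keys,
# and a fact table holding foreign keys plus additive measures.
dim_product = {
    1: {"name": "Router X1", "category": "Hardware"},
    2: {"name": "Cloud VPN", "category": "Services"},
}

# Fact table: one row per sale; measures are additive across dimensions.
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "revenue": 1200.0},
    {"product_key": 2, "date_key": 20240101, "revenue": 300.0},
    {"product_key": 2, "date_key": 20240401, "revenue": 450.0},
]

def revenue_by(attr: str) -> dict:
    """Aggregate the additive 'revenue' measure by a product attribute."""
    totals: dict = {}
    for row in fact_sales:
        key = dim_product[row["product_key"]][attr]
        totals[key] = totals.get(key, 0.0) + row["revenue"]
    return totals

print(revenue_by("category"))  # {'Hardware': 1200.0, 'Services': 750.0}
```

The same fact/dimension separation underlies both the warehouse layer and the Power BI semantic models this role supports; Inmon and Data Vault differ mainly in how the integrated layer feeding such marts is structured.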

Qualifications

YOU WILL SUCCEED IF YOU:

  • have a degree in (Business) Informatics, Business Administration, or equivalent practical experience
  • have senior-level experience in Data Platform / Lakehouse Architecture (Expert)
  • have 5+ years of hands-on experience in data engineering or data architecture roles
  • have proven experience transforming traditional Data Warehouse landscapes into Lakehouse / Delta Lake architectures
  • have strong architectural ownership mindset, covering design, implementation, and production readiness
  • have deep hands-on experience with Azure Lakehouse technologies (Advanced to Expert)
  • have strong expertise in Azure Databricks, Azure Data Factory, Azure Data Lake Storage, Event Hub, Azure Synapse, and Azure Logic Apps
  • have advanced skills in Delta Lake, Spark, Spark Structured Streaming, SQL, and Python
  • have deep understanding of data lake storage architectures and modern data processing paradigms
  • have solid knowledge of relational and multidimensional databases
  • have strong experience with data warehouse modeling approaches (Kimball, Inmon, Data Vault)
  • have experience designing scalable IT platforms, processes, and operating models
  • have the ability to define architecture principles, standards, and best practices at enterprise level
  • have experience implementing prototypes, evaluations, and reference solutions
  • have strong security and compliance awareness, including handling business-critical and personal data in regulated environments
  • are comfortable working in agile environments using Scrum or Kanban
  • have an open, communicative working style and enjoy cross-functional collaboration
  • demonstrate a solution-oriented, analytical, and structured mindset

 

POSSIBLE SPECIALIZATION

  • Experience with Reporting, Analytics, and Dashboarding solutions
  • Hands-on experience with Power BI development
  • Exposure to Microsoft Power Platform (Power Automate, Power Apps)
  • Azure certifications such as:
    • DP-203 (Data Engineering on Microsoft Azure)
    • DP-900 (Azure Data Fundamentals)
    • AZ-900 (Azure Fundamentals)
    • AI-900 (Azure AI Fundamentals)
    • DP-100 / DP-300 / DP-420
  • Experience working in large, international enterprise data & analytics organizations

 

 

Additional Information

WHY SHOULD YOU CHOOSE US?

We believe in a balance between work and personal life. An attractive and extensive work-life balance portfolio guarantees lasting motivation for employees and thus a better quality of life, promotes physical and mental well-being, and contributes to a positive work environment. All of this aims to provide more freedom in reconciling work, career growth, private life, and individual lifestyle. We therefore offer our employees over 25 different benefits to improve their personal and professional lives in these areas:

  • Financial benefits
  • Benefits with focus on learning and development
  • Benefits with focus on health and sport
  • Benefits with focus on family and work-life balance
  • Other benefits

For more information about our benefits, see Benefits.

Salary

Final salary is negotiable.

We offer a base salary depending on the seniority level and previous experience of the candidate. In addition to the base salary, we provide a variable component and other financial benefits. The base salary will not be lower than €2,200 gross.

Additional information

* Please note that remote working is only possible within Slovakia due to European taxation regulations.
