
Senior Data Engineer, AVP

Roles & Responsibilities

  • Design, build, and maintain scalable, reliable PySpark/DBT/BigQuery data pipelines on Google Cloud Platform to process high-volume transaction data for regulatory and internal compliance monitoring
  • Implement robust data quality frameworks and monitoring to ensure data accuracy, completeness, and timeliness in critical transaction monitoring systems
  • Contribute to DevOps capabilities to maximize automation of applications
  • Collaborate across TDI areas (Cloud Platform, Security, Data, Risk & Compliance) to create optimal solutions, increase reuse, establish best practices, and share knowledge

Requirements

  • Hands-on data engineering with Java/Scala/Kotlin in Apache Spark, Dataflow/Apache Beam, or Apache Flink
  • Proficiency in Python (PySpark or Dataflow/Apache Beam) and SQL-based tooling (DBT) with Google BigQuery or similar data warehouses
  • Experience building and maintaining CI/CD pipelines (e.g., Jenkins, TeamCity, GitHub Actions) and strong software design/architecture knowledge (reliability, scalability, observability)
  • Experience operating in a secure, enterprise hybrid cloud environment within a large regulated organization and collaborating with globally distributed teams

Job Description

Job Title: Senior Data Engineer, AVP

Location: Pune, India

Role Description

Our Technology, Data and Innovation (TDI) strategy focuses on strengthening engineering expertise, introducing an agile delivery model, and modernising the bank's IT infrastructure through long-term investment and the adoption of cloud computing.


You will be working in the Transaction Monitoring and Data Controls team designing, implementing, and operationalising Java components.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy

  • Best in class leave policy
  • Gender neutral parental leaves
  • 100% reimbursement under childcare assistance benefit (gender neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive Hospitalization Insurance for you and your dependents
  • Accident and Term life Insurance
  • Complimentary health screening for employees aged 35 years and above

Your key responsibilities

  • Design, build, and maintain scalable and reliable PySpark/DBT/BigQuery data pipelines, predominantly on Google Cloud Platform (GCP), to process high-volume transaction data for regulatory and internal compliance monitoring.
  • Implement robust data quality frameworks and monitoring solutions to ensure the accuracy, completeness, and timeliness of data within our critical transaction monitoring systems.
  • Contribute to DevOps capabilities to ensure maximum automation of our applications
  • Collaborate across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimal solutions for the business, increase re-use, establish best practices, and share knowledge
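As an illustration of the data-quality monitoring mentioned above, here is a minimal sketch in plain Python (the production toolset per the posting would be PySpark/DBT; the record layout, function name, and thresholds below are hypothetical, not from the posting):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical transaction records as they might land in a staging table.
transactions = [
    {"txn_id": "t1", "amount": 120.50, "booked_at": datetime.now(timezone.utc)},
    {"txn_id": "t2", "amount": None,   "booked_at": datetime.now(timezone.utc)},
    {"txn_id": "t3", "amount": 75.00,
     "booked_at": datetime.now(timezone.utc) - timedelta(days=3)},
]

def quality_report(rows, completeness_field="amount", freshness_days=1):
    """Compute simple completeness and timeliness ratios for a batch."""
    total = len(rows)
    # Completeness: fraction of rows with the mandatory field populated.
    complete = sum(1 for r in rows if r[completeness_field] is not None)
    # Timeliness: fraction of rows booked within the freshness window.
    cutoff = datetime.now(timezone.utc) - timedelta(days=freshness_days)
    fresh = sum(1 for r in rows if r["booked_at"] >= cutoff)
    return {"completeness": complete / total, "timeliness": fresh / total}

report = quality_report(transactions)
# A monitoring job would alert when either ratio falls below a threshold.
```

In a real pipeline these checks would run as DBT tests or as a PySpark validation stage, with the ratios exported to an observability platform rather than returned in-process.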

Your skills and experience

  • Expert hands-on data engineering using at least one of:
    • Java/Scala/Kotlin with a toolset such as Apache Spark, Dataflow/Apache Beam, or Apache Flink
    • Python with a toolset such as PySpark or Dataflow/Apache Beam
    • SQL-based development using DBT
  • Professional experience with at least one data warehousing technology (ideally Google BigQuery), including knowledge of partitioning, clustering, and cost/performance optimization strategies.
  • Hands-on experience writing and maintaining DevOps pipelines in at least one CI/CD tool such as TeamCity, Jenkins, or GitHub Actions.
  • Experience contributing to software design and architecture, including meeting non-functional requirements (e.g., reliability, scalability, observability, testability) and understanding relevant architecture styles and their trade-offs, e.g., data warehouse, ETL, ELT, monolith, batch, incremental loading vs. stateless processing
  • Experience navigating and engineering within a secure, enterprise hybrid cloud environment within a large, regulated, and complex technology landscape
  • Experience working with a globally distributed team, requiring remote interaction across locations, time zones, and diverse cultures, together with excellent verbal and written communication skills
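To make the partitioning/clustering and cost-optimization point above concrete, here is a back-of-the-envelope sketch. The DDL, table and column names, and the data volumes are illustrative assumptions, not from the posting; BigQuery's on-demand pricing bills by bytes scanned, so date-partition pruning directly reduces query cost:

```python
# Illustrative BigQuery DDL (table/column names are hypothetical):
ddl = """
CREATE TABLE monitoring.transactions (
  txn_id     STRING,
  account_id STRING,
  amount     NUMERIC,
  booked_at  TIMESTAMP
)
PARTITION BY DATE(booked_at)
CLUSTER BY account_id
"""

def scanned_gib(total_gib, days_retained, days_queried, pruned=True):
    """Rough on-demand scan estimate: with partition pruning, a query with
    a date filter only scans the partitions it touches; without pruning,
    the whole table is scanned."""
    if not pruned:
        return total_gib
    return total_gib * days_queried / days_retained

# Querying 7 days out of 2 years of data scans ~1% of the table.
full = scanned_gib(1000, 730, 7, pruned=False)   # whole table: 1000 GiB
pruned = scanned_gib(1000, 730, 7)               # ~9.6 GiB
```

Clustering by a high-cardinality filter column (here `account_id`) further reduces scanned bytes within each partition, though BigQuery only reports that saving after the query runs.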

How we’ll support you

  • Training and development to help you excel in your career
  • Coaching and support from experts in your team
  • A culture of continuous learning to aid progression
  • A range of flexible benefits that you can tailor to suit your needs

About us and our teams

Please visit our company website for further information:

https://www.db.com/company/company.html

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.

Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.

We welcome applications from all people and promote a positive, fair and inclusive work environment.
