
Senior Python Developer

Requirements:

  • Expert proficiency in Python 3.7+ for data pipeline development, including ETL/ELT pipelines with Airflow 2
  • Advanced SQL skills
  • Deep understanding of multi-layered data warehouse architectures (Raw, Staging, Data Marts)
  • Russian language proficiency (mandatory)

Roles & Responsibilities

  • Design, develop, and maintain end-to-end data pipelines from diverse source systems to the data warehouse
  • Build scalable and cost-efficient infrastructure for storing and processing large volumes of data
  • Implement and improve monitoring, alerting, and automated data quality validation systems
  • Contribute to system design and data architecture evolution with a focus on performance, resilience, and security

Job description

Senior Data Engineers (Python)
• Tech Level: Senior
• Language Proficiency: Intermediate
• Location: Europe
• Employment type: Full time
• Working Time Zone: CET
• Start: ASAP
• Planned Work Duration: 3+ months

Customer Description:
The customer is a global mobility and urban services platform that enables users to access transportation and on-demand services through a digital marketplace.
The platform supports ride booking, intercity travel, delivery, and task-based services, focusing on flexibility, transparency, and direct interaction between users and service providers.

Project Description:
The project focuses on building and evolving a large-scale data ecosystem that supports analytics and core business operations.
The team is responsible for developing reliable, high-performance data pipelines and scalable data infrastructure to ensure timely and trustworthy data delivery across the organization.

Project Phase: Ongoing
Project Team: On the client’s side

Soft Skills:

• Strong self-management and ability to work independently while meeting deadlines
• Proactive, ownership-driven mindset with a results-oriented approach
• Clear and effective communication skills, including collaboration within distributed teams
• Collaborative attitude, openness to feedback, and focus on shared goals
• Strong problem-solving abilities, particularly in ambiguous or evolving contexts
• High attention to detail and commitment to code quality
• Adaptability to changing priorities and new technologies

Hard Skills / Must Have:
• Expert proficiency in Python 3.7+ for data pipeline development
• Advanced SQL skills
• Deep understanding of multi-layered data warehouse architectures (Raw, Staging, Data Marts)
• Extensive experience building ETL/ELT pipelines using Airflow 2
• Strong knowledge of object-oriented programming, design patterns, and clean architecture principles
• Professional experience with cloud-based object storage services
• Hands-on experience with cloud data warehouses
• Mandatory experience with Git for version control and collaborative development
• Russian language is a must
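To illustrate the multi-layered warehouse architecture named above (Raw → Staging → Data Marts), here is a minimal sketch using Python's built-in sqlite3 as a stand-in for a real cloud warehouse. The table names, columns, and cleaning rules are hypothetical, not taken from the customer's actual schema:

```python
import sqlite3

# In-memory stand-in for a cloud data warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw layer: data landed as-is from a source system, untyped.
cur.execute("CREATE TABLE raw_rides (id TEXT, fare TEXT, city TEXT)")
cur.executemany(
    "INSERT INTO raw_rides VALUES (?, ?, ?)",
    [("r1", "12.50", "Berlin"), ("r2", "bad", "Berlin"), ("r3", "8.00", "Prague")],
)

# Staging layer: typed and cleaned; rows with unparsable fares filtered out.
cur.execute("""
    CREATE TABLE stg_rides AS
    SELECT id, CAST(fare AS REAL) AS fare, city
    FROM raw_rides
    WHERE fare GLOB '[0-9]*'
""")

# Data mart: aggregated, analytics-ready view per city.
cur.execute("""
    CREATE TABLE mart_city_revenue AS
    SELECT city, SUM(fare) AS revenue, COUNT(*) AS rides
    FROM stg_rides
    GROUP BY city
""")

rows = cur.execute(
    "SELECT city, revenue, rides FROM mart_city_revenue ORDER BY city"
).fetchall()
print(rows)  # [('Berlin', 12.5, 1), ('Prague', 8.0, 1)]
```

In production the same layering would live in a cloud warehouse, with each CREATE TABLE step typically orchestrated as a separate Airflow task.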

Hard Skills / Nice to Have:
• Experience with high-load or low-latency data systems and message brokers
• Practical knowledge of DevOps practices and CI/CD automation
• Experience with Infrastructure as Code, particularly Terraform
• Understanding of containerization and orchestration technologies such as Docker and Kubernetes
• Experience participating in testing, acceptance of developed functionality, and investigation of data quality issues

Responsibilities and Tasks:

• Design, develop, and maintain end-to-end data pipelines from diverse source systems to the data warehouse
• Build scalable and cost-efficient infrastructure for storing and processing large volumes of data
• Implement and improve monitoring, alerting, and automated data quality validation systems
• Contribute to system design and data architecture evolution with a focus on performance, resilience, and security
• Optimize existing data pipelines and warehouse queries for efficiency and cost reduction
• Collaborate closely with data analysts to clarify requirements and transformation logic
• Perform root cause analysis of data quality incidents and implement long-term preventive solutions
• Create and maintain technical documentation covering schemas, pipeline architecture, and data lineage
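The responsibilities above include automated data quality validation and root cause analysis of data quality incidents. As a rough, self-contained sketch of what such a check might look like (the rules, field names, and thresholds here are hypothetical examples, not the client's actual logic):

```python
# Minimal data-quality validation sketch: completeness and range checks
# on a batch of ride records. All names and rules are illustrative.

def validate_batch(rows, required=("id", "fare"), min_fare=0.0):
    """Split a batch into valid rows and (index, reason) issue tuples."""
    valid, issues = [], []
    for i, row in enumerate(rows):
        missing = [k for k in required if row.get(k) is None]
        if missing:
            issues.append((i, f"missing fields: {missing}"))
        elif row["fare"] < min_fare:
            issues.append((i, f"fare below {min_fare}: {row['fare']}"))
        else:
            valid.append(row)
    return valid, issues

batch = [
    {"id": "r1", "fare": 12.5},
    {"id": "r2", "fare": -3.0},   # fails the range check
    {"id": None, "fare": 5.0},    # fails the completeness check
]
valid, issues = validate_batch(batch)
print(len(valid), len(issues))  # 1 2
```

In a pipeline, a check like this would typically run as a dedicated Airflow task after each load, with the issue list feeding the monitoring and alerting systems mentioned above.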

Technology Stack: Python, SQL, Spark, Apache Beam, Cloud platforms, Apache Kafka, Kubernetes, Managed messaging services, Debezium, Airflow 2, Terraform, GitHub, CI/CD tools, Jira



📩 Ready to Join?
We look forward to receiving your application and welcoming you to our team!
