Bonapolia

Middle+/Senior Data Platform Engineer (Snowflake)

Requirements

  • 5+ years of data engineering experience
  • Python for scripting, API development, and pipeline creation
  • Apache Airflow for pipeline orchestration (Dagster or Prefect acceptable)
  • AWS services experience (Glue, Lambda) and deploying/maintaining production workloads

Roles & Responsibilities:

  • Build data ingestion pipelines integrating AI tools and internal platforms into Snowflake
  • Maintain and harden Snowflake infrastructure, standardizing schemas and tables
  • Deploy work through CI/CD pipelines into Airflow or AWS Glue
  • Collaborate with product managers and engineers to identify data needs and manage access requests

Job description


We are looking for a Middle+/Senior Data Platform Engineer (Snowflake):

🗣 Language Proficiency: Upper-Intermediate

🧾 Employment type: Full time

🌍 Candidate Location: Poland

🕐 Working Time Zone: CET

🧭 Planned Work Duration: 12+ months

👥 Customer Description:

Our Client is a leading global management consulting company recognized for delivering high-impact solutions across industries.

The company works with large global enterprises across finance, media, technology, and public sector organizations, providing advanced platforms and consulting services.

🧩 Project Description:

This project is part of a federated data delivery initiative within a secure enterprise technology ecosystem. The focus is on building and maintaining robust data pipelines that collect and process data from multiple enterprise systems and cloud platforms.

The objective is to enable leadership to gain actionable insights aligned with strategic goals and to support product and service teams in targeting appropriate user groups while measuring the effectiveness of AI-driven productivity initiatives.

⚙️ Project Phase: ongoing

👨‍💻 Project Team: Program Manager, 2 Product Managers, 2 Engineers, User Researcher, Designer, Analytics Lead

🤝 Soft Skills:

• Highly proactive with the ability to independently identify stakeholders and drive tasks to completion

• Strong stakeholder management skills with the ability to engage diverse roles across technical and product teams

• Curious mindset with a focus on continuous improvement and challenging existing processes

• Excellent communication skills for effective collaboration with cross-functional teams

• Strong time management with a high level of organization and reliability

💡 Hard Skills / Must Have:

• 5+ years of data engineering experience

• Python for scripting, API development, and pipeline creation

• Apache Airflow — for pipeline orchestration; Dagster or Prefect accepted as alternatives

• AWS services — especially Glue, Lambda; experience deploying and maintaining production workloads

• Apache Spark — for distributed processing, particularly within AWS Glue

• Snowflake — preferred data warehouse; Redshift or BigQuery accepted if concepts transfer cleanly

• CI/CD pipelines — GitHub Actions or similar; this is how pipelines and scripts are deployed to Airflow and Glue

• API experience — consuming third-party APIs and building internal APIs with Python

• Git / GitHub — version control, branching strategy, pull request workflow

• PostgreSQL or other OLTP databases — for operational data access and integration

✨ Hard Skills / Nice to Have:

• Snowflake Cortex — increasingly used within the team

• Scala for distributed data processing tasks

• Agentic frameworks — LangChain, Pydantic ecosystem, or similar

• Snowflake access and role management — RBAC, column-level security (ABAC)
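As a sketch of the Snowflake RBAC work mentioned above, the snippet below generates GRANT statements for a functional read role, following Snowflake's usual role-layering pattern. All database, schema, and role names are illustrative placeholders, not the client's actual objects:

```python
def rbac_grants(database: str, schema: str, role: str,
                access: str = "read") -> list[str]:
    """Generate GRANT statements for a functional role.

    Covers USAGE on the database/schema plus SELECT on existing and
    future tables; a "write" role adds DML privileges on top.
    """
    fq_schema = f"{database}.{schema}"
    grants = [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
    ]
    if access == "write":
        grants.append(
            f"GRANT INSERT, UPDATE, DELETE ON ALL TABLES "
            f"IN SCHEMA {fq_schema} TO ROLE {role};"
        )
    return grants


for statement in rbac_grants("ANALYTICS", "CORE", "ANALYST_READ"):
    print(statement)
```

Granting on FUTURE TABLES is what keeps access requests from piling up every time a new table lands in a schema, which matters given the access-request management listed in the responsibilities below.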

📌 Responsibilities and Tasks:

• Build data ingestion pipelines integrating AI tools and internal platforms into Snowflake

• Maintain and harden the existing Snowflake infrastructure — schemas and tables that grew organically without data engineering input — and bring them up to standard

• Deploy work through CI/CD pipelines into Airflow or AWS Glue

• Manage and process access requests

• Collaborate proactively with product managers and engineers to identify data needs
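The schema-standardization task above can be sketched with a small Python helper that normalizes organically-grown column names. The target convention (SNAKE_CASE upper case, common for Snowflake) is an assumption here; the actual standard would be a team decision:

```python
import re


def standardize_column(name: str) -> str:
    """Normalize a column name to SNAKE_CASE upper case.

    Handles camelCase splits and stray punctuation, both typical of
    schemas that grew without data engineering input.
    """
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)  # split camelCase
    name = re.sub(r"[^0-9A-Za-z]+", "_", name)           # punctuation -> _
    return name.strip("_").upper()


print(standardize_column("createdAt"))      # CREATED_AT
print(standardize_column("user id (raw)"))  # USER_ID_RAW
```

In practice a helper like this would feed ALTER TABLE ... RENAME COLUMN statements rather than being applied ad hoc, so the standardization is repeatable and reviewable in CI/CD.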

🧪 Technology Stack: Python, Snowflake, Apache Airflow, Apache Spark, Scala, PostgreSQL, AWS

📩 Ready to Join?
We look forward to receiving your application and welcoming you to our team!
