Scicom Infrastructure Services

Big Data Technical Project Manager

Roles & Responsibilities

  • Lead end-to-end delivery of big data and analytics projects, including platform implementations, migrations, and optimizations, with scope, timelines, milestones, and risk management; provide regular status updates to stakeholders and leadership
  • Provide technical oversight for data platforms (Databricks, Snowflake) and collaborate with engineers to design scalable pipelines and architectures, overseeing ingestion, ETL/ELT, storage, and governance
  • Coordinate across data engineering, analytics, infrastructure, and business teams; translate business needs into technical requirements and drive cross-functional execution; lead meetings and sprint planning; manage third-party vendors as needed
  • Establish project governance, ensure compliance with data security, privacy, and regulatory requirements, and drive continuous improvement in data delivery processes; support Agile/Waterfall or hybrid methodologies

Requirements:

  • 5+ years of experience in technical project management leading large-scale data or analytics initiatives
  • Strong understanding of big data platforms (Databricks, Snowflake, Hadoop/Spark) and data pipelines (ETL/ELT) as well as data warehousing/lakehouse architectures
  • Experience with cloud environments (AWS, Azure, or GCP) and the ability to manage multiple workstreams in complex environments
  • Excellent communication, stakeholder management, and leadership skills, with the ability to translate business needs into technical requirements

Job description

Role Overview
We are seeking a Technical Project Manager with deep expertise in big data platforms to lead and deliver enterprise-scale data and analytics initiatives. This role requires a strong combination of technical fluency and project management leadership, with hands-on understanding of platforms such as Databricks, Snowflake, or similar modern data technologies.

The ideal candidate will be responsible for driving complex data projects from concept through execution, ensuring alignment across engineering, analytics, and business teams while maintaining high standards of performance, scalability, and delivery.


Key Responsibilities


Project Leadership

  • Lead end-to-end delivery of big data and analytics projects, including platform implementations, migrations, and optimizations
  • Define project scope, timelines, milestones, and deliverables
  • Develop and manage detailed project plans, ensuring on-time and within-scope delivery
  • Identify risks, dependencies, and constraints, and proactively drive mitigation strategies
  • Provide regular status updates to stakeholders and executive leadership

Technical Oversight (Big Data Platforms)

  • Serve as the technical liaison for data platform initiatives involving Databricks, Snowflake, or similar technologies
  • Collaborate with data engineers and architects to design scalable data pipelines and architectures
  • Oversee data ingestion, transformation (ETL/ELT), and storage strategies
  • Ensure adherence to best practices for performance optimization, cost management, and data governance
  • Validate technical solutions against business and functional requirements

Cross-Functional Coordination

  • Coordinate across data engineering, analytics, infrastructure, and business teams
  • Translate business needs into technical requirements and execution plans
  • Facilitate project meetings, sprint planning, and technical discussions
  • Manage third-party vendors and consulting partners as needed

Governance & Delivery Excellence

  • Establish and maintain project governance, standards, and documentation
  • Ensure compliance with data security, privacy, and regulatory requirements
  • Support Agile, Waterfall, or hybrid delivery methodologies
  • Drive continuous improvement across data delivery processes

Required Qualifications

  • 5+ years of experience in technical project management
  • Strong understanding of big data platforms such as:
    • Databricks
    • Snowflake
    • Hadoop, Spark, or similar ecosystems
  • Experience managing large-scale data or analytics projects
  • Solid understanding of:
    • Data pipelines (ETL/ELT)
    • Data warehousing and lakehouse architectures
    • Cloud environments (AWS, Azure, or GCP)
  • Ability to manage multiple workstreams and priorities in complex environments
  • Excellent communication, stakeholder management, and leadership skills

Preferred Qualifications

  • Experience with Delta Lake, data lakehouse architecture, or real-time streaming
  • Familiarity with orchestration tools (Airflow, Azure Data Factory, etc.)
  • Knowledge of data governance, data quality, and metadata management frameworks
  • Certifications such as PMP, Scrum Master, Databricks, or Snowflake
  • Experience in enterprise or regulated environments

What Success Looks Like

  • Data initiatives are delivered on time, within scope, and with measurable business impact
  • Data platforms are scalable, efficient, and aligned with enterprise strategy
  • Stakeholders are engaged, informed, and confident in delivery progress
  • Risks and dependencies are proactively managed
