Big Data Lead (Airflow + Python)

Work set-up: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from: Mexico

Offer summary

Qualifications:

  • At least 8 years of experience in data engineering or similar roles.
  • Proven expertise in Apache Airflow, Python, AWS, and Snowflake.
  • Strong SQL skills, including performance tuning and data modeling.
  • Excellent communication skills, with client-facing experience.

Key responsibilities:

  • Design, build, and maintain ELT pipelines using Apache Airflow and Snowflake in AWS.
  • Write modular Python code for scalable data workflows.
  • Deploy container-based services in AWS and set up monitoring.
  • Collaborate with clients and internal teams to ensure clear communication and delivery.

Sequoia Global Services (Startup)
http://www.sequoia-connect.com
11-50 employees

Job description

Our client is a rapidly growing, automation-led service provider specializing in IT, business process outsourcing (BPO), and consulting services. With a strong focus on digital transformation, cloud solutions, and AI-driven automation, they help businesses optimize operations and enhance customer experiences. Backed by a global workforce of over 32,000 employees, our client fosters a culture of innovation, collaboration, and continuous learning, making it an exciting environment for professionals looking to advance their careers.

Committed to excellence, our client serves 31 Fortune 500 companies across industries such as financial services, healthcare, and manufacturing. Their approach is driven by the Automate Everything, Cloudify Everything, and Transform Customer Experiences strategy, ensuring they stay ahead in an evolving digital landscape. 

As a company that values growth and professional development, our client offers global career opportunities, a dynamic work environment, and exposure to high-impact projects. With 54 offices worldwide and a presence in 39 delivery centers across 28 countries, employees benefit from an international network of expertise and innovation. Their commitment to a 'customer success, first and always' philosophy ensures a rewarding and forward-thinking workplace for driven professionals.

We are currently searching for a Big Data Lead (Airflow + Python):

Responsibilities:

  • Design, build, and maintain ELT pipelines using Apache Airflow and Snowflake in an AWS environment.
  • Write modular Python code to support scalable and maintainable data workflows.
  • Deploy container-based services in AWS, including monitoring setup.
  • Collaborate with clients and internal stakeholders, ensuring clear communication and delivery alignment.
  • Serve as an individual contributor, fully hands-on with the mentioned technologies.

Requirements:

  • 8+ years of experience as a Data Engineer or in a similar role.
  • Proven expertise in Apache Airflow, Python, AWS, and Snowflake.
  • Strong SQL skills, including performance tuning and data modeling.
  • Hands-on experience with containerization and deployment in AWS.
  • Excellent communication skills; client-facing experience required.

Languages:

  • Advanced Oral English.
  • Native Spanish.

Notes:

  • Fully remote.


If you meet these qualifications and are pursuing new challenges, start your application on our website to join an award-winning employer. Explore all our job openings | Sequoia Career’s Page: https://www.sequoia-connect.com/careers/


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English, Spanish

Other Skills

  • Communication
