
Senior/Lead Data Software Engineer

Extra holidays · Extra parental leave
Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

Over 6 years of software/data engineering experience; hands-on experience in Python/Java with cloud development, ETL pipelines, Apache Spark, Python automation libraries, API design and development, RDS, DynamoDB, MySQL, and microservice development.

Key responsibilities:

  • Design applications and write code
  • Collaborate with team
Leo Burnett · Marketing & Advertising · 5001 - 10000 employees · https://leoburnett.com/

Job description

Company Description

At Publicis Groupe, we are looking for a Senior Data Engineer fluent in English to join Publicis Global Delivery, the outstanding platform we created to become a global interconnected network and provide offshore and nearshore solutions for our partner and sister companies' businesses worldwide.

We are a never-sleeping machine of creation that continuously grows and mutates to become a more efficient and collaborative system: a cross-media transformation agent, based in Argentina, Colombia, Costa Rica and Mexico, that provides centralized expertise across all Publicis Global Services' capabilities to enable consistent, standardized delivery across Media, Production, Commerce, Content, Data & Technology.

Job Description

By joining our team you'll have the amazing opportunity to work on the “Epsilon People Cloud” project, the Big Data and AI core platform of Publicis Groupe. You'll collaborate on the ongoing evolution of this platform, with the unique opportunity to take part in a big data streaming project and work with a wide variety of cutting-edge technologies in a flexible, fast-paced and multicultural environment that is always growing and mutating.

Day-to-day, your role includes:

The Principal Data/Software Engineer is responsible for designing applications, writing code, performing code reviews, and developing technical documentation, and is a key contributor to their team and project. A PSE is self-driven, makes sure that the applications and the tech team are working as expected, collaborates closely with architects and product managers, ensures that the team maintains a high level of performance, and builds and strengthens the relationship with the client.

The PSE will give presentations to clients, facilitating collaboration with other groups such as Business and Technology throughout all phases of a project’s lifecycle. This PSE should aim to become a subject matter expert in cloud technologies.

What you’ll do:

  • Write Python/Java microservices for data processing, following good coding practices
  • Design and build Big Data ETL code and pipelines using Python, Spark, and AWS
  • Integrate products from data projects into APIs
  • Architect, design and maintain data pipelines through the lifecycle of the product
  • Optimize and monitor existing data pipelines and services using cloud infrastructure
  • Understand and manage massive data-stores
  • Manage batch and real time streaming applications
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Collaborate with the architects to raise the team's technical ability in software and data engineering
  • Work with stakeholders to define the road map and the tasks for the team
  • Investigate, procure, and ramp up on new technologies
  • Work in teams and collaborate with others to clarify requirements, supporting each other through training, code reviews, and pair programming
  • Maintain concise and clear documentation and best practices on projects
  • Provide proactive feedback on policies and procedures when an opportunity for improvement exists
  • Enjoy being challenged and solving complex problems on a daily basis

Qualifications
  • Over 6 years of software/data engineering experience
  • Applied experience in Python/Java with Cloud Development (preferably in AWS)
  • Applied experience in building ETL pipelines and Apache Spark
  • Applied experience in Python automation libraries and test scripts
  • Applied experience in API Design and development
  • Applied experience in RDS, DynamoDB, MySQL
  • Experience with Automation frameworks preferred
  • Experience in Microservice development
  • Familiarity with IAM roles/policies, AWS EC2, AWS SDK, AWS EMR, API Gateway, Docker
  • Applied experience with a client-side MVC (Model-View-Controller) framework
  • Bonus points: familiarity with Databricks APIs, Terraform, and Airflow

Additional Information

Benefits

  • Access to Prepaid Medical Plan
  • Flexible schedule
  • 100% Remote work
  • 14 business days of vacation
  • English lessons
  • Discounts on courses, trainings and universities
  • Access to E-Learning platforms
  • Technical trainings & soft skills development
  • Certification programs
  • Level up program
  • Engagement activities and events
  • A mentor who'll coach you to develop your professional career!

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Verbal Communication Skills
  • Open Mindset
  • Leadership
