Who we are…and what we do
LIPTON Teas and Infusions is the world's largest tea business, with world-class, purpose-driven brands such as Lipton, PG tips and Pukka. As Lipton T&I we are united in one purpose: growing a world of wellbeing through the regenerative power of plants.
In July 2022, CVC Capital Partners Fund VIII acquired full ownership of the LIPTON Teas and Infusions (formerly ekaterra) business from its previous owner, Unilever. As a standalone entity with a dedicated single-category focus, Lipton T&I is even better positioned to lead the tea industry, delivering higher growth and value, and a greater impact on the wider world. With 11 production factories on four continents and tea-growing estates in three countries, LIPTON Teas and Infusions is a profitable and growing business whose brands reach hundreds of millions of consumers across more than 100 countries.
At LIPTON Teas and Infusions, we work alongside people who put consumer love at the heart of every decision: diverse minds who celebrate new ideas and share our values and our commitment to the wellbeing of all. In return, we create an environment that gives our people space and freedom, where they can grow as leaders. A connected community where ideas can thrive, where you explore new challenges, and where you learn all you need to master your field, and even more about yourself.
Be part of this amazing blend. Join our collective and help us grow a better world of wellbeing and a better you.
Job Title: Senior Data Engineer
Job Status: Full-time, Permanent
Work location: Remote, Poland
Travel: Limited
Overview
As a Senior Data Engineer at LIPTON Teas and Infusions, you will play a pivotal role in shaping and maintaining our advanced data infrastructure. Your expertise in ETL processes, data ingestion, and processing will ensure the reliability and robustness of our data foundation. You will collaborate closely with BI developers, data scientists, product owners, and senior stakeholders to create impactful data & analytics solutions. Additionally, you will implement best practices to enhance data quality and manage metadata, while also leveraging software engineering principles such as version control and CI/CD to drive continuous improvement. Your innovative approach and attention to cost management in our cloud environment will support our mission of data excellence.
Key Responsibilities:
- Leading the development of data infrastructure to extract, transform, and load (ETL/ELT) data.
- Supporting BI developers and data scientists to build data & analytics solutions.
- Working with product owners and (senior) stakeholders to clarify business requirements.
- Ingesting and processing enterprise-wide data through our medallion architecture (an illustrative sketch follows this list).
- Collaborating closely with the Data Engineering Lead to build and enhance the data engineering framework.
- Managing and optimizing periodic data refreshes through data pipelines (scheduled jobs).
- Designing and implementing data management practices to improve data quality and metadata.
- Leveraging software engineering best practices such as version control (Git) and CI/CD (DevOps).
- Continuously strengthening our data foundation through experimentation and innovation.
- Monitoring the cost associated with the cloud environment, data processing, and data computation.
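
To give a flavour of the hands-on work, below is a minimal, illustrative sketch of a bronze-to-silver step in a medallion architecture on Azure Databricks with PySpark and Delta Lake. The paths, table names, and columns are hypothetical examples, not LIPTON's actual setup.

```python
# Illustrative sketch only: paths and column names are hypothetical.
# On Azure Databricks, a SparkSession named `spark` is provided by the runtime.
from pyspark.sql import functions as F

# Bronze -> Silver: read raw ingested records, apply basic quality rules,
# deduplicate, and persist the result as a Delta table in ADLS.
bronze_df = spark.read.format("delta").load("/mnt/adls/bronze/sales_orders")

silver_df = (
    bronze_df
    .dropDuplicates(["order_id"])                       # remove duplicate ingestions
    .filter(F.col("order_date").isNotNull())            # simple data-quality rule
    .withColumn("processed_at", F.current_timestamp())  # lineage/metadata column
)

(silver_df.write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/adls/silver/sales_orders"))
```

In practice, a step like this would run as a scheduled job orchestrated through tools such as Azure Data Factory, which is where the responsibilities around periodic data refreshes and cloud cost monitoring come in.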
Skills and Experience:
- 5+ years of experience in data engineering or a similar role.
- At minimum a bachelor’s degree in computer science, information technology, or a related field.
- In-depth experience with the Microsoft Azure Platform (Data Factory, ADLS, etc.).
- Advanced knowledge of Azure Databricks and Delta Lake.
- Knowledgeable in the medallion architecture and other best practices.
- Experience with relational databases (e.g., SQL Server).
- Advanced coding skills (e.g., PySpark and SQL).
- Experience with version control (Git) and unit testing (see the illustrative test sketch after this list).
- Ability to translate business requirements to code.
- Proficient in articulating technical solutions to non-technical stakeholders.
- Experience with agile teams and Scrum.
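
As an illustration of the unit-testing expectation above, here is a minimal pytest sketch for a PySpark transformation; the function, column names, and values are hypothetical examples, not part of an actual codebase.

```python
# Illustrative sketch only: the transformation and names are hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F


def flag_valid_orders(df):
    """Example transformation under test: marks rows that have an order_date."""
    return df.withColumn("is_valid", F.col("order_date").isNotNull())


@pytest.fixture(scope="module")
def spark():
    # Local Spark session so the test runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_flag_valid_orders(spark):
    df = spark.createDataFrame(
        [("A-1", "2024-01-31"), ("A-2", None)],
        ["order_id", "order_date"],
    )
    result = {r["order_id"]: r["is_valid"] for r in flag_valid_orders(df).collect()}
    assert result == {"A-1": True, "A-2": False}
```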