Senior Data Engineer (TV Platform)

Remote: Full Remote
Salary: 8 - 8K yearly
Experience: Senior (5-10 years)

Offer summary

Qualifications:

5+ years of experience in Software Development/Big Data; proficient in Python, PySpark, Spark, ETL frameworks, and AWS.

Key responsibilities:

  • Build and maintain ETL pipelines
  • Modify or create new application components
  • Develop databases, data collection & analytics systems
  • Conduct code and design reviews
  • Collaborate with team to prioritize business needs
Sigma Software Group (1001 - 5000 employees): https://www.sigma.software

Job description

Company Description

Sigma Software is looking for an experienced Senior Data Engineer to join our growing engineering team. This opportunity is for you if you want to work with a tightly knit team of data engineers solving challenging problems using cutting-edge data collection, transformation, analysis, and monitoring tools in the cloud. 

Sounds like you?  
We are waiting for you on our team! 

CUSTOMER
Our client has superior end-to-end technology, a premium marketplace, and best-in-market advisory services that power the advertising businesses of the world's largest media and entertainment companies, for example Fox, NBC Universal, and Viacom in the USA, and Sky, Channel 4, RTE, and Mediaset in Europe.

PROJECT
We invite you to work with a provider of comprehensive ad platforms for publishers, advertisers, and media buyers. We build and support high-quality data solutions that process terabytes of data on a cloud-native data platform built on AWS.

Job Description
  • Build and maintain ETL pipelines 
  • Modify existing application code or interfaces or build new application components 
  • Analyze requirements, support design, code, test, debug, deploy, and maintain programs and interfaces. Documentation of the work is essential 
  • Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality 
  • Conduct code and design reviews to ensure the high quality of the product  
  • Mentor and guide colleagues and new team members 
  • Collaborate with Data Engineers and Product Managers to prioritize business needs and translate complex product requirements into high-quality, working cloud-native data solutions 
  • Share knowledge with wider engineering teams by doing technical demos 

Qualifications
  • 5+ years of hands-on experience in Software Development and/or Big Data 
  • Excellent knowledge of Python 
  • Proficient in PySpark and a strong understanding of Spark 
  • Understanding of ETL frameworks such as DBT 
  • Experience with orchestration tools such as Airflow 
  • Knowledge of Lakehouse architecture on top of AWS such as Apache Iceberg, Hudi, or Delta Lake 
  • Strong understanding of AWS (IAM, S3, Security groups) 
  • Proficient in Infra as code (Terraform or similar) 
  • Great communication skills: able to communicate clearly about status, blockers, and design 
  • Ability to work independently and collaboratively  
  • At least an Upper-Intermediate level of English 

WILL BE A PLUS

  • Experience dealing with modern aspects of Lakehouse such as Databricks Unity Catalog and Snowflake Iceberg integration 

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Soft Skills

  • Ability to meet deadlines
  • Verbal communication skills
