Data Engineer II

Work set-up: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Experience with data pipelines in cloud environments, preferably AWS.
  • Proficiency in Python or Java programming.
  • Strong SQL skills for data modeling and optimization.
  • Experience with data orchestration tools such as Apache Airflow.

Key responsibilities:

  • Design and implement scalable data pipelines, both streaming and batch.
  • Collaborate with engineers and data analysts to build reliable datasets.
  • Manage and optimize the performance of the SingleStore database.
  • Support the scaling of Ookla's SaaS platform and applications.

Job description

Description
Position at Ookla

Ookla® is a global leader in connectivity intelligence, offering unparalleled network insights through the combined expertise of Speedtest®, Downdetector®, RootMetrics®, and Ekahau®. Ookla's complementary datasets combine crowdsourced and controlled, public and private collection methods, QoS and QoE metrics, and more to unlock correlations and actionable insights, helping organizations optimize networks, enhance digital experiences, and create better-connected experiences for end users.
 
Our team is a group of people brought together through passion and inspired by possibility. We are looking for team members who love solving problems, are motivated by challenges, and enjoy turning clever ideas into exceptional products. When you work for us, you are using Ookla data and insights to advance our mission of better connectivity for all.
 
We are committed to providing you with a flexible work environment where individuality, fun, and talent are all valued equally. If you consider yourself innovative, are adept at collaboration, and care deeply about the work you do, we want to talk!


Ookla is looking for a Data Engineer to help us scale our SaaS platform and applications to take on more customers and data. Day to day, you will work with data pipelines that turn our raw data, delivered through APIs, into enriched datasets that data analysts and data scientists can use.
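
For a concrete flavour of that work, here is a minimal Python sketch of a batch step that pulls raw records from a hypothetical API and writes an enriched dataset for analysts; the endpoint, field names, and output path are illustrative assumptions, not Ookla's actual pipeline.

# Minimal API-to-dataset batch step, for illustration only; the endpoint,
# field names, and output path below are assumptions.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/measurements"  # hypothetical endpoint

def extract(day: str) -> list:
    """Pull one day of raw records from the upstream API."""
    resp = requests.get(API_URL, params={"date": day}, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]  # assumed response shape

def transform(records: list) -> pd.DataFrame:
    """Enrich raw records into an analyst-friendly table."""
    df = pd.DataFrame(records)
    df["download_mbps"] = df["download_kbps"] / 1000   # assumed raw field
    df["measured_at"] = pd.to_datetime(df["timestamp"], utc=True)
    return df[["measured_at", "server_id", "download_mbps"]]

def load(df: pd.DataFrame, day: str) -> None:
    """Write the enriched dataset as Parquet, e.g. to S3 (requires s3fs)."""
    df.to_parquet(f"s3://example-bucket/enriched/date={day}/data.parquet")

if __name__ == "__main__":
    day = "2024-01-01"
    load(transform(extract(day)), day)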



Expectations for Success
  • Participate in the design and help drive the implementation of our data platform
  • Design, implement, and operate streaming and batch pipelines that scale
  • Partner with both engineers and data analysts to build reliable datasets that can be trusted, understood, and used by the rest of the company
  • Help manage our SingleStore database, optimizing performance for queries and tables (a brief sketch follows this list)
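
As a rough illustration of the SingleStore work in the last item above, the Python sketch below connects over the MySQL wire protocol (which SingleStore is compatible with) and shows a columnstore table defined with explicit sort and shard keys, plus an EXPLAIN check on a typical analyst query. The host, credentials, schema, and query are hypothetical.

# Illustrative sketch of inspecting and tuning a SingleStore table from Python.
# SingleStore speaks the MySQL wire protocol, so pymysql works as a client here;
# the host, credentials, table, and column names are all hypothetical.
import pymysql

conn = pymysql.connect(host="singlestore.example.internal",
                       user="etl", password="***", database="analytics")

with conn.cursor() as cur:
    # Columnstore table with an explicit sort key and shard key (assumed schema),
    # so date-range scans and server_id lookups stay fast as data grows.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS measurements (
            measured_at   DATETIME NOT NULL,
            server_id     BIGINT NOT NULL,
            download_mbps DOUBLE,
            SORT KEY (measured_at),
            SHARD KEY (server_id)
        )
    """)
    # EXPLAIN shows whether a typical analyst query is planned as expected.
    cur.execute("EXPLAIN SELECT AVG(download_mbps) FROM measurements "
                "WHERE measured_at >= NOW() - INTERVAL 7 DAY")
    for row in cur.fetchall():
        print(row)

conn.close()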

Requirements:

 

  • Experience working with data pipelines in at least one cloud, preferably AWS
  • Proficiency in Python or Java
  • Comfortable with SQL
  • Experience creating and optimizing data models for data analysts, used in BI tools such as Tableau
  • Embrace a fast-paced start-up environment
  • Prior professional experience building streaming and batch pipelines
  • Familiarity with data orchestration using Apache Airflow (a brief DAG sketch follows this list)
  • Experience with Docker/Kubernetes
  • Know your way around data warehouse solutions such as BigQuery and/or Amazon Redshift
  • Passionate about your work and at home in a fast-paced, international working environment
  • Background or experience in the telecom industry is a plus, but not a requirement
  • Love automation and enjoy monitoring
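
To illustrate the Apache Airflow point above, here is a minimal Airflow 2.x DAG sketch that chains a placeholder extract task and an enrich task. The dag_id, schedule, and task bodies are assumptions for illustration, not part of the role description.

# Minimal Airflow 2.x DAG sketch, for illustration only; the task bodies,
# dag_id, and schedule are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_raw(**context):
    """Placeholder: pull raw records from the upstream API for the run date."""

def build_enriched(**context):
    """Placeholder: turn raw records into the analyst-facing dataset."""

with DAG(
    dag_id="enriched_measurements_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_raw", python_callable=extract_raw)
    enrich = PythonOperator(task_id="build_enriched", python_callable=build_enriched)
    extract >> enrich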

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English
Check out the description to know which languages are mandatory.

Other Skills

  • Collaboration
  • Adaptability
  • Problem Solving
