
Senior GCP Data Engineer at Vigil

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Deep understanding of GCP services
  • Experience with big data technologies
  • Strong analytical and problem-solving skills
  • Experience in Agile development environments

Key responsibilities:

  • Design and build data pipelines on GCP
  • Collaborate with product management teams
Vigil (Scaleup) · https://www.vigil.global/
51-200 employees

Job description

SUMMARY:

As a Senior Data Engineer, you will be responsible for designing, building, and maintaining efficient and reliable data pipelines on the Google Cloud Platform (GCP). This role requires a strong background in GCP services and a proven track record of creating effective data solutions that align with business requirements.

We are looking for candidates who are as excited about pushing their own development as they are about advancing our technology stack.

Our core developers are passionate about software engineering and enjoy developing their skills and abilities in a friendly and supportive environment of keen learners.

WHAT WILL YOU BE DOING:

You will join our engineering team as a valued member of a collaborative, autonomous, cross-functional team. You will help with the following:

  • Engineer robust data pipelines for extracting, transforming, and loading (ETL) data into GCP
  • Design and implement scalable and efficient data models on GCP
  • Develop and maintain data architecture, ensuring optimal performance and reliability
  • Utilize GCP's big data services such as BigQuery, Dataflow, and Dataprep for large-scale data processing (a minimal pipeline sketch follows this list)
  • Implement and maintain data security measures following industry best practices
  • Collaborate with product management teams to understand their requirements
  • Manage and maintain changes to tracking specifications based on product and feature teams' requirements
  • Communicate your needs clearly and responsibly
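
To give a flavour of the day-to-day work, below is a minimal sketch of the kind of streaming ETL pipeline this role involves, assuming Apache Beam running on Dataflow, reading events from Pub/Sub and writing them to BigQuery. The project, topic, table, and field names are hypothetical placeholders, not Vigil's actual setup.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a Pub/Sub message into a BigQuery-ready row (illustrative fields only).
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event["user_id"], "event_type": event["type"]}


def run():
    # Streaming pipeline; the Dataflow runner and GCP project are supplied via CLI flags.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/example-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event_type:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

In practice a pipeline like this would be submitted to Dataflow with the usual --runner=DataflowRunner, --project, and --region flags.
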
WHAT WE ARE LOOKING FOR:
  • Deep understanding of GCP as a platform, its capabilities and services
  • Experience with Dataflow, BigQuery, Pub/Sub, Cloud Functions, and Memorystore / Redis
  • Strong experience with database management and optimization, including experience with SQL and NoSQL databases. This includes an in-depth understanding of data modelling, storage, and efficient querying techniques
  • Strong analytical and problem-solving skills to resolve complex technical issues
  • Strong experience with CI/CD pipelines
  • Experience with Java, Python
  • Experience working in Agile development environments, particularly in a Scrum framework
  • Strong English communication skills, both written and verbal
AWESOME BUT NOT REQUIRED:
  • Terraform
  • Kafka
WHAT’S IN IT FOR YOU?
  • Be part of our collegial environment, where responsibility and authority are shared equally amongst colleagues, and help shape our company culture
  • A culture in which we don’t criticise failure but ensure we learn from our mistakes
  • An Agile environment where your ideas are welcome
  • The possibility to grow and experience different projects
  • Ongoing Training & Mentoring
  • The possibility to travel

ATTENTION! THIS POSITION IS OPEN TO CANDIDATES BASED IN PORTUGAL OR BRAZIL ONLY

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
