Solutions Architect - BigQuery/GCP

Unlimited holidays · extra holidays · extra parental leave · long remote period allowed
Remote: Full Remote
Experience: Expert & Leadership (>10 years)

Offer summary

Qualifications:

  • 10+ years as a Data Engineer
  • Proficient in SQL and complex queries
  • Experience with GCP services and tools
  • Deep understanding of data warehousing
  • Familiarity with scripting languages

Key responsibilities:

  • Design and build ETL/ELT data pipelines
  • Develop and optimize data models and schemas
  • Implement performance tuning and optimization
  • Collaborate with team on data solutions
  • Establish data quality and governance frameworks
Lumenalta (formerly Clevertech) · SME · https://lumenalta.com/
501 - 1000 Employees

Job description

Role Overview:

As a Data Engineer, you will design, develop, and optimize data pipelines and infrastructure on GCP to enable advanced analytics and reporting solutions. You will work closely with business stakeholders to deliver robust, scalable solutions that support business intelligence and machine learning initiatives.


This hands-on role requires a deep understanding of BigQuery and data engineering best practices, along with the ability to translate business requirements into technical solutions. If you are passionate about working with big data and cloud technologies, we would love to hear from you!


Key Responsibilities:

  • Data Pipeline Development: Design and build ETL/ELT data pipelines using BigQuery and other GCP services to ingest, process, and transform large datasets from multiple sources.
  • Data Modeling & Architecture: Develop and optimize data models and schemas to support analytics, reporting, and machine learning requirements.
  • Performance Optimization: Implement best practices for performance tuning, partitioning, and clustering to optimize data queries and reduce costs in BigQuery (see the sketch after this list).
  • Data Integration & Transformation: Collaborate with data scientists and analysts to design data solutions that integrate seamlessly with BI tools, machine learning models, and third-party applications.
  • Data Quality & Governance: Establish and enforce data quality standards, data governance frameworks, and security policies for data storage and access on GCP.
  • Automation & Monitoring: Automate workflows using Cloud Composer, Cloud Functions, or other orchestration tools to ensure reliable and scalable data pipelines.
  • Documentation & Knowledge Sharing: Create comprehensive documentation for data pipelines, workflows, and processes. Share best practices and mentor junior data engineers.
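
As a hedged illustration of the partitioning and clustering practice named above (not part of the listing itself), here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and schema names are hypothetical placeholders:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project ID

    # Hypothetical schema for an event-level analytics table.
    schema = [
        bigquery.SchemaField("event_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ]

    table = bigquery.Table("my-project.analytics.events", schema=schema)

    # Partition by day on the event timestamp so queries can prune whole
    # partitions, and cluster on customer_id so filters on that column
    # scan fewer bytes; these are the two main levers for BigQuery cost control.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_ts",
    )
    table.clustering_fields = ["customer_id"]

    client.create_table(table, exists_ok=True)

Queries that filter on event_ts and customer_id then read only the relevant partitions and clustered blocks rather than the full table.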

Required Qualifications:

  • 10+ years of experience working as a Data Engineer, with a focus on GCP and BigQuery.
  • Strong proficiency in SQL and experience developing complex queries, stored procedures, and views in BigQuery (a query sketch follows this list).
  • Hands-on experience with GCP services such as Cloud Storage, Dataflow, Cloud Composer, and Cloud Functions.
  • Deep understanding of data warehousing concepts, dimensional modeling, and building data marts.
  • Experience with ETL/ELT tools like Apache Beam, Dataflow, or dbt.
  • Familiarity with scripting languages such as Bash, Python, or JavaScript for automation and integration.
  • Proven track record of managing teams and projects.
  • Ability to design and implement scalable, reliable, and cost-effective BigQuery solutions tailored to specific business use cases.
  • Experience creating reference architectures, data pipelines, and best practice guides.
  • Ability to translate complex technical concepts into business terms.
  • Proven experience working with both technical and non-technical stakeholders to define requirements and deliver high-quality solutions.
  • Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
  • GCP Professional Data Engineer Certification is a plus.
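
As a sketch of the SQL proficiency described above (illustrative only), a parameterized window-function query run through the BigQuery Python client; the table name and start date are hypothetical:

    import datetime

    from google.cloud import bigquery

    client = bigquery.Client()

    # Rank customers by total spend since a given date.
    query = """
    SELECT
      customer_id,
      SUM(amount) AS total_amount,
      RANK() OVER (ORDER BY SUM(amount) DESC) AS spend_rank
    FROM `my-project.analytics.events`  -- hypothetical table
    WHERE event_ts >= TIMESTAMP(@start_date)
    GROUP BY customer_id
    """

    # Named query parameters avoid string interpolation and SQL injection.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "start_date", "DATE", datetime.date(2024, 1, 1)
            ),
        ]
    )

    for row in client.query(query, job_config=job_config).result():
        print(row.customer_id, row.total_amount, row.spend_rank)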

Preferred Skills:

  • Experience with machine learning on GCP using Vertex AI or AI Platform.
  • Knowledge of data governance and security best practices in a cloud environment.
  • Experience working with real-time streaming data and tools like Pub/Sub or Kafka (see the sketch below).
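
Since the streaming item above may benefit from a concrete picture, here is a minimal Pub/Sub sketch using the google-cloud-pubsub Python client; the project, topic, and subscription names are hypothetical placeholders:

    import json
    from concurrent.futures import TimeoutError

    from google.cloud import pubsub_v1

    # Publish a JSON-encoded event; publish() returns a future that
    # resolves to the server-assigned message ID.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "events-topic")
    event = {"event_id": "abc123", "amount": 42.5}
    future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
    print("Published message", future.result())

    # Pull messages from a subscription on a background thread and
    # acknowledge each one so Pub/Sub stops redelivering it.
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "events-sub")

    def callback(message):
        print("Received:", message.data)
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=10)  # run the demo for ten seconds
    except TimeoutError:
        streaming_pull.cancel()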

Required profile

Experience

Level of experience: Expert & Leadership (>10 years)
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
  • Social Skills
