Python with BigQuery

Work set-up: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from:

Offer summary

Qualifications:

  • Strong proficiency in Python for scripting and data processing.
  • Experience with BigQuery, SQL query optimization, and handling large datasets.
  • Hands-on experience with Google Cloud Platform services such as BigQuery, Cloud Storage, and IAM.
  • Knowledge of data pipeline development, version control, and CI/CD workflows.

Key responsibilities:

  • Design and develop scalable data pipelines using Python on GCP.
  • Write and optimize SQL queries for BigQuery data processing.
  • Automate ETL/ELT workflows and monitor data pipelines using GCP tools.
  • Collaborate with cross-functional teams to ensure data quality and performance.

Black and White Business Solutions Private Ltd
Human Resources, Staffing & Recruiting SME
https://www.blackwhite.in
51 - 200 Employees

Job description

Company Name :

Black and White Business Solutions Private Ltd

Job Title :

Python with BigQuery

Qualification :

Any graduation

Experience :

6 to 8 years

Must Have Skills :

  • Python Proficiency: Strong knowledge of Python for scripting, data processing, and automation.

  • BigQuery Expertise: Writing optimized SQL queries, handling large datasets, and familiarity with BigQuery performance tuning (a short illustrative sketch follows this list).

  • Google Cloud Platform (GCP): Hands-on experience with GCP services, especially BigQuery, Cloud Storage, IAM, and Cloud Functions.

  • Data Pipeline Development: Building and maintaining ETL/ELT workflows using Python and orchestration tools (e.g., Cloud Composer/Airflow).

  • Version Control and CI/CD: Experience with Git, DevOps workflows, and deploying code/data pipelines in a cloud environment.
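
For illustration only, the indented sketch below shows the kind of work the Python and BigQuery items above describe: running a parameterized query with the google-cloud-bigquery client library. The project, dataset, and table names are placeholders invented for the example, not details from this posting.

    from google.cloud import bigquery

    # Assumes application-default credentials; the project, dataset,
    # and table names are placeholders.
    client = bigquery.Client(project="my-gcp-project")

    sql = """
        SELECT user_id, COUNT(*) AS event_count
        FROM `my-gcp-project.analytics.events`
        WHERE event_date >= @start_date
        GROUP BY user_id
        ORDER BY event_count DESC
        LIMIT 100
    """

    # Parameterized queries avoid unsafe string concatenation.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        ]
    )

    # Run the query and iterate over the result rows.
    for row in client.query(sql, job_config=job_config).result():
        print(row.user_id, row.event_count)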


Good to Have Skills :

  • Data Modeling and Warehousing Concepts: Experience in schema design, star/snowflake schema, and dimensional modeling.

  • Google Cloud SDK / BigQuery API: Experience with BigQuery client libraries and programmatic data handling using APIs.

  • Airflow / Cloud Composer: Experience scheduling and monitoring workflows using Airflow on GCP.

  • Monitoring and Logging: Using tools like Cloud Monitoring and Cloud Logging (formerly Stackdriver) for alerting and performance monitoring.

  • Cloud Security & Cost Optimization: Understanding of GCP IAM roles, data encryption, and cost-effective data handling in BigQuery (see the cost-estimation sketch after this list).
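
As a small illustration of the cost-optimization point above, the sketch below uses a BigQuery dry run to estimate bytes scanned and a maximum_bytes_billed guard to cap spend. The project and table names are placeholders, not details from this posting.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # placeholder project
    sql = (
        "SELECT user_id FROM `my-gcp-project.analytics.events` "
        "WHERE event_date = '2024-01-01'"
    )

    # Dry run: estimate how much data the query would scan without running it.
    dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    estimate = client.query(sql, job_config=dry_cfg)
    print(f"Would scan {estimate.total_bytes_processed / 1e9:.2f} GB")

    # Hard guard: the job fails instead of billing more than ~1 GB.
    guarded_cfg = bigquery.QueryJobConfig(maximum_bytes_billed=10**9)
    rows = client.query(sql, job_config=guarded_cfg).result()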


Roles and Responsibilities :

  • Design, develop, and maintain scalable data pipelines using Python on Google Cloud Platform.

  • Write complex and optimized SQL queries to process data in BigQuery.

  • Automate data extraction, transformation, and loading (ETL/ELT) workflows (an orchestration sketch follows this list).

  • Collaborate with data scientists, analysts, and product teams to define data needs and implement solutions.

  • Monitor and troubleshoot data pipelines using GCP-native tools.

  • Implement data validation, testing, and version control using CI/CD pipelines.

  • Ensure data governance, quality, and compliance across systems.

  • Optimize BigQuery storage and query performance for cost efficiency.
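
To make the orchestration responsibilities above concrete, here is a minimal sketch of a daily Cloud Composer/Airflow DAG that rebuilds an aggregate table in BigQuery. It assumes Airflow 2.x with the Google provider package installed; the DAG ID, project, and table names are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    # One daily task that rebuilds an aggregate table from raw events.
    with DAG(
        dag_id="daily_events_elt",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        aggregate_daily_events = BigQueryInsertJobOperator(
            task_id="aggregate_daily_events",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE "
                        "`my-gcp-project.analytics.daily_user_counts` AS "
                        "SELECT event_date, COUNT(DISTINCT user_id) AS users "
                        "FROM `my-gcp-project.analytics.events` "
                        "GROUP BY event_date"
                    ),
                    "useLegacySql": False,
                }
            },
        )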


Location :

Bangalore

CTC Range :

25 LPA

Notice period :

Immediate joiners

Shift Timings :


Mode of Interview :

Virtual

Mode of Work :


Mode of Hire :


Note :




Required profile

Experience

Level of experience: Senior (5-10 years)
Industry :
Human Resources, Staffing & Recruiting
