GCP Data Engineer - Mid Level

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

5-7+ years of experience in Data Engineering; 3+ years of experience with Google Cloud Platform (BigQuery, GCS, Cloud Composer / Airflow); proficiency in Python scripting; advanced SQL skills; experience with Unix/Linux scripts.

Key responsibilities:

  • Design, architect, develop, and support GCP data pipelines
  • Maintain documentation of information assets
  • Estimate work effort and attend team standups
CereCore Information Technology & Services SME
501 - 1000 Employees

Job description

Your missions

Classification: Contract

Contract Length: 12 Months

Location: 100% Remote

CereCore® provides EHR implementations, IT and application support, IT managed services, technical staffing, strategic IT consulting, and advisory services to hospitals and health systems nationwide. Our heritage is in the hallways of some of America’s top-performing hospitals. We have served as leaders in finance, operations, technology, and as clinicians turned power users and innovators. At CereCore, we know firsthand the power that aligned technology can provide in delivering care. As a wholly-owned subsidiary of HCA Healthcare, we are committed to bringing the expertise we have gained as operators to deliver IT services that emphatically address the needs of health systems across the United States. Our team of over 600 clinical and technical professionals has implemented EHR systems in more than 400 facilities and provides managed services support to tens of thousands of health system employees. We work tirelessly to provide healthcare organizations specialized IT services that support the delivery of patient care. The Link to Life-Saving Care.

CereCore is seeking a GCP Data Engineer – Mid Level to join our team remotely. The GCP Data Engineer will integrate a new Trauma Burn Registry (data integration) into the Enterprise Data Warehouse and assist on a separate project to enhance an existing employee staff data integration and export. The projects use GCP, GCS, Cloud Composer / Airflow, BigQuery, and SQL. This role is responsible for Extract, Load, Transform (ELT) data pipeline research, architecture, development, and support. The successful candidate will have excellent verbal and written communication skills and the ability to establish effective working relationships and manage multiple priorities. This position also works with business analysts and project management to review business requirements and produce technical design specs that meet those requirements.

Responsibilities

  • Design, architect, develop, and support GCP data pipelines to extract, load, and transform data between the EDW and vendor platform.
  • Maintain a holistic view of information assets by creating and maintaining artifacts that illustrate how information is stored, processed, and supported (i.e., documentation)
  • Working with the project team, estimate and plan the work effort.
  • Attend daily team standups.

Requirements

  • 5-7+ years of experience in Data Engineering
  • 3+ years of experience with Google Cloud Platform; must have experience with BigQuery, GCS, and Cloud Composer / Apache Airflow
  • 5-7 years of Python experience
  • Proficient with GCP tools: Google Cloud Storage (GCS), BigQuery, and Cloud Composer / Airflow
  • Proficient in developing Python ELT data pipelines
  • Proficient in writing optimized BigQuery SQL data transformation queries and scripts
  • Advanced SQL skills, including the ability to write, tune, and interpret SQL queries
  • Experience writing and maintaining Unix/Linux scripts
  • Experience with GitHub source control and CI/CD workflows

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry :
Information Technology & Services
Spoken language(s):
English
See the job description for which languages are mandatory.

Soft Skills

  • Verbal Communication
  • Relationship Management
  • Prioritization
