Infrastructure Data Engineer (Remote) – 4197

Remote: Full Remote
Salary: $150,000 – $190,000 yearly
Experience: Senior (5-10 years)

Offer summary

Qualifications:

Bachelor's degree in CS or related field; 5+ years of professional experience; experience with ETL orchestration and workflow management tools; expert knowledge of Python/Java and SQL.

Key responsibilities:

  • Design and implement data infrastructure and pipelines
  • Create ETL frameworks for code quality
  • Ensure accuracy and consistency of data processing
  • Communicate ideas to various stakeholders
HIRECLOUT
51 - 200 Employees

Job description


HireClout

Our client is a leader in the B2B software space, helping businesses and individuals improve their productivity and workflows through a variety of software products. Join a team where you can make an impact and revolutionize what productivity means in the modern world!

What You Will Be Doing

  • Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines for the Data Lake and Data Warehouse
  • Create and implement ETL frameworks to improve code quality and reliability
  • Create and support common design patterns to increase code maintainability
  • Ensure accuracy and consistency of data processing, results, and reporting
  • Design cloud-native data pipelines, automation routines, and database schemas that support predictive and prescriptive machine learning
  • Communicate ideas clearly to various sponsors, business analysts, and technical resources

What You Will Need

  • B.S. in Computer Science, Software Engineering, or a relevant field
  • 5+ years of relevant professional experience
  • 3+ years of experience working in data engineering, business intelligence, or a similar role
  • 2+ years of experience with ETL orchestration and workflow management tools on AWS/GCP (e.g., Airflow, Luigi, Prefect, Dagster, digdag.io, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M)
  • 1+ years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto) and streaming technologies such as Kafka/Flink
  • Expert knowledge of at least one programming language such as Python or Java (Python preferred)
  • Expert knowledge of SQL
  • Familiarity with DevOps and data warehouses such as Snowflake, Netezza, Teradata, AWS Redshift, Google BigQuery, Azure Data Warehouse, or similar
  • Experience with cloud service providers: Microsoft Azure, Amazon AWS
  • Expertise with container orchestration engines (Kubernetes)

Why Us

Benefits And Perks

  • Competitive Salary: $150,000 – $190,000 per year
  • Full Health, Vision, and Dental Coverage

Applicants must be currently authorized to work in the United States on a full-time basis now and in the future.

This position does not offer sponsorship.

REF: JOB-4333

  • Job type: Permanent
  • Date posted: 1 month ago
  • Salary: $150,000 – $190,000 per year

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Soft Skills

  • Strong Communication
  • Team Collaboration
