Data Engineer

Remote: Full Remote
Contract: 
Experience: Mid-level (2-5 years)
Work from: New Jersey, United States

Offer summary

Qualifications:

  • Experience in Data Warehousing
  • Proficiency in Python, Java, Scala, and SQL
  • Knowledge of cloud services (AWS, GCP, Azure)
  • Expertise in building CI/CD pipelines
  • Familiarity with ETL and data integration tools

Key responsibilities:

  • Build and maintain a Big Data platform
  • Manage data integrations within the technology stack
  • Develop complex data workflows and pipelines
  • Support data migrations and runtime solutions in cloud
  • Document architecture and operations of data systems
MHK TECH INC https://www.mhktechinc.com/
11 - 50 Employees

Job description

What You Will Do:

Assist in building a world-class Big Data platform that can process streams of data and enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market.

Handle large volumes of data and integrate our platform with a range of internal and external systems.

Evaluate new technologies and how they can be applied to data management.

Work with an agile team alongside business stakeholders, testers, architects, and project managers.

Focus on the development of complex integration logic.

Maintain and evaluate the quality of documentation, code, business logic, and non-functional requirements.

Keep non-functional requirements (NFRs) a priority by maintaining code and supporting, restoring, and monitoring performance for every delivery.

*If you join us as a Senior, you will mentor a team and will need to bring previous experience doing so.

What You Need:

Design, implement, and extend core data systems that enable reporting and data visualization

Manage data integrations within the company's domain technology stack

Provide runtime and automation solutions that empower developers to migrate and run workloads in the public cloud

Maintain and support all data workflows

Design, implement, enhance, and support CI/CD frameworks, container solutions, runtime environments, and supporting public cloud infrastructure

Produce and maintain complex data workflows to meet all the quality requirements of the data management policy

Design and document database architecture

Create and maintain the operational data store

Handle ingestion and extraction of data using MDM tools such as Informatica, Amperity, etc.

Expertise in Data Warehousing and familiarity with cloud offerings for warehouses.

Create and maintain diagnostic, alerting, and monitoring code

Build database schemas, tables, procedures, and permissions

Develop database utilities and automated reporting

Prepare written materials for the purpose of documenting activities, providing written reference, and/or conveying information

Full-stack design, development, deployment, and operation of the core data stack, including the data lake, data warehouse, and data pipelines

Experience building data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms

Experience working with a public cloud provider (AWS, GCP, Azure)

Experience building and managing CI/CD pipelines

Experience creating and managing Kubernetes clusters in different types of environments

Familiarity with access controls, secrets management, monitoring, and service discovery in Kubernetes clusters

Experience working with containerized workflows and applications, and driving container adoption among developers and teams

Experience building ingestion and ETL data pipelines, especially via code-oriented systems like Spark, Airflow, Luigi, or similar, and with varied data formats

Experience operating in a secure networking environment (e.g. behind a corporate proxy) is a plus

Expertise in data engineering languages such as Python, Java, Scala, SQL

Familiarity with visualizing data with Power BI, Tableau, and similar tools

Experience creating business requirements documents and/or other application-system-related documents

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Problem Solving
