
Data Engineer (3-6 years)_Bharti_PW

Remote: Full Remote

Offer summary

Qualifications:

3-6 years of experience; knowledge of data warehouses and ETL.

Key responsibilities:

  • Curate and manipulate data from large sources
  • Develop data products and ETL processes
  • Create scalable data models
  • Collaborate with different teams
  • Manage data pipelines' SLAs
CodersBrain Management Consulting SME https://www.codersbrain.com/
201 - 500 Employees

Job description

Greetings from Coders Brain Technology Pvt. Ltd.
Coders Brain is a global leader in services, digital, and business solutions that partners with its clients to simplify, strengthen, and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise, and a global network of innovation and delivery centers.

Experience required: 3-6 Years
Location: Remote
Key skills: Data warehouses (Snowflake, Big Query, Hadoop/Hive, or equivalent), data modeling, ETL/ELT design, programming languages such as Python, Java, or Scala, and advanced SQL
Designation: Sr. Associate

Roles and Responsibilities:
● Providing the organisation's data consumers with high-quality data sets through the curation, consolidation, and manipulation of data from a wide variety of large-scale (terabyte and growing) sources.
● Building first-class data products and ETL processes that interact with terabytes of data
on leading platforms such as Snowflake and Big Query.
● Developing and improving Foundational datasets by creating efficient and scalable data
models to be used across the organization.
● Working with our Data Science, Analytics, CRM, and Machine Learning teams.
● Responsible for the data pipelines' SLAs and dependency management.
● Writing technical documentation for data solutions and presenting at design reviews.
● Resolving data pipeline failures and implementing anomaly detection.

Technical skills:
Must Have:
● Advanced SQL
● Snowflake/Big Query/Hadoop/Hive or similar tools
● Data modeling and ETL/ELT design
● Programming experience in a language such as Python, Java, or Scala is good to have
● Experience with technologies and tooling associated with relational databases (PostgreSQL or equivalent) and data warehouses (Snowflake, Big Query, Hadoop/Hive, or equivalent).
● Experience with writing and optimizing SQL. Experience with data modeling and ETL/ELT design, including defining SLAs, performance measurement, tuning, and monitoring.
● Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
● Experience with programming languages such as Python, Java and/or Scala
● Knowledge of cloud data warehouse concepts

Soft Skills:
● Excellent verbal and written communication skills
● Strong interpersonal skills and the ability to work in a fast-paced and dynamic
environment.
● Ability to make progress on projects independently and enthusiasm for solving difficult
problems

Required profile

Experience

Industry: Management Consulting
Spoken language(s): English

Other Skills

  • Non-Verbal Communication
  • Social Skills
