
DBT (Data Build Tool)

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • B.E. / B.Tech. / MCA or equivalent
  • 4-7 years of experience in Data Engineering
  • Experience with Talend, DBT, S3, and Snowflake
  • Knowledge of ETL infrastructure and data modeling
  • Familiarity with AWS data services and big data technologies

Key responsibilities:

  • Design and implement Snowflake-based analytics solutions
  • Assist with requirements definition and source data analysis
  • Build and maintain data pipelines and integration processes
  • Collaborate with data teams to resolve technical issues
  • Act as a technical leader in Agile/Lean model
Indiglobe IT Solutions
11 - 50 Employees

Job description

Responsibilities:
  • Assist in the design and implementation of Snowflake-based analytics solutions (data lake and data warehouse) on AWS
  • Lead requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines
  • Develop Snowflake deployment and usage best practices
  • Help educate the rest of the team on the capabilities and limitations of Snowflake
  • Build and maintain data pipelines that adhere to enterprise architecture principles and guidelines
  • Design, build, test, and maintain data management systems
  • Work in sync with internal and external team members, such as data architects, data scientists, and data analysts, to resolve technical issues
  • Act as a technical leader within the team
  • Work in an Agile/Lean model
  • Deliver quality deliverables on time
  • Translate complex functional requirements into technical solutions

EXPERTISE AND QUALIFICATIONS
Essential Skills, Education and Experience
  • B.E. / B.Tech. / MCA or equivalent degree, with 4-7 years of experience in Data Engineering
  • Creation and maintenance of optimal data pipeline architecture for data ingestion and processing
  • Creation of the infrastructure needed for ETL jobs across a wide range of data sources, using Talend, DBT, S3, and Snowflake
  • Experience with data storage technologies such as Amazon S3, SQL, and NoSQL
  • Technical awareness of data modeling
  • Experience working with stakeholders in different time zones

Good to have
  • AWS data services development experience
  • Working knowledge of big data technologies
  • Experience collaborating with data quality and data governance teams
  • Exposure to reporting tools such as Tableau
  • Apache Airflow, Apache Kafka (nice to have)
  • Payments domain knowledge
  • In-depth understanding of domains such as CRM and accounting
  • Exposure to regulatory reporting

Other skills
  • Good communication skills
  • Team player
  • Problem solver
  • Willingness to learn new technologies, share ideas, and assist other team members as needed
  • Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Problem Solving
  • Analytical Thinking
  • Verbal Communication Skills
