Snowflake Data Engineer

Work set-up: Full Remote

Offer summary

Qualifications:

  • Proficiency in designing and developing data pipelines and ETL processes using Snowflake features.
  • Experience with data modeling, schema design, and performance optimization within Snowflake.
  • Strong skills in SQL, query tuning, and database performance analysis.
  • Knowledge of cloud technologies such as AWS, including S3, EC2, and data integration tools.

Key responsibilities:

  • Design, develop, and maintain data pipelines for Snowflake data ingestion and transformation.
  • Implement and optimize ETL processes and data models within Snowflake.
  • Monitor and enhance query performance and resource utilization in Snowflake.
  • Work on data integration, security, and compliance aspects like data masking and RBAC.

iSoftTek Solutions Inc
Hrtech: Human Resources + Technology Scaleup, 201 - 500 Employees
https://www.linkedin.com/

Job description

Job Title: Snowflake Data Engineer

Location: VA

Duration: 2 Years

Job Type: C2C

Work Type: Remote

Job Description

Are you a Data Engineer at a large financial institution who is being told by leadership that you are too hands-on, too detail-oriented, or that you think and work like a startup?

If so, we look forward to you joining our Platform Engineering Team.

Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.

We are looking for Engineers who can:

● Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.

● Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks (a sketch of this pattern appears after this list).

● Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.

● Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views.

● Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.

● Implement data synchronization processes to ensure consistency and accuracy of data across different systems.

● Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.

● Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.

● Work on Snowflake modeling: roles, databases, schemas, and ETL tools, with cloud-driven skills.

● Work on SQL performance measurement, query tuning, and database tuning.

● Work with the SQL language and cloud-based technologies.

● Set up the RBAC model at the infrastructure and data levels.

● Work on Data Masking, Encryption, and Tokenization; Data Wrangling; ECreLT; and Data Pipeline orchestration (Tasks).

● Set up AWS S3/EC2, configure external stages, and set up SQS/SNS notifications.

● Perform data integration, e.g. with MSK / Kafka Connect and partner technologies such as Delta Lake (Databricks).
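
Below is a minimal Snowflake SQL sketch of how several of the items above typically fit together: an external stage with an auto-ingest Snowpipe (fed by S3 event notifications via the pipe's SQS channel), a Stream/Task pair for incremental transformation, a clustering key and materialized view for performance, RBAC grants, and a dynamic data masking policy. All object names here (raw_db, analytics_db, transform_wh, s3_int, analyst_role, the trades tables and columns, the S3 bucket) are hypothetical placeholders for illustration, not details from this posting.

-- All names below are hypothetical placeholders; adjust to the actual environment.

-- Landing and curated tables.
CREATE TABLE IF NOT EXISTS raw_db.public.trades_raw (
  trade_id STRING, symbol STRING, price STRING, ts STRING, account_number STRING
);
CREATE TABLE IF NOT EXISTS analytics_db.public.trades_clean (
  trade_id STRING, symbol STRING, price NUMBER(18,4),
  trade_ts TIMESTAMP_NTZ, account_number STRING
);

-- External stage over S3 (assumes an existing storage integration named s3_int).
CREATE OR REPLACE STAGE raw_db.public.trades_stage
  URL = 's3://example-bucket/trades/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe with auto-ingest: S3 event notifications reach the pipe through its SQS channel.
CREATE OR REPLACE PIPE raw_db.public.trades_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_db.public.trades_raw
  FROM @raw_db.public.trades_stage;

-- Stream + Task: incremental transformation of newly loaded rows.
CREATE OR REPLACE STREAM raw_db.public.trades_raw_stream
  ON TABLE raw_db.public.trades_raw;

CREATE OR REPLACE TASK analytics_db.public.transform_trades
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.trades_raw_stream')
AS
  INSERT INTO analytics_db.public.trades_clean
  SELECT trade_id, symbol, TO_NUMBER(price, 18, 4), TO_TIMESTAMP_NTZ(ts), account_number
  FROM raw_db.public.trades_raw_stream;

ALTER TASK analytics_db.public.transform_trades RESUME;

-- Performance: clustering key plus a materialized view for a common aggregate.
ALTER TABLE analytics_db.public.trades_clean CLUSTER BY (trade_ts, symbol);

CREATE OR REPLACE MATERIALIZED VIEW analytics_db.public.daily_notional AS
  SELECT symbol, DATE_TRUNC('day', trade_ts) AS trade_day, SUM(price) AS notional
  FROM analytics_db.public.trades_clean
  GROUP BY symbol, DATE_TRUNC('day', trade_ts);

-- RBAC: a read-only reporting role scoped to the analytics schema.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE  ON DATABASE analytics_db                     TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA   analytics_db.public              TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.public  TO ROLE analyst_role;

-- Dynamic data masking: hide account numbers from all but privileged roles.
CREATE OR REPLACE MASKING POLICY analytics_db.public.mask_account
  AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('ACCOUNTADMIN', 'COMPLIANCE_ROLE') THEN val
       ELSE '***MASKED***' END;

ALTER TABLE analytics_db.public.trades_clean
  MODIFY COLUMN account_number SET MASKING POLICY analytics_db.public.mask_account;

In this pattern, Snowpipe loads are triggered by S3 event notifications delivered over SQS, and the Stream/Task pair handles incremental transformation inside Snowflake without an external scheduler.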

We work closely with

★ Data Wrangling

★ ETL

★ Talend

★ Jasper

★ Java

★ Python

★ Unix

★ AWS

★ Data Warehousing

★ Data Modeling

★ Database Migration

★ ECreLT

★ RBAC model

★ Data migration

Please share your resume with srikar@isoftteckinc.com or call 707-435-3471.

Required profile

Industry: Hrtech: Human Resources + Technology
Spoken language(s): English
