
Azure Data Integration Expert

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 6+ years’ experience in data pipelines
  • Proficient in ETL tools (Informatica, SSIS)
  • Experience with Azure services (Databricks, Data Factory)
  • Deep knowledge of data warehouse architecture
  • BE/BTech in Computer Science or Engineering

Key responsibilities:

  • Lead data extraction, transformation, and load processes
  • Design high complexity data models according to standards
  • Implement data integration solutions for structured/unstructured data
  • Assess data quality and execute cleansing when necessary
  • Integrate and ingest real-time machine data from various sources
Sequoia Connect (Scaleup, 11-50 employees): https://www.sequoia-connect.com/

Job description


Your missions

Our client is a fast-growing automation-led next-generation service provider delivering excellence in IT, BPO, and consulting services. They are driven by a combination of robust strategies, passionate teams, and a global culture rooted in innovation and automation.

Our client’s digital offerings have helped clients achieve operational excellence and customer delight. Their focus lies in taking a leadership position in helping clients attain customer intimacy as their competitive advantage. They are now on a journey of transforming the experiences of their customers’ customers by leveraging their industry-leading delivery and execution model, built around the strategy of Automate Everything™, Cloudify Everything™, and Transform Customer Experiences™.

Powering our client’s complex technology solutions and services is Bottom-Up Disruption, a crowdsourcing initiative that brings innovation and improvement to everyday complexities and, ultimately, grows the client’s business. The client’s digitally empowered workforce of 19,833 employees represents various nationalities and lives the company’s philosophy of ‘customer success, first and always’. The company reported 2020 global revenue of USD 845.04 million.

We are currently searching for an Azure Data Integration Expert:

Responsibilities

  • Leads the delivery of data extraction, transformation, and load processes from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, applying strong technical capabilities and a sense of database performance
  • Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to recommend the right model for each requirement
  • Batch Processing - capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period of time (a minimal batch sketch follows this list)
  • Data Integration (Sourcing, Storage, and Migration) - capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.), including data models, storage requirements, and the migration of data from one system to another
  • Data Quality, Profiling, and Cleansing - capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate it
  • Stream Systems - capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format and at any quality (a streaming sketch also follows this list)
  • Excellent interpersonal skills to build a network across a variety of departments in the business, understand data, and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects
  • Understands the difference between on-premises and cloud-based data integration technologies
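
As a concrete illustration of the batch-processing and cleansing capabilities above, here is a minimal PySpark sketch, assuming a Databricks-style environment (the posting names Databricks and Python); the paths and column names are hypothetical placeholders, not details from the role:

    # Hedged sketch: batch ETL with profiling and cleansing in PySpark.
    # All paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

    # Extract: read one collected batch of raw transactions.
    raw = spark.read.option("header", True).csv("/mnt/raw/transactions/2024-01-01/")

    # Profile: count nulls per column to assess quality against simple parameters.
    raw.select(
        [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in raw.columns]
    ).show()

    # Cleanse: drop rows missing the business key; normalize the amount column.
    cleansed = (
        raw.dropna(subset=["transaction_id"])
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )

    # Transform and load: aggregate into a consumable model and write it out.
    daily = cleansed.groupBy("customer_id").agg(F.sum("amount").alias("daily_total"))
    daily.write.mode("overwrite").format("delta").save("/mnt/curated/daily_totals/")

The same extract-profile-cleanse-load shape applies whether the pipeline is built in Informatica, SSIS, or hand-written Spark.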
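
For the stream-systems capability, a comparable sketch using Spark Structured Streaming (again an assumption on our part; the landing directory, schema, and checkpoint path are hypothetical):

    # Hedged sketch: ingesting machine events with Spark Structured Streaming.
    # Source directory, schema, and sink/checkpoint paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import (StructType, StructField, StringType,
                                   DoubleType, TimestampType)

    spark = SparkSession.builder.appName("stream-ingest-sketch").getOrCreate()

    event_schema = StructType([
        StructField("machine_id", StringType()),
        StructField("value", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Ingest: treat newly arriving JSON files as an unbounded stream.
    events = spark.readStream.schema(event_schema).json("/mnt/landing/machine-events/")

    # Validate and transform: keep well-formed readings, average per machine per minute.
    per_minute = (
        events.filter(F.col("value").isNotNull())
              .withWatermark("event_time", "10 minutes")
              .groupBy(F.window("event_time", "1 minute"), "machine_id")
              .agg(F.avg("value").alias("avg_value"))
    )

    # Load: continuously append windowed results to a curated sink.
    query = (
        per_minute.writeStream.outputMode("append")
                  .format("delta")
                  .option("checkpointLocation", "/mnt/checkpoints/machine-events/")
                  .start("/mnt/curated/machine-metrics/")
    )
    query.awaitTermination()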

Requirements

  • 6+ years’ experience developing large-scale data pipelines in cloud and on-premises environments
  • Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend
  • Experience in Azure: Databricks, Data Factory, Azure Integration, etc.
  • Deep knowledge of data warehouse/data mart architecture and modelling
  • Ability to define and develop data ingestion, validation, and transformation pipelines
  • Deep knowledge of distributed data processing and storage
  • Deep knowledge of working with structured, unstructured, and semi-structured data
  • Working experience with ETL/ELT patterns
  • Extensive experience applying analytics, insights, and data mining to commercial, real-world problems
  • Technical experience in at least one programming language, preferably Java, .NET, or Python
  • BE/BTech in Computer Science, Engineering, or a relevant field

Languages

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings on the Sequoia Careers page: https://www.sequoiags.com/careers/.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English (advanced), Spanish (native)

Soft Skills

  • Verbal communication skills
  • Social skills