
Azure Data Integration Expert

Remote: Full Remote
Contract:
Experience: Senior (5-10 years)
Work from:

Offer summary

Qualifications:

  • 6+ years of experience in data pipelines
  • Proficient in ETL tools like Informatica
  • Experience with Azure services: Databricks, Data Factory
  • Deep knowledge of data architecture and modelling
  • BE/B.Tech in Computer Science or a relevant field

Key responsibilities:

  • Lead data extraction, transformation, and load processes
  • Design and develop complex data models
  • Create efficient batch processing solutions
  • Manage data integration and migration projects
  • Review and cleanse data quality, profiling as needed
Sequoia Connect (Scaleup), 11-50 employees
https://www.sequoia-connect.com/

Job description

This vacancy comes from the Talenteca.com job board.

Vacancy for the company Sequoia Connect in Benito Juárez, Mexico City

Our client is a fast-growing automation-led next-generation service provider delivering excellence in IT, BPO, and consulting services. They are driven by a combination of robust strategies, passionate teams, and a global culture rooted in innovation and automation.

Our client’s Digital offerings have helped clients achieve operational excellence and customer delight. Their focus lies on taking a leadership position in helping clients attain customer intimacy as their competitive advantage. They are now on a journey of transforming the experiences of their customers’ customers by leveraging their industry-leading delivery and execution model, built around the strategy: Automate Everything, Cloudify Everything, Transform Customer Experiences.

Powering our client’s complex technology solutions and services is Bottom-Up Disruption, a crowdsourcing initiative that brings innovation and improvement to everyday complexities and, ultimately, grows the client’s business. The digitally empowered workforce of our client represents various nationalities, comprising 19,833 employees, and lives the company’s philosophy of ‘customer success, first and always’. The company reported a 2020 global revenue of USD $845.04 Mn.

We are currently searching for an Azure Data Integration Expert:

Responsibilities

  • Leads the delivery of data extraction, transformation, and load processes from disparate sources into a form consumable by analytics, for projects of moderate complexity, applying strong technical capabilities and a sense of database performance
  • Designs, develops and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model depending on the requirement
  • Batch Processing - Capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period
  • Data Integration (Sourcing, Storage and Migration) - Capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.). This includes the data models, storage requirements, and migration of data from one system to another
  • Data Quality, Profiling and Cleansing - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate the data
  • Stream Systems - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality
  • Excellent interpersonal skills to build a network with a variety of departments across the business to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects
  • Understands the difference between on-prem and cloud-based data integration technologies

Requirements

  • 6+ years’ experience developing large-scale data pipelines in a cloud/on-prem environment
  • Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend
  • Experience in Azure: Databricks, Data Factory, Azure Integration, etc.
  • Deep knowledge of Data Warehouse/Data Mart architecture and modelling
  • Experience defining and developing data ingestion, validation, and transformation pipelines
  • Deep knowledge of distributed data processing and storage
  • Deep knowledge of working with structured, unstructured, and semi-structured data
  • Working experience with ETL/ELT patterns
  • Extensive experience in the application of analytics, insights and data mining to commercial “real-world” problems
  • Technical experience in at least one programming language, preferably Java, .Net, or Python
  • BE/B.Tech in Computer Science, Engineering, or a relevant field

Languages

  • Advanced Oral English.
  • Native Spanish.

Note

  • Fully remote

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings | Sequoia Career’s Page: *

Desired Education Level

Higher education (degree holder)

Desired Experience Level

Expert Level

Departmental Function

Technology / Internet

Industry

Software Development / Programming

Skills

  • ETL & ELT
  • .Net
  • DataStage
  • SSIS
  • Python


https://www.talenteca.com/anuncio?j_id=66e9c3a82300005300a5305d&source=linkedin

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry:
Spoken language(s): English, Spanish

Other Skills

  • Problem Solving
  • Analytical Thinking
  • Social Skills
