Azure Data Factory (ADF) / Synapse Developer

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

Bachelor's or master's degree in computer science or a related field; 6-10 years of experience in data engineering or software development.

Key responsibilities:

  • Understand business requirements and provide input from a data perspective
  • Build pipelines, implement security frameworks, integrate data sources
  • Own the Data Integration pipeline, implement standards, design failover
  • Prepare Design documents and Unit Test plans, participate in code reviews
  • Follow Agile methodology, work with big data technologies
AT&T · Telecommunication Services · 10001 Employees · https://www.att.com/

Job description

Roles & Responsibilities:
- Understand business requirements and actively provide input from a data perspective.
- Understand the underlying data and how it flows through the systems.
- Build simple to complex pipelines & dataflows.
- Implement modules that include security and authorization frameworks.
- Recognize and adapt to changes in processes as the project evolves in size and function.

- Own the Data Integration pipeline.
- Establish data integration standards and implement them.
- Build dataflows and workflows with job failover designed in (a minimal pipeline sketch follows this list).
- Build reusable assets and framework components.
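For illustration only, the pipeline-building work above can be scripted with the azure-mgmt-datafactory Python SDK; the sketch below follows the standard quickstart pattern for a single Copy activity. All subscription, resource-group, factory, dataset, and pipeline names are placeholder assumptions, not details from this role.

```python
# Minimal sketch (quickstart-style): define and run a simple Copy pipeline in
# Azure Data Factory. All names below are placeholders, not details of this posting.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink
)

subscription_id = "<subscription-id>"   # placeholder
rg_name = "rg-data-platform"            # placeholder resource group
factory_name = "adf-demo"               # placeholder data factory

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# One Copy activity moving data between two pre-defined datasets.
copy_activity = CopyActivity(
    name="CopyRawToStaging",
    inputs=[DatasetReference(type="DatasetReference", reference_name="ds_raw")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="ds_staging")],
    source=BlobSource(),
    sink=BlobSink(),
)

pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(rg_name, factory_name, "pl_copy_raw", pipeline)

# Trigger a run; in practice this would be wired to a Schedule or Event trigger.
run = adf_client.pipelines.create_run(rg_name, factory_name, "pl_copy_raw", parameters={})
print(run.run_id)
```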

Knowledge, Skills & Abilities: 
- Expert-level knowledge of Azure Data Factory.
- Advanced knowledge of Azure SQL DB, Synapse Analytics, Power BI, T-SQL, Logic Apps, and Function Apps.
- Ability to analyze and understand complex data.

- Monitoring day-to-day Data Factory pipeline activity.
- Designing, configuring, and managing pipelines to orchestrate data workflows.
- Implementing different types of activities such as Copy Activity, Data Flow, Databricks Activity, and Control Flow activities.
- Connecting to and integrating on-premises data sources using the Self-hosted Integration Runtime.
- Setting up and managing triggers (Schedule, Event, Manual) to automate pipeline executions.
- Configuring linked services to connect to various data stores and defining datasets for data structures.
- Knowledge of Azure Data Lake is required; Azure services such as Analysis Services, SQL Database, Azure DevOps, and CI/CD are a must.
- Knowledge of master data management, data warehousing, and business intelligence architecture.
- Experience in data modeling and database design, with excellent knowledge of SQL Server best practices.
- Excellent interpersonal/communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision.
- Clear understanding of the DW lifecycle and ability to contribute to Design documents, Unit Test plans, and code review reports.
- Experience working in an Agile environment (Scrum, Lean, Kanban) is a plus.
- Knowledge of big data technologies: Spark framework, NoSQL, Azure Databricks, Python, Snowflake; working knowledge of Jupyter Notebooks and R programming.

- Knowledge of various file systems and the ability to recommend one based on the design.
- MPP design experience and the ability to recommend designs for optimal cluster utilization.
- Expert in Python and PySpark (a short example follows this list).
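As a small illustration of the Python/PySpark expectation above, the sketch below reads raw data from a data-lake path, applies a basic transformation, and writes a partitioned, curated copy. The storage paths, dataset, and column names are illustrative assumptions only.

```python
# Minimal PySpark sketch: raw -> curated with de-duplication and partitioning.
# Paths and column names are placeholders, not details from this posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

# Read raw parquet files from an Azure Data Lake (ADLS Gen2) container.
raw = spark.read.parquet("abfss://raw@datalakeacct.dfs.core.windows.net/orders/")

curated = (
    raw.dropDuplicates(["order_id"])                      # basic de-duplication
       .withColumn("order_date", F.to_date("order_ts"))   # derive a partition column
       .filter(F.col("amount") > 0)                       # drop invalid rows
)

# Write a partitioned, curated copy back to the lake.
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@datalakeacct.dfs.core.windows.net/orders/"))
```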


Qualifications & Experience:
- Bachelor's or master's degree in computer science or a related field.
- 6-10 years of data engineering or software development experience.

Weekly Hours: 40

Time Type: Regular

Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities.

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry: Telecommunication Services