Azure Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related field.
  • 4+ years of hands-on experience in data engineering with Azure cloud services and advanced Databricks.
  • Strong analytical and problem-solving skills in handling large-scale data pipelines.
  • Expertise in designing and implementing data pipelines for ETL workflows.

Key responsibilities:

  • Design, implement, and optimize data workflows using Azure Databricks and Azure Data Factory.
  • Develop and optimize PySpark code for efficient data transformations.
  • Design and implement automated CI/CD pipelines for data workflows using tools like Jenkins or Azure DevOps.
  • Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions.

Awign Information Technology & Services Scaleup https://www.awign.com/
201 - 500 Employees

Job description

Role: Azure Data Engineer

Experience: 4 to 8 years
Location: Remote
Shift timings: 11am - 8pm (IST)


Mandatory Skills: Azure Cloud Technologies, Azure Data Factory, Azure Databricks (Advanced Knowledge), PySpark, CI/CD Pipelines (Jenkins, GitLab CI/CD, or Azure DevOps), Data Ingestion, SQL

Seeking a skilled Data Engineer with expertise in Azure cloud technologies, data pipelines, and big data processing. The ideal candidate will be responsible for designing, developing, and optimizing scalable data solutions.

Responsibilities

1. Azure Databricks and Azure Data Factory Expertise:
 Demonstrate proficiency in designing, implementing, and optimizing data workflows using Azure Databricks and Azure Data Factory.
 Provide expertise in configuring and managing data pipelines within the Azure cloud environment.

2. PySpark Proficiency:
 Possess a strong command of PySpark for data processing and analysis.
 Develop and optimize PySpark code to ensure efficient and scalable data transformations.
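
For illustration only, a minimal sketch of the kind of PySpark transformation work this covers (the table names, columns, and business logic below are assumptions, not part of the role):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-transform").getOrCreate()

# Hypothetical source table and columns, used purely for illustration.
orders = spark.read.table("raw.orders")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Delta is the default table format on Databricks.
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```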

3. Big Data & CI/CD Experience:
 Troubleshoot and optimize data processing tasks on large datasets.
 Design and implement automated CI/CD pipelines for data workflows, using tools such as Jenkins, GitLab CI/CD, or Azure DevOps to automate the building, testing, and deployment of data pipelines.

4. Data Pipeline Development & Deployment:
 Design, implement, and maintain end-to-end data pipelines for various data sources and destinations.
 Build automated testing into every pipeline: unit tests for individual components, integration tests to confirm that components work together correctly, and end-to-end tests to verify the entire pipeline's functionality (a minimal test sketch follows this section).
 Demonstrate familiarity with GitHub repositories for code deployment.
 Ensure data quality, integrity, and reliability throughout the entire data pipeline.
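
As an example of the testing expectation above, a minimal pytest-style unit test for a single transformation; the function and schema are hypothetical:

```python
# test_transformations.py -- minimal pytest sketch; the function and schema are hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_order_date(df):
    """Transformation under test: derive a date column from a timestamp string."""
    return df.withColumn("order_date", F.to_date("order_timestamp"))


@pytest.fixture(scope="session")
def spark():
    # Local Spark session so the test runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_add_order_date(spark):
    df = spark.createDataFrame([("2024-01-15 10:30:00",)], ["order_timestamp"])
    result = add_order_date(df).select("order_date").first()[0]
    assert str(result) == "2024-01-15"
```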

5. Extraction, Ingestion, and Consumption Frameworks:
 Develop frameworks for efficient data extraction, ingestion, and consumption.
 Implement best practices for data integration and ensure seamless data flow across the organization.
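
A rough sketch of what a reusable ingestion helper in such a framework might look like (the paths, formats, and table names are assumptions):

```python
from pyspark.sql import SparkSession, DataFrame, functions as F

spark = SparkSession.builder.appName("ingestion-framework").getOrCreate()


def ingest(source_path: str, source_format: str, target_table: str) -> DataFrame:
    """Read a source dataset, stamp it with load metadata, and append it to a raw Delta table."""
    df = (
        spark.read.format(source_format)
        .option("header", "true")  # applies to CSV sources; ignored by other formats
        .load(source_path)
        .withColumn("_ingested_at", F.current_timestamp())
        .withColumn("_source_file", F.input_file_name())
    )
    df.write.format("delta").mode("append").saveAsTable(target_table)
    return df


# Example call (the storage account and container are placeholders):
# ingest("abfss://landing@<account>.dfs.core.windows.net/orders/", "csv", "bronze.orders")
```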

6. Collaboration and Communication:
 Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions.
 Communicate effectively with stakeholders to gather and clarify data-related requirements.



Requirements
1. Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related field.
2. 4+ years of relevant hands-on experience in data engineering with Azure cloud services and advanced Databricks.
3. Strong analytical and problem-solving skills in handling large-scale data pipelines.
4. Experience in big data processing and working with structured & unstructured datasets.
5. Expertise in designing and implementing data pipelines for ETL workflows.
6. Strong proficiency in writing optimized queries and working with relational databases.
7. Experience in developing data transformation scripts and managing big data processing using PySpark.

Required profile

Experience

Industry:
Information Technology & Services
Spoken language(s):
English

Other Skills

  • Problem Solving
  • Collaboration
  • Communication
  • Analytical Skills
