Azure Data Engineer

Work set-up: Full Remote

Offer summary

Qualifications:

  • Proven experience with Azure Databricks, PySpark, and Python.
  • Knowledge of data lakes, data warehousing, and data modeling on Azure.
  • Hands-on experience with RDBMS platforms such as MySQL, MS SQL Server, or Oracle.
  • Familiarity with developing data pipelines and ETLs, and with extracting data from APIs and cloud services.

Key responsibilities:

  • Develop and maintain big data pipelines using Azure and open-source tools.
  • Manage data lake, delta lake, and data warehousing solutions on Azure.
  • Create and optimize data processing pipelines and ETLs with PySpark and Databricks (see the sketch after this list).
  • Work with Azure Synapse SQL Data Warehouse to present data securely and build data models.
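
For illustration, a minimal sketch of the kind of PySpark/Databricks pipeline these responsibilities describe, assuming a Databricks workspace with a Spark session available and Delta Lake as the storage format; the lake paths, column names, and table name below are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical raw-zone path in the data lake (ADLS Gen2).
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"

# Extract: read raw JSON files landed by an ingestion job.
orders = spark.read.json(raw_path)

# Transform: basic cleansing and a derived partition column.
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write to a Delta table in the curated zone, partitioned by date.
# Assumes a `curated` schema already exists in the metastore.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("curated.orders")
)
```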

Stratonik (https://www.stratonik.com)
11 - 50 Employees

Job description

Primary Skill Set: Azure Databricks, PySpark & Python, Azure Data Factory, Data Lake, Synapse SQL DWH
Preferred skills: DevOps, OLAP, Data Warehousing, Data Modeling
Requirements:
  • Hands-on experience with MySQL, MS SQL Server, Oracle, or a similar RDBMS platform.
  • Experience developing big data pipelines using Azure and open-source big data tools and technologies.
  • Knowledge of data lake, delta lake, and data warehousing solutions on Azure.
  • Experience extracting data from APIs and cloud services (Salesforce, Eloqua, S3); see the sketch after this list.
  • Solid understanding of creating data processing pipelines and ETLs using PySpark and Databricks.
  • Prior experience using Synapse SQL DWH to present data securely and to build and manage data models.
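
As a rough illustration of the API extraction mentioned above, the sketch below pulls paginated JSON from a generic REST endpoint with requests and stages it in the data lake as Delta. The endpoint URL, token, pagination scheme, and paths are hypothetical placeholders, not the real Salesforce or Eloqua APIs, which have their own auth flows and pagination rules.

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical REST endpoint and bearer token.
API_URL = "https://api.example.com/v1/contacts"
TOKEN = "<access-token>"

def fetch_page(url: str, params: dict) -> dict:
    """Fetch one page of results and return the parsed JSON body."""
    resp = requests.get(
        url,
        params=params,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Simple offset-based pagination loop; stop when a page comes back empty.
records, offset, page_size = [], 0, 500
while True:
    page = fetch_page(API_URL, {"offset": offset, "limit": page_size})
    items = page.get("items", [])
    if not items:
        break
    records.extend(items)
    offset += page_size

# Convert the collected records to a Spark DataFrame and stage them as Delta
# in the raw zone of the lake (path is hypothetical).
df = spark.createDataFrame(records)
df.write.format("delta").mode("append").save(
    "abfss://raw@examplelake.dfs.core.windows.net/contacts/"
)
```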

Required profile

Experience

Spoken language(s):
English
