Skill | Required / Desired | Amount of Experience
At least 3 years of experience building and maintaining ETL/ELT pipelines in enterprise environments using Azure-native tools. | Required | 3 Years
Hands-on expertise with Azure Data Factory, Dataflows, Synapse Pipelines, or similar orchestration tools. | Required | 3 Years
Proficiency in SQL, Python, or PySpark for transformation logic and data-cleansing workflows. | Required | 3 Years
Experience with Delta Lake, Azure Data Lake Storage Gen2, JSON, and Parquet formats. | Required | 3 Years
Ability to build modular, reusable pipeline components using metadata-driven approaches and robust error handling. | Required | 3 Years
Familiarity with public data sources, government transparency datasets, and publishing workflows. | Required | 3 Years
Knowledge of data masking, PII handling, and encryption techniques to manage sensitive data responsibly. | Required | 3 Years
Experience with data quality frameworks, including automated validation, logging, and data reconciliation methods. | Required | 3 Years
Strong grasp of DevOps/DataOps practices, including versioning, testing, and CI/CD for data pipelines. | Required | 3 Years
Experience supporting data publishing for oversight, regulatory, or open data initiatives. | Highly desired | 3 Years
Certifications such as DP-203 (Azure Data Engineer Associate) or Azure Solutions Architect. | Highly desired | 3 Years