Data Analyst 3 (REMOTE)

Remote: Full Remote

Offer summary

Qualifications:

  • Minimum of 3 years of experience building and maintaining ETL/ELT pipelines using Azure-native tools.
  • Proficiency in SQL, Python, or PySpark for data transformation and cleansing.
  • Experience with Azure Data Factory, Dataflows, and Synapse Pipelines.
  • Knowledge of data quality frameworks and responsible data handling techniques.

Key responsibilities:

  • Support the design and delivery of secure and scalable data pipelines in Azure.
  • Build ingestion workflows and transformation logic for reporting and analytics.
  • Collaborate with a software development team in an Agile/Scrum environment.
  • Ensure data quality and auditability while managing sensitive datasets.

Serigor Inc Information Technology & Services SME https://www.serigor.com/

Job description

Job Title: Data Analyst 3 (REMOTE)
Location: Richmond, VA
Duration: 12+ Months

Job Description:
The client is seeking a skilled data engineer with a minimum of 3 years’ experience to work with an existing software development team on an ongoing project.
  • This position will support the design and delivery of secure, scalable, and traceable data pipelines in Azure, with a focus on integrating and transforming sensitive, public-facing datasets.
  • The candidate will build ingestion workflows and transformation logic to support reporting, analytics, and public transparency initiatives.
  • This role requires experience with Azure-native tools, strong ETL/ELT capabilities, and a strong commitment to data quality, auditability, and responsible data use.
  • The candidate will be working with an existing software development team in an Agile/Scrum environment.

Skills:
Skill | Required / Desired | Years of Experience
At least 3 years of experience building and maintaining ETL/ELT pipelines in enterprise environments using Azure-native tools. | Required | 3 Years
Hands-on expertise with Azure Data Factory, Dataflows, Synapse Pipelines, or similar orchestration tools. | Required | 3 Years
Proficiency in SQL, Python, or PySpark for transformation logic and data cleansing workflows. | Required | 3 Years
Experience with Delta Lake, Azure Data Lake Storage Gen2, JSON, and Parquet formats. | Required | 3 Years
Ability to build modular, reusable pipeline components using metadata-driven approaches and robust error handling. | Required | 3 Years
Familiarity with public data sources, government transparency datasets, and publishing workflows. | Required | 3 Years
Knowledge of data masking, PII handling, and encryption techniques to manage sensitive data responsibly. | Required | 3 Years
Experience with data quality frameworks, including automated validation, logging, and data reconciliation methods. | Required | 3 Years
Strong grasp of DevOps/DataOps practices, including versioning, testing, and CI/CD for data pipelines. | Required | 3 Years
Experience supporting data publishing for oversight, regulatory, or open data initiatives is highly desirable. | Required | 3 Years
Certifications such as DP-203 (Azure Data Engineer Associate) or Azure Solutions Architect are a plus. | Highly desired | 3 Years

Required profile

Experience

Industry :
Information Technology & Services
Spoken language(s):
English

Other Skills

  • Teamwork
  • Communication
