
BI Engineer I (Remote)

Job description

Job Summary:

The BI Engineer I is an entry‑to‑mid‑level data professional focused on ingesting, validating, and transforming data within Microsoft Fabric to support reliable downstream analytics. This role emphasizes building and maintaining robust data pipelines, ensuring data quality and consistency, and implementing foundational data models that enable scalable reporting and automation.

Duties and Responsibilities

  • Develop PySpark notebooks within Microsoft Fabric.
  • Build, monitor, and support data pipelines and Dataflow Gen2 processes.
  • Write and optimize SQL queries to support data transformations and analytics.
  • Manage environment variables and configuration settings across development environments.
  • Implement and maintain data quality rules, monitoring, and exception handling across bronze, silver, and gold data layers.
  • Maintain documentation of data pipelines, from ingest to gold layer.
  • Prepare semantic models for reporting.
  • Implement role-based access controls and data security models across Lakehouses and semantic models.
  • Collaborate with team members using Azure DevOps and/or GitHub for version control and deployment.
  • Validate data quality/accuracy and assist in troubleshooting data issues.
  • Follow established data engineering standards and best practices.
  • Perform other duties as assigned.
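As an illustration of the bronze-to-silver quality gate described in the duties above, here is a minimal sketch in plain Python (in Fabric this logic would typically live in a PySpark notebook); the rule name, fields, and sample rows are hypothetical:

```python
# Hypothetical bronze-to-silver quality gate: rows that fail a rule are
# routed to an exceptions list instead of being silently dropped, so they
# can be monitored and reprocessed later.

def require_fields(row, fields):
    """Rule: every listed field must be present and non-empty."""
    return all(row.get(f) not in (None, "") for f in fields)

def promote_to_silver(bronze_rows, required_fields):
    """Split bronze rows into silver (clean) and exceptions (failed rules)."""
    silver, exceptions = [], []
    for row in bronze_rows:
        if require_fields(row, required_fields):
            silver.append(row)
        else:
            exceptions.append({"row": row, "rule": "required_fields"})
    return silver, exceptions

bronze = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},  # fails the rule -> routed to exceptions
]
silver, exceptions = promote_to_silver(bronze, ["id", "amount"])
```

The same shape carries over to PySpark: a rule becomes a filter predicate, and the exceptions collection becomes a quarantine table that monitoring and alerting can watch.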

Education and Experience: 

  • 1–3 years of experience in data engineering, data analytics, business intelligence, or other data-related roles.
  • Exposure to Microsoft Fabric, Azure, and Lakehouses or similar cloud data platforms.
  • Experience with PySpark or Apache Spark concepts.
  • Understanding of CI/CD principles for data or software projects.
  • Experience in SaaS, GovTech, or technology-enabled services organizations.

Knowledge, Skills and Abilities:

  • Deliver high data quality with low reporting error rates.
  • Deliver reports and dashboards accurately and on time.
  • Reduce the manual effort required to generate reporting.
  • Improve data accessibility and transparency across the business.
  • Skilled with Python and SQL.
  • Understanding of data engineering concepts (ETL/ELT, data pipelines, data modeling).
  • Familiarity with dimensional data modeling concepts.
  • Familiarity with Git-based source control (Azure DevOps or GitHub preferred).
  • Strong problem-solving skills and attention to detail.


Work Environment:

  • Office setting with a moderate noise level.
  • The employee will work at an individual workstation, using a telephone and computer.


Physical Demands:

  • Must be able to remain seated for extended periods.
  • Regular use of a computer and other office machinery, such as printers and copy machines.
  • Occasional movement around the office.
  • Frequent communication via telephone.

 
Neumo Summary:

Backed by four decades of public sector expertise and corporate capability, Neumo has a strong record of supporting government services. Neumo has been recognized for four consecutive years as a GovTech 100 Company, a list of the top 100 companies focused on making a difference in, and selling to, state and local government agencies across the United States.

Neumo is committed to helping communities thrive and brings a wealth of experience combined with innovation. Today, Neumo offers more administrative and financial support to government officials than any other organization. And with a responsive, client-focused approach, we foster partnerships that give our customers the certainty they need to accomplish more.

Neumo offers a competitive benefits and compensation package and is looking for team members who will thrive in our dynamic environment.

Neumo is an Equal Opportunity Employer. Selection for a position will be made without regard to race, religion, national origin, sex, political affiliation, marital status, non-disqualifying physical handicap, and age.
