Sr. Data Engineer

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 5+ years managing various data platforms
  • 7+ years working with ETL tools
  • Bachelor's or Master's degree in a related field
  • Excellent proficiency in Python and SQL

Key responsibilities:

  • Develop and optimize robust data pipelines
  • Collaborate with partners to refine dimensional models

New American Funding

Job description

Overview:

Position: Sr. Data Warehouse Engineer

Location: Remote - USA

Compensation: $145K-$160K, DOE

Position Summary: 

We are seeking a highly motivated, hands-on Senior Data Engineer to lead the design of complex, scalable cloud data warehouse and big data solutions. The successful candidate will be responsible for managing the entire data engineering lifecycle, from requirements gathering and design through implementation and business partner signoff. This includes developing robust data pipelines, architectures, and datasets to support our data-driven decision-making process. The role also involves implementing data validation and quality checks to ensure accuracy and consistency. Experience in developing dimensional models and collaborating with business partners is essential.

 

Responsibilities:

  • Develop, maintain, and optimize robust data pipelines across multiple data platforms using a variety of tools to ensure efficient data flow and accessibility for analytics and reporting purposes.
  • Collaborate with business partners to develop and refine dimensional models, ensuring they effectively deliver the analytics necessary for informed, data-driven decision-making.
  • Deliver high-quality code by implementing comprehensive data validation and quality checks to ensure data accuracy, consistency, and reliability across all datasets. Establish monitoring and alerting mechanisms to proactively identify and address post-implementation quality issues, including timeliness, accuracy, completeness, and validity.
  • Engage promptly and professionally with cross-functional teams—including Data Analytics, Product, Engineering, and Business Partners—to gather requirements, provide insights, and ensure solutions align with organizational objectives.
  • Actively stay abreast of the latest industry trends and technologies, applying and demonstrating this knowledge to drive continuous improvement and maintain a competitive edge.
Qualifications:

  • 5+ years of experience managing and utilizing various data platforms, including Databricks, Snowflake, BigQuery, and Microsoft Fabric/Synapse, with particular emphasis on Databricks and Snowflake.
  • 7+ years working with ETL tools such as Fivetran, Azure Data Factory, Microsoft Fabric, and Apache Airflow, ensuring efficient data extraction, transformation, and loading processes.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
  • Excellent proficiency in programming languages such as Python and SQL.
  • Demonstrated expertise in ingesting, parsing, and analyzing unstructured and semi-structured data from various sources, including but not limited to Cosmos DB. Proficient in implementing best practices to transform such data into structured formats, ensuring its availability and usability for advanced analytics and data-driven decision-making.
  • Demonstrated experience with Continuous Integration and Continuous Deployment (CI/CD) workflows utilizing Azure DevOps. Proficient in implementing best practices, including committing code frequently, maintaining green builds, automating tests, and enforcing deployment through the pipeline, to ensure efficient and reliable software delivery.
  • Demonstrated experience in data engineering within Agile frameworks, actively participating in backlog refinement, grooming, and sprint planning sessions to estimate and prioritize work efforts effectively. Proficient in using Agile practices to enhance team collaboration and project delivery.
  • Excellent verbal and written communication and collaboration skills.

 

Preferred Qualifications:

  • Snowflake and/or Databricks Certifications
  • AWS and/or Microsoft certifications.
  • Familiarity with the mortgage space.

 

Work Authorization: Must be able to verify identity and employment eligibility to work in the U.S.

 


Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
  • Problem Solving
