
Senior Data Engineer (Remote Options)

Remote: Full Remote
Contract: 
Experience: Senior (5-10 years)
Work from: Massachusetts, United States

Offer summary

Qualifications:

  • Bachelor's Degree
  • 5 years of relevant experience
  • Familiarity with cloud platforms (GCP, AWS)
  • Experience with geospatial and time-series data
  • Understanding of SDLC and Agile methodology

Key responsibilities:

  • Design and develop ETL processes
  • Ensure efficient data integration and quality
  • Automate ETL workflows for efficiency
  • Collaborate to create and troubleshoot data solutions
  • Optimize data storage for performance and scalability
Trinnex https://www.trinnex.io
51 - 200 Employees

Job description

Why Trinnex?

If you are passionate about water and technology, Trinnex is the place for you! Trinnex is a visionary company that is transforming the way water resources are managed and protected. By combining cutting-edge digital technologies, such as sensor/IoT data, models, geospatial data, and AI/machine learning, we create innovative, smart, and scalable solutions that make a difference. Whether it's optimizing water supply and demand, detecting leaks and anomalies, or enhancing water quality and resilience, Trinnex delivers value and impact to public sector clients across the country.


Job Description

Trinnex is seeking a Senior Data Engineer to join our growing Digital Engineering team. Trinnex is building next-generation tools that integrate sensor/IoT data, models, geospatial data, and machine learning to solve unique engineering and environmental issues.


This role requires a deep understanding of various data sources, data modeling, ETL processes, and scripting. You will work on integrating client data sources and enterprise applications with existing Trinnex products and new custom-developed solutions. Example data sources include ESRI GIS, Customer Information Systems (CIS), CMMS, IoT, relational databases, and other data systems. Ideally, you should have extensive experience in ETL development, data management, and the use of APIs.


Responsibilities in this role include:

• Designing and Developing ETL Processes: Creating ETL workflows that extract data from source systems, transform it into a suitable format, and load it into target systems such as data warehouses.

• Data Integration: Ensuring the smooth and efficient transfer of data between different systems, consolidating data from multiple sources, and maintaining data quality.

• Automation: Automating ETL workflows to save time and reduce errors, enabling faster data integration and analysis.

• Data Quality and Consistency: Ensuring that data is complete, accurate, and consistent throughout the ETL process.

• Collaboration: Working with other data professionals to design and implement data models, troubleshoot data integration issues, and ensure data quality.

• Implement software solutions for custom projects, and ensure that development documentation for applications is complete.

• Collaborate with software developers, data scientists, and other stakeholders to understand data requirements and deliver high-quality data solutions.

• Optimize data storage solutions, including data warehouses and data lakes, for performance and scalability.

• Stay up to date with the latest industry trends and technologies in data architecture and recommend best practices.

• Support creativity, efficient decision making and elegant code. Write and review clean code. Generate reusable code libraries.

• Liaise with developers, designers, and DevOps to identify new features, and review code and deliverables.

• Perform other duties as required.
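As an illustration only (not part of the posting), the extract-transform-load pattern described in the responsibilities above can be sketched minimally in Python. All names here are hypothetical: the source is an in-memory list standing in for a sensor feed, and an SQLite table stands in for the target warehouse.

```python
import sqlite3

def extract(source_rows):
    # Extract: pull raw records from a source system
    # (here, an in-memory list standing in for an IoT feed).
    return list(source_rows)

def transform(rows):
    # Transform: normalize field names and convert gallons to liters,
    # putting the data into a format suitable for the target system.
    return [
        {"sensor_id": r["id"], "liters": round(r["gallons"] * 3.78541, 2)}
        for r in rows
    ]

def load(rows, conn):
    # Load: write transformed rows into a target table
    # (a stand-in for a data warehouse).
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor_id TEXT, liters REAL)")
    conn.executemany(
        "INSERT INTO readings (sensor_id, liters) VALUES (:sensor_id, :liters)",
        rows,
    )
    conn.commit()

def run_etl(source_rows, conn):
    # Chaining the three stages is the workflow that automation
    # (e.g. a scheduler) would invoke repeatedly.
    load(transform(extract(source_rows)), conn)

raw = [{"id": "pump-1", "gallons": 10.0}, {"id": "pump-2", "gallons": 2.5}]
conn = sqlite3.connect(":memory:")
run_etl(raw, conn)
result = conn.execute(
    "SELECT sensor_id, liters FROM readings ORDER BY sensor_id"
).fetchall()
```

In a production pipeline each stage would also carry the quality checks the posting mentions (completeness, accuracy, consistency) before the load step commits.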


Minimum Qualifications

• Bachelor's Degree.

• 5 years of relevant experience.

• Equivalent additional directly related experience will be considered in lieu of a degree.


Preferred Qualifications

• Familiarity with cloud platforms (specifically GCP, plus Azure or AWS) and Kubernetes cluster environments.

• Experience with geospatial and time-series data.


Skills and Abilities

• Demonstrated understanding of complete SDLC and Agile methodology.

• Experience with software development tools, specifically Git, project management tools, CI/CD pipelines.

• Python and other relevant programming languages: Using languages to develop, test, and deploy data applications and systems.

• PostgreSQL, MongoDB, and other databases: Using databases to store, manage, and query data.

• Cloud platforms (preferably GCP): Using platforms to host, scale, and secure data solutions.

• Must be able to balance management skills with technical work, including code-writing and code review.

• Experience with Cloud ETL tools and deployment.

• Excellent problem solving and research skills.

• Excellent verbal and written communication and collaboration skills for leading discussions and meetings with team members, users, and external stakeholders.

• Ability to translate client requirements into detailed specifications.

• Excellent attention to detail.

• Excellent interpersonal, presentation, and leadership skills to cultivate strategic relationships with colleagues, customers, and partners.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s):
English

Other Skills

  • Collaboration
  • Communication
  • Social Skills
  • Leadership
  • Detail Oriented
  • Problem Solving
