As a Data Engineer, you will be responsible for designing, developing, maintaining, and optimising data pipeline infrastructure on a proprietary data platform built on Databricks. You will collaborate with cross-functional teams to design and implement scalable data solutions, ensuring efficient data ingestion, transformation, storage, and analysis.
Data Engineering Ownership: End-to-end creation and upkeep of robust data workflows and components, leveraging tools such as Databricks. This includes designing early-stage prototypes and deploying large-scale data acquisition, handling, and storage strategies.
ETL/ELT Workflow Management: Construct and refine data ingestion pipelines to seamlessly integrate varied data sources into the central platform. Ensure data consistency through validation, cleaning, and enrichment routines.
Stakeholder Collaboration: Partner with commercial teams and data stakeholders to gather and refine specifications for analytical products and visual reporting needs.
Architectural Design & Data Modelling: Engage with analysts and data experts to define efficient data models and architectural plans. Support dashboard and report development, while ensuring optimal data structuring for performance.
Pipeline Performance & Reliability: Continuously assess and fine-tune data workflows, addressing system inefficiencies, integration problems, and data fidelity concerns.
Quality Assurance in Data: Define expectations for data integrity and collaborate with QA to automate validation checks. Utilise monitoring tools to surface and track data quality metrics.
Data Controls & Compliance: Apply internal governance standards and safeguard sensitive data through access rules, encryption protocols, and retention strategies, aligning with organisational security frameworks.
Technical Enablement & Knowledge Sharing: Work closely with interdisciplinary teams to ensure data needs are met, while thoroughly documenting processes and solutions. Convey technical ideas in a way that’s accessible across departments.
Innovation & Process Evolution: Keep informed on developments within the data ecosystem. Advocate for and implement improvements in efficiency, tooling, and automation. Participate in internal knowledge communities.
Agile Delivery Engagement: Contribute to delivery efforts by actively participating in sprint planning, stand-ups, and backlog refinement. Help define and deliver technical tasks, aligning work with CI/CD best practices.
WHAT WE ARE LOOKING FOR:
- ATTENTION! THIS POSITION IS FOR CANDIDATES BASED IN PORTUGAL OR BRAZIL ONLY