About the Role:
We are seeking a Data Engineer with strong AWS expertise to design, develop, and implement data pipelines and integrations that ingest and process data from various Integrated Workplace Management System (IWMS) platforms into AWS RDS. This role requires expertise in building scalable, secure, and resilient data ingestion solutions using AWS-native services, along with the ability to create custom integration adapters using web services and serverless functions.
The ideal candidate will enforce industry best practices for data auditing, error handling, security (including Row-Level Security), and operational monitoring to ensure production stability and data integrity.
Key Responsibilities:
Design and build data pipelines to extract, transform, and load (ETL) data from IWMS systems (e.g., TRIRIGA, ARCHIBUS, Planon) into AWS RDS
Develop integration adapters using Web Services and custom functions leveraging Java or Python, as needed
Utilize AWS services such as Lambda, Glue, Step Functions, S3, and others for data processing and orchestration
Develop frameworks for batch and near real-time data ingestion, integrating with external and internal systems
Implement industry best practices for data ingestion, auditing, error handling, Row-Level Security (RLS), and operational logging
Ensure data accuracy, consistency, security, and traceability across all data pipelines and processes
Collaborate with business analysts, product teams, and technical stakeholders to define data mapping, transformation logic, and integration specifications
Proactively troubleshoot, optimize, and tune data workflows for performance, scalability, and reliability
Enforce data security, compliance, and governance standards within all integration and data pipeline solutions
Required Skills & Qualifications:
3–7 years of experience as an AWS Data Engineer
Strong hands-on experience with AWS services like Glue, Lambda, RDS, Step Functions, and S3
Proficiency in developing integration solutions using Web Services (REST/SOAP) and building custom adapters using Java or Python
Strong expertise in SQL and Python for data processing, transformation, and workflow orchestration
Experience handling JSON, XML, and API-based integrations
Familiarity with enterprise data integration practices, including auditing, error handling, and enforcing data security policies like RLS
Experience integrating data from IWMS or ERP platforms
Good to Have:
Prior experience with IWMS platforms such as IBM TRIRIGA, ARCHIBUS, or Planon
Familiarity with big data technologies like Spark, Hive, or PySpark
AWS certification (e.g., AWS Certified Data Analytics – Specialty, Solutions Architect, or Developer – Associate)