SAPINDEX is a company specialized in providing consulting services, IT solutions, and outsourcing across a wide range of sectors of the computer market. We focus on SAP, and our hiring is among the highest quality in the market.
Location: Colombia (remote with occasional travel).
Duration: Full-time.
Mode: Freelance.
Job Overview: We are looking for an experienced Data Engineer to join our innovative team.
The ideal candidate will be responsible for designing, building, and maintaining secure, scalable, and efficient data pipelines in cloud-based environments, specifically AWS.
You will also work on optimizing workflows, supporting analytics needs, and collaborating with cross-functional teams to deliver impactful solutions.
Key Responsibilities:
Develop, test, and maintain scalable ETL workflows for Big Data processing.
Build and optimize data pipelines and architectures to support business requirements.
Implement and manage AWS services for data lakes and analytics solutions.
Collaborate with engineering and business teams to define data needs and solutions.
Apply best practices for data security, compliance, and governance.
Troubleshoot and optimize processes for reliability and performance.
Requirements:
Bachelor's degree in IT, Computer Science, or a related field.
At least 3 years of experience with ETL development and Big Data tools (e.g., Spark, Python).
Strong knowledge of AWS services (e.g., S3, Lambda, Glue, Redshift).
Experience with Terraform for infrastructure as code (preferred).
Familiarity with SAP-related projects (a plus).
Proficiency in data modeling, governance, and management.
Advanced English skills for communication and collaboration.
Experience in agile development methodologies and tools.
Benefits:
Work on exciting projects with cutting-edge technologies.
Flexibility to work remotely with occasional travel.
A dynamic, multicultural, and collaborative environment.
Opportunities for professional growth and innovation.
If you meet the requirements and are interested in joining our team, you can apply here or send your CV to **** with the subject: I&O Data Engineer