Qualifications: Bachelor's Degree/Diploma in Computer Science, Information Technology, or a related field; 5-7 years of experience delivering big data solutions; proven experience with agile methodologies in multi-national environments; and hands-on experience with PHP7, Python 3, and Big Data technologies such as Hadoop and Spark.
Key responsibilities:
Oversee the delivery and quality of backend development projects handling large volumes of data.
Collaborate with data science and merchant integration teams to create data pipelines.
Develop, test, and deploy code for a product used by millions in South East Asia.
Maintain high standards for data ingestion and processing pipelines, ensuring reliability and performance.
ABOUT THE COMPANY
We focus on the areas of Outsourcing, Talent Acquisition and Talent Development. Aisling is committed to providing our candidate community with access to better future opportunities and the precise skills that enable them to build their experience and professional networks. Our customers can rely on our consistent focus on delivering innovative solutions that enhance customer experience and impact business results.
RESPONSIBILITIES
• Take full responsibility for the delivery and quality of backend development projects, handling TBs of data every day.
• Collaborate closely with the data science and merchant integration teams to translate requirements, integrations and algorithms into data pipelines.
• Develop, test and deploy code that is part of a product used by tens of millions of people across South East Asia every month.
• Maintain a high standard of quality for our data ingestion and processing pipelines, reliably handling data in many different formats and structures.
• Drive continuous improvement of our programming and design processes and techniques.
• Quickly debug and analyze issues that affect the reliability and performance of our catalog update and scraping process.
• Deliver incremental results live on our site on a weekly basis with the support of a cross-functional agile team.
REQUIREMENTS
• A Bachelor's Degree/Diploma in Computer Science, Information Technology or a related subject
• 5-7 years of experience delivering big data solutions on tight schedules in a company known for best-in-class products
• Proven experience working in a multi-national environment with agile methodologies (SCRUM, CI/CD, TDD, LESS)
• Hands-on experience building applications using PHP7, Python 3 and libraries from the related ecosystems.
• Hands-on experience in Big Data software development technologies (e.g., Hadoop, Hive, Spark, Kafka) and exposure to resource/cluster management technologies.
• Experience with Big Data design, ETL (Extract, Transform, Load), and architecting efficient software designs for Big Data platforms.
• Deep experience with the latest NoSQL database technologies such as Elasticsearch and Cassandra.
• Familiar with modern architectural patterns of highly scalable systems like horizontal scaling and queueing systems
• Unmatched attention to detail when optimizing the performance of backend processes and services, with no tolerance for even the smallest annoyances.
• Expert in Amazon Web Services, container technologies (Docker) and deployment frameworks (Terraform, Ansible)
• Familiar with best practices in development lifecycles, source control tools like Git, and the Linux command line interface.
Required profile
Experience
Industry: Human Resources, Staffing & Recruiting
Spoken language(s): English