Globaldev Group provides end-to-end software development services and builds skillful teams of specialists to fuel your business’ growth. Leveraging our extensive 12-year background and deep industry knowledge, we have established ourselves as a reliable partner for numerous enterprises, SMEs, and startups.
▪️ Global Teams ▪️
Globaldev offers R&D team extensions to both emerging technology companies and global corporations. Empowered by our global talent pool, we excel at connecting our clients with highly skilled specialists.
Hiring, onboarding, and continuous management are all included in our comprehensive set of services, enabling our clients to quickly scale their teams and achieve their product development and implementation goals.
▪️ Global Engineering ▪️
- Obtain a custom-designed solution from scratch
- Scale or enhance your current software infrastructure
- Stay ahead with the latest cutting-edge technologies
- Access a comprehensive range of services from a single vendor
As a full-stack development company, we are equipped to fulfill all your business requirements. Whether it's product discovery, proof of concept, UX design, or development, Globaldev serves as your all-in-one solution provider.
We are looking for a seasoned Data Engineer to join our growing data team.
Responsibilities
Own and lead the development of scalable, efficient, and robust data pipelines (ETL/ELT) from ingestion to delivery.
Design and implement data processing workflows to support business analytics and machine learning.
Drive the migration of our current Hadoop-based infrastructure to a Spark-based architecture on AWS.
Collaborate with cross-functional teams including data analysts, data scientists, and software engineers to define data requirements and optimize data flow.
Ensure data quality, observability, and compliance across systems.
Requirements
4+ years of hands-on experience as a Data Engineer.
Strong programming skills in Python and advanced proficiency in SQL.
Hands-on experience developing ETL/ELT pipelines using Spark and SQL.
Experience working with cloud-hosted data warehouses such as Hive or Snowflake.
Solid understanding and experience with the Hadoop ecosystem and distributed computing tools (e.g., EMR, Hive, Presto/Trino, Athena, AWS Glue).
Proven expertise in data warehousing concepts, data modeling, and best practices.
BA/BSc in Computer Science, Engineering, Information Systems, or equivalent.
Excellent analytical skills and strong attention to detail.
Strong communication skills in English (written and verbal).
Demonstrated ability to work both independently and in a collaborative team environment.
Nice to have
Familiarity with Kafka, Confluent, Fluentd, Spark, and Airflow.
Experience with data visualization tools (e.g., Tableau).
Previous experience in the media or TV industry.
What we offer
Flexible work arrangements.
20 working days per year of Non-Operational Allowance, fully compensated and intended for personal recreation.
Collaborative and supportive team culture.
Truly competitive salary.
Help and support from our caring HR team.
Required profile
Industry: Information Technology & Services
Spoken language(s): English