Requirements:
Bachelor's degree in Computer Science or a related field.
4-6 years of experience in data engineering roles.
Proficiency in Python, SQL, and cloud computing platforms such as AWS.
Strong understanding of data pipelines, big data technologies, and data modeling.
Responsibilities:
Design, maintain, and improve data infrastructure and pipelines.
Collaborate with data science and product teams to integrate data sources.
Implement and optimize data processing systems for real-time and batch data.
Ensure data quality, security, and compliance across data systems.
Job description
Job Title: DATA ENGINEER
Location: Remote
Employment Type: Permanent role
Client: Bilge Adam (UK-based client)
Work Time: 2pm-11pm
Experience: 4-6 years
Job Description:
The Role
As a Data Engineer, you'd work with us to design, maintain, and improve the analytical and operational services and infrastructure that many other functions within the organization depend on. These include the data lake, operational databases, data pipelines, large-scale batch and real-time data processing systems, and a metadata and lineage repository, all of which work in concert to provide the company with accurate, timely, and actionable metrics and insights to grow and improve our business using data. You may be collaborating with our data science team to design and implement processes to structure our data schemas and design data models, working with our product teams to integrate new data sources, or pairing with other data engineers to bring cutting-edge technologies in the data space to fruition.
Our Ideal Candidate
We expect candidates to have in-depth experience with some of the following skills and technologies, and to be motivated to build up experience and fill any gaps in knowledge on the job. More importantly, we seek people who are highly logical, who balance respect for best practices with their own critical thinking, and who are adaptable to new situations: people who work independently to deliver projects end-to-end, communicate well in English, collaborate effectively with teammates and stakeholders, and are eager to join a high-performing team and take their careers to the next level with us.
Highly relevant:
• General computing concepts and expertise: Unix environments, networking, distributed and cloud computing
• Python frameworks and tools: pip, pytest, boto3, pyspark, pylint, pandas, scikit-learn, keras
• Agile/Lean project methodologies and rituals: Scrum, Kanban
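To give a flavor of the day-to-day work with these tools, here is a minimal sketch of a batch data-cleaning step using pandas; the `clean_events` function, column names, and sample data are all hypothetical and invented purely for illustration:

```python
import pandas as pd

def clean_events(raw: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical batch-cleaning step: deduplicate events and normalize IDs."""
    # Keep the first occurrence of each event_id
    out = raw.drop_duplicates(subset="event_id")
    # Normalize user IDs to lowercase (NaN values pass through unchanged)
    out = out.assign(user_id=out["user_id"].str.lower())
    # Drop rows with no user attached
    return out.dropna(subset=["user_id"])

# Example usage with toy data
raw = pd.DataFrame({
    "event_id": [1, 1, 2, 3],
    "user_id": ["A1", "A1", "B2", None],
})
clean = clean_events(raw)
print(len(clean))  # 2: one duplicate and one null-user row are removed
```

In practice a step like this would be covered by pytest unit tests and wired into a scheduled batch pipeline, rather than run ad hoc.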