NiFi Data Engineer (NDE)

Benefits: extra holidays
Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

Bachelor's degree with 4+ years of experience, English proficiency (at least B2 level), hands-on experience with Apache Flink, Debezium, and Apache Kafka, and knowledge of data management best practices.

Key responsibilities:

  • Develop big data solutions and optimize data workflows
  • Plan and manage tests for quality assurance
ARHS Group (https://www.arhs-group.com/), 1001-5000 employees

Job description

Company Description

Arηs is a fully independent group of companies specialized in managing complex IT projects and systems for large organisations, focusing on state-of-the-art software development, digital trust, cloud, data science, mobile development, machine learning and infrastructure services.

We are composed of 16 entities across 8 countries worldwide that are unified by the Arηs Group, with more than 2500 consultants. This corporate structure enables us to respond quickly to market changes and customer requests, and to communicate and make decisions without layers of bureaucracy.

Established in 2016, the Greek entity Arηs Developments Hellas extends the Arηs Group's activities to the European market, providing high-quality software development services that cover the entire application development lifecycle.

Job Description
  • Developing big data solutions 
  • Designing, developing, and optimizing data workflows
  • Creating applications and data pipelines that efficiently handle large volumes of data
  • Ensuring optimal performance, scalability, and security
  • Planning and managing tests for quality assurance
  • Supporting the project team and the customer on all issues related to quality management

Qualifications
  • Bachelor's degree plus a minimum of 4 years of relevant professional experience
  • Excellent command of English, at least at B2 level
  • Hands-on experience with Apache Flink and Apache Beam
  • Hands-on experience with CDC technology based on Debezium and Apache Kafka
  • Hands-on experience with transactional data lakes
  • Works unassisted and guides junior staff
  • Writes specifications, concept documents, end user and technical documentation, communicates effectively with stakeholders on technical and user requirements matters
  • Strong analytical and problem-solving skills with a focus on security and data integrity
  • Excellent communication and teamwork abilities
  • Committed to continuous learning and staying current with industry trends
  • Proactive in adopting new technologies and best practices for data management
  • Ability to work in a fast-paced and dynamic environment and manage complex data workflows

Desirable:

  • CDP Certified Professional and/or CDP Data Engineer certification
  • Experience developing Apache Flink data flows
  • Experience developing ETL data flows
  • Knowledge of HDFS-based storage solutions
  • Cloudera Data Platform (CDP)
  • Experience developing data-warehouse-based solutions
  • Apache Iceberg
  • 2 years of experience with CDC technology based on Debezium and Apache Kafka
  • Continuous learning through Cloudera University and other professional development courses

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Verbal Communication Skills
  • Team Effectiveness
  • Problem Solving
  • Open Mindset
  • Analytical Skills
