
Data Architect

unlimited holidays - extra holidays - extra parental leave - long remote period allowed
Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

  • Extensive experience with data technologies
  • Strong understanding of data architecture and practices
  • Bachelor's or Master's degree in Computer Science or a similar field
  • Experience in data engineering/architecture roles
  • Proficiency with relational and graph databases

Key responsibilities:

  • Design scalable data architectures
  • Implement data mesh architectures for sharing
  • Manage data catalogs for governance and discovery
  • Develop ETL processes and integrate data sources
  • Ensure compliance and optimize system performance
Intellectsoft (Computer Software / SaaS, SME, 51-200 employees)
https://www.intellectsoft.net/

Job description

We are Intellectsoft - a digital transformation consultancy group and engineering company that delivers cutting-edge solutions for global organisations and technology startups. Since 2007, we have been helping companies and established brands reimagine their business through digitalization. We're looking for an exceptional Data Architect to join our team. Are you up for a challenge?

We are seeking a skilled Data Architect to design, manage, and optimize our data solutions. The ideal candidate will develop comprehensive system architectures that centralize, integrate, maintain, and protect data sources, ensuring scalability, security, and alignment with business objectives.

Requirements

  • Extensive experience with various data technologies, including Apache Kafka, Databricks, MongoDB, Snowflake, and other market-leading tools.
  • Strong understanding of data architecture, infrastructure design, and best practices.
  • Proficiency in designing and implementing scalable, secure, and high-performance data solutions.
  • Experience with data mesh architectures and data catalog tools like Alation, Informatica, or Collibra.
  • Experience with big data technologies such as Hadoop, Spark, and Flink.
  • Experience with data modeling, database design, and ETL processes.
  • Proficiency with relational databases (e.g., MySQL, PostgreSQL).
  • Proficiency with graph databases (e.g., Neo4j).
  • Knowledge of data security best practices and compliance requirements.
  • Strong problem-solving skills and ability to troubleshoot and resolve complex technical issues.
  • Excellent communication skills for interacting with stakeholders at all levels.
  • Ability to work collaboratively in a team environment.
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Several years of experience in data architecture, data engineering, or a related role.
Nice-to-have skills
  • Relevant certifications in data architecture or technologies.
  • Familiarity with machine learning and data science tools and frameworks.
  • Experience with DevOps practices and tools (e.g., Jenkins, Docker, Kubernetes).
  • Familiarity with serverless architectures and microservices.
  • Knowledge of AWS data-related services (e.g., Amazon Redshift, AWS Glue, Amazon RDS, Amazon S3, Amazon DynamoDB).
Responsibilities
  • Design and implement scalable data architectures using various data technologies, including Apache Kafka, Databricks, MongoDB, Snowflake, and other market-leading tools.
  • Develop and implement data mesh architectures to decentralize data ownership and facilitate data sharing across various teams.
  • Design and manage data catalogs using tools like Alation, Informatica, or Collibra to improve data discovery, governance, and metadata management.
  • Develop detailed architecture blueprints and documentation that outline system components, data flows, and security protocols.
  • Create integration strategies for various data sources and systems, ensuring seamless data flow and interoperability.
  • Design and implement data warehousing solutions to consolidate data from multiple sources.
  • Develop ETL processes using tools such as Apache NiFi, Talend, or Informatica.
  • Implement big data technologies like Hadoop, Spark, and Flink to handle large-scale data processing and analytics.
  • Work with relational databases to support diverse data storage and retrieval needs.
  • Work with graph databases to support complex relationship and network analysis.
  • Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, CCPA).
  • Implement security best practices using identity and access management tools and encryption technologies.
  • Monitor and optimize system performance for high availability, reliability, and responsiveness.
  • Provide technical guidance and support to development teams, ensuring architectural alignment with business goals.
  • Collaborate with stakeholders, including business leaders and technical teams, to gather requirements and define solution specifications.
  • Stay updated with the latest advancements in data architecture and tools.
  • Continuously improve data architecture processes and methodologies by leveraging new technologies.

Benefits

  • 35 paid absence days per year to support each specialist's work-life balance, plus 1 additional day for each subsequent year of cooperation with the company
  • Up to 15 unused absence days can be added to your income after 12 months of cooperation
  • Health insurance for you
  • Depreciation coverage for using your personal laptop for project needs
  • Udemy courses of your choice
  • Regular soft-skills training sessions
  • Excellence Centers meetups

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Problem Solving
  • Verbal Communication Skills
