
Data Engineer - Data Intensive Applications

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

Bachelor’s or Master’s degree in Computer Science or a related field; 2+ years of experience in data engineering; expertise in Python and/or Go; experience with event-driven architecture and queue tools.

Key responsibilities:

  • Design and develop high-performance data pipelines
  • Collaborate with teams for seamless data integration
TRACTIAN BR (Scaleup) https://tractian.com/
51-200 Employees

Job description

Engineering at TRACTIAN

The Engineering team at TRACTIAN is at the forefront of developing cutting-edge infrastructure, technologies, and products to harness the power of IoT data. Our engineers collaborate to build robust systems, innovative solutions, and scalable platforms that drive TRACTIAN's success. We are instrumental in shaping the company's decision-making, optimizing operational efficiency, and delivering exceptional experiences to our customers.


What you'll do

As a Data Engineer, your primary responsibility will be to design, build, and optimize data pipelines and systems capable of managing and processing large data volumes, whether in batch or in event-driven services. Your daily work will involve cutting-edge tools and technologies to ensure the efficiency, reliability, and scalability of data solutions built with Python, Go, and/or Rust.


Responsibilities
  • Design, develop, and maintain high-performance data pipelines and processing systems using Python, Go, and/or Rust.
  • Implement and optimize queue mechanisms and tools to manage high-volume data streams effectively.
  • Collaborate with cross-functional teams to ensure seamless integration of data systems within the larger infrastructure.
  • Optimize ETL/ELT workflows and other data processing pipelines for efficiency and reliability in handling large datasets.
  • Identify and resolve performance bottlenecks in data pipelines and storage solutions.

Requirements
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 2+ years of experience in data engineering or a related field, with a strong focus on data-intensive applications.
  • Expertise in Python and/or Go programming languages.
  • Experience with event-driven architecture and queue tools like Kafka, RabbitMQ, or similar.
  • Proficient understanding of distributed systems, data processing frameworks, and advanced algorithms.
  • Good knowledge of database technologies (Postgres, Scylla, MongoDB, Redis).
  • Good knowledge of cloud systems, especially AWS.

Bonus Points
  • Knowledge of low-level programming languages such as Rust and C/C++
  • Experience with columnar/OLAP databases such as ClickHouse, QuestDB, TimescaleDB
  • Experience with data lake/warehouse technologies such as Delta and Iceberg
  • Experience in fast-paced and/or early stage tech startups

Compensation
  • Competitive salary and stock options
  • Optional fully funded English / Spanish courses
  • 30 days of paid annual leave
  • Education and courses stipend
  • Earn a trip anywhere in the world every 4 years
  • Day off during the week of your birthday
  • Up to R$1,000/mo for meals and remote work allowance
  • Health plan with national coverage and without coparticipation
  • Dental insurance: we help cover dental treatment for a better quality of life
  • Gympass and sports incentive: R$300/mo extra if you practice physical activities
Required profile

    Experience

    Level of experience: Mid-level (2-5 years)
    Spoken language(s):
    English

    Other Skills

    • Collaboration
    • Problem Solving
