
Big Data Engineer - Crypto

Remote: Full Remote
Salary: 19 - 19K yearly
Experience: Senior (5-10 years)
Offer summary

Qualifications:

  • 5+ years of proficiency in TypeScript
  • Strong expertise in stream-processing frameworks
  • Proficiency in Kafka, Spark, and Flink
  • Experience with ETL tools like Apache NiFi and Airflow

Key responsibilities:

  • Design and maintain scalable data architecture
  • Build and optimize real-time data processing systems
Career Renew · Small startup (2-10 employees) · https://career-renew.com/

Job description

Career Renew is recruiting, on behalf of one of its clients, a Big Data Engineer - Crypto. Candidates must be based within CET +/- 4 time zones.

We are the fastest Telegram bot on Solana, with over $10 billion in traded volume. We empower traders with advanced on-chain trading tools like DCA orders, limit orders, and wallet copy-trading, offering a seamless, innovative experience.

Why Join Us?

We are synonymous with speed, innovation, and cutting-edge trading solutions. This is a unique opportunity to lead and build the data infrastructure for our project, collaborating with an elite team to shape a product that directly impacts thousands of active users in a fast-growing ecosystem.

Role Overview

We are looking for a Big Data Engineer to take ownership of our data architecture, ensuring scalability, low latency, and reliability. The ideal candidate will lead the design and implementation of data pipelines, real-time processing systems, and analytics platforms that support trading decisions and insights.

Key Responsibilities

  • Data Architecture Design: Design and maintain a scalable, high-performance data architecture tailored to real-time trading data, trading events, and analytics.
  • Tool Selection: Identify and integrate the most effective big data tools and frameworks to handle the ingestion, processing, and storage of Solana-based blockchain data.
  • Real-Time Data Processing: Build and maintain stream-processing systems using tools like Apache Kafka, Spark Streaming, or Flink for real-time price feeds and trading events (a minimal sketch follows this list).
  • Data Storage Optimization: Design and optimize storage solutions, combining in-memory databases (e.g., Redis) for active trading data with scalable databases (e.g., Cassandra, ClickHouse) for analytics.
  • Performance Monitoring: Monitor, troubleshoot, and optimize the performance of the data pipeline to handle high-throughput scenarios, such as trading spikes.
  • Scalability: Implement caching strategies and horizontal scaling solutions to maintain low latency and high availability.
  • Observability: Deploy monitoring systems (e.g., Prometheus, the ELK stack) to oversee system health, data flow, and anomalies.
  • Collaboration: Work closely with engineering, product, and analytics teams to align data solutions with business goals.
  • Troubleshooting: Resolve issues in the big data ecosystem and ensure high availability and reliability.
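
To make the stream-processing and caching items above concrete, here is a minimal TypeScript sketch, assuming the kafkajs and ioredis client libraries. The trade-events topic, the price: key prefix, the broker address, and the message shape are illustrative placeholders, not the client's actual schema; the idea is simply to consume trading events from Kafka and cache the latest price per market in Redis for low-latency reads.

    import { Kafka } from "kafkajs"; // assumed Kafka client library
    import Redis from "ioredis";     // assumed Redis client library

    // Hypothetical broker address, topic, and key layout, for illustration only.
    const kafka = new Kafka({ clientId: "trading-ingest", brokers: ["localhost:9092"] });
    const consumer = kafka.consumer({ groupId: "trade-analytics" });
    const redis = new Redis();

    async function run(): Promise<void> {
      await consumer.connect();
      await consumer.subscribe({ topic: "trade-events", fromBeginning: false });

      await consumer.run({
        eachMessage: async ({ message }) => {
          if (!message.value) return;
          // Assumed message shape: { market, price, ts } encoded as JSON.
          const trade = JSON.parse(message.value.toString()) as {
            market: string;
            price: number;
            ts: number;
          };
          // Keep the latest price per market in Redis for fast reads,
          // expiring stale entries after 60 seconds.
          await redis.set(`price:${trade.market}`, trade.price, "EX", 60);
        },
      });
    }

    run().catch((err) => {
      console.error("consumer failed:", err);
      process.exit(1);
    });

In a production pipeline the same events would typically also be forwarded to a scalable analytics store such as Cassandra or ClickHouse, per the storage item above.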

Requirements

  • Proficiency in distributed computing principles and large-scale data management for financial or trading systems.
  • Proficiency in tools like Kafka, Spark, and Flink.
  • Strong expertise in stream-processing frameworks like Spark Streaming, Apache Flink, or Storm.
  • 5+ years of experience with TypeScript.
  • Proficiency in ETL tools and frameworks, such as Apache NiFi, Airflow, or Flume.

Benefits

  • Remote Flexibility: Work from anywhere while contributing to a high-impact role.
  • Growth Opportunities: Be a key player in defining our data infrastructure.
  • Challenging Projects: Work with cutting-edge technologies and tackle complex data challenges.
  • Collaborative Culture: Join a team that values innovation, expertise, and efficiency.

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Collaboration
  • Troubleshooting (Problem Solving)
