Data Engineer (Indexation)

extra holidays - extra parental leave - fully flexible
Remote: Full Remote

Offer summary

Qualifications:

  • Strong knowledge of Python and SQL, preferably with BQ, ClickHouse, and Postgres.
  • Production experience with Kafka and familiarity with Kubernetes.
  • General understanding of GCP, including Cloud SQL, VM, and Storage.
  • English proficiency at B2 level or higher.

Key responsibilities:

  • Extracting and processing data from various systems and data stores.
  • Managing and maintaining complex data pipelines and troubleshooting issues.
  • Collaborating with cross-functional teams to define data requirements.
  • Continuously improving and optimizing data processes for business needs.

Job description

P2P.org is the largest staking and restaking operator, with a TVL of over $10B 🔝.

We are constantly launching new yield products: for example, in Polkadot (adding 15-20% to APR) and in Ethereum, where we offer a significantly higher APR (+40%) than any other staking operator 💪

We also keep an eye on exciting projects and launch new networks such as TON, Avail, Monad, and Babylon. We strongly believe in Bitcoin and the DeFi ecosystem around it, and we have a dedicated team focused on finding the best yield solutions based on Bitcoin.

We work with partners like BitGo, Crypto.com, Ledger, and ByBit.

We are actively expanding our product line, exploring RWA, data, yield, and service products for exchanges, custodians, and banks.

P2P.org unites talented individuals globally ❤️

Despite our distributed team, we share a passion for decentralized finance - a fairer system for all. We code, learn, create, and connect to shape finance's future 💰

P2P.org boasts a strong reputation and network. We prioritize customer satisfaction and, as tech enthusiasts, develop innovative solutions that bolster our brand.

Key responsibilities:

  • Extracting and processing data from diverse systems and data stores, ensuring smooth data flows and integration.

  • Managing and maintaining complex data pipelines, troubleshooting, and resolving any issues that arise.

  • Collaborating with cross-functional teams to define data requirements and ensure proper data handling across various platforms.

  • Continuously improving and optimizing data processes to meet the growing demands of the business.

  • Ensuring data consistency, integrity, and security throughout the integration process.


Requirements:

  • Strong knowledge of Python and SQL (any dialect), preferably BQ, ClickHouse, and Postgres

  • Production experience with Kafka

  • Experience with Kubernetes

  • General understanding of and experience with GCP (Cloud SQL, VMs, Storage)

  • Friendliness and willingness to help colleagues

  • English level: B2+

At P2P.org we have a team of experts with their own unique approach and ownership culture. Together we gain experience and make dreams come true! 🌟

  • Competitive salary in USD (we can also pay in crypto)

  • Well-being program

  • Mental Health care program

  • Compensation for education, including foreign language and professional growth courses

  • Equipment & co-working reimbursement program

  • Overseas conferences and community immersion

  • Positive and friendly communication culture

P2P.org is an equal opportunity employer. All applicants will be considered for employment without regard to race, color, national origin, religion, sex, sexual orientation, gender identity, veteran status, or disability.

Required profile

Spoken language(s): English

Other Skills

  • Willingness To Learn
  • Friendliness
  • Communication