Big Data Engineer Crypto (Remote)

Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

Career Renew

Big Data Engineer Crypto | Career Renew | UK

Career Renew is recruiting for one of its clients a Big Data Engineer – Crypto. Candidates need to be based within CET ±4 time zones.

We are the fastest Telegram bot on Solana, with over $10 billion in traded volume. We empower traders with advanced on-chain trading tools like DCA orders, limit orders, and wallet copy-trading, offering a seamless, innovative experience.

Why Join Us?

We are synonymous with speed, innovation, and cutting-edge trading solutions. This is a unique opportunity to lead and build the data infrastructure for our project, collaborating with an elite team to shape a product that directly impacts thousands of active users in a fast-growing ecosystem.

Role Overview

We are looking for a Big Data Engineer to take ownership of our data architecture, ensuring scalability, low latency, and reliability. The ideal candidate will lead the design and implementation of data pipelines, real-time processing systems, and analytics platforms that support trading decisions and insights.

Key Responsibilities

  • Data Architecture Design: Design and maintain a scalable, high-performance data architecture tailored for real-time trading data, trading events, and analytics
  • Tool Selection: Identify and integrate the most effective big data tools and frameworks to handle the ingestion, processing, and storage of Solana-based blockchain data
  • Real-Time Data Processing: Build and maintain stream-processing systems using tools like Apache Kafka, Spark Streaming, or Flink for real-time price feeds and trading events (see the sketch after this list)
  • Data Storage Optimization: Design and optimize storage solutions using a combination of in-memory databases (e.g., Redis) for active trading data and scalable databases (e.g., Cassandra, ClickHouse) for analytics
  • Performance Monitoring: Monitor, troubleshoot, and optimize the performance of the data pipeline to handle high-throughput scenarios, such as trading spikes
  • Scalability: Implement caching strategies and horizontal scaling solutions to maintain low latency and high availability
  • Observability: Deploy monitoring systems (e.g., Prometheus, ELK Stack) to oversee system health, data flow, and anomalies
  • Collaboration: Work closely with engineering, product, and analytics teams to align data solutions with business goals
  • Troubleshooting: Resolve issues in the big data ecosystem and ensure high availability and reliability
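
For context on the real-time processing and storage items above, here is a minimal sketch (not part of the listing) of the hot path they describe: a Kafka consumer that ingests trade events and caches the latest price per token in Redis. It assumes the kafkajs (2.x) and ioredis npm packages; the topic name, consumer group, and TradeEvent message shape are hypothetical illustrations.

```typescript
// Minimal sketch: consume trade events from Kafka and cache the latest
// price per token in Redis. The topic name, consumer group, and message
// shape below are illustrative assumptions, not details from this listing.
import { Kafka } from "kafkajs";
import Redis from "ioredis";

interface TradeEvent {
  token: string; // e.g. an SPL token mint address
  price: number; // last traded price
  ts: number;    // trade timestamp (ms since epoch)
}

const kafka = new Kafka({ clientId: "price-cache", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "price-cache" });
const redis = new Redis(); // defaults to localhost:6379

async function run(): Promise<void> {
  await consumer.connect();
  await consumer.subscribe({ topics: ["trade-events"], fromBeginning: false });

  await consumer.run({
    eachMessage: async ({ message }) => {
      if (!message.value) return;
      const event = JSON.parse(message.value.toString()) as TradeEvent;

      // Keep only the freshest price per token; expire stale entries after 60 s
      // so readers never serve prices older than a minute.
      await redis.set(`price:${event.token}`, event.price, "EX", 60);
    },
  });
}

run().catch((err) => {
  console.error("consumer failed", err);
  process.exit(1);
});
```

A production pipeline would add schema validation, back-pressure handling, and consumer-group scaling, and would land the same events in an analytical store such as ClickHouse, but the shape above is the core ingest-and-cache pattern these responsibilities refer to.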

Requirements

  • Proficiency in distributed computing principles and large-scale data management for financial or trading systems
  • Proficiency in tools like Kafka, Spark, and Flink
  • Strong expertise in stream-processing frameworks such as Spark Streaming, Apache Flink, or Storm
  • Proficiency in TypeScript, with 5+ years of experience
  • Proficiency in ETL tools and frameworks such as Apache NiFi, Airflow, or Flume

Benefits

  • Remote Flexibility: Work from anywhere while contributing to a high-impact role
  • Growth Opportunities: Be a key player in defining our data infrastructure
  • Challenging Projects: Work with cutting-edge technologies and tackle complex data challenges
  • Collaborative Culture: Join a team that values innovation, expertise, and efficiency

Tagged as: remote, remote job, virtual, Virtual Job, virtual position, Work at Home, work from home

When applying, state that you found this job on the Pangian.com Remote Network.