Data Engineer

Alchemy | Web3 development made easy

About

One magic line of code supercharges your development with the world’s most powerful blockchain developer platform, relied upon by the majority of the world's top blockchain apps.

Overview

San Francisco, CA, USA

Job Description

Our mission is to bring blockchain to a billion people. The Alchemy Platform is a world-class developer platform designed to make building on the blockchain easy. We’ve built leading infrastructure in the space, powering over $45 billion in transactions for tens of millions of users in 99% of countries worldwide.
The Alchemy team draws from decades of deep expertise in massively scalable infrastructure, AI, and blockchain from leadership roles at leading companies and universities like Google, Microsoft, Facebook, Stanford, and MIT.
Alchemy recently raised a Series C led by a16z at a $3.5B valuation, having previously raised from Coatue, Addition, Stanford University, Coinbase, the Chairman of Google, Charles Schwab, and the founders and executives of leading organizations.
Alchemy powers the top blockchain companies globally and has been featured in TechCrunch, Forbes, Bloomberg, and elsewhere.
The Role
Data infrastructure is the foundation on which all of Alchemy’s products are built. As an engineer focused on data engineering, you’ll be working with one of the most sophisticated and high-throughput distributed systems in the blockchain world. You’ll focus on architecting and building new systems as well as improving existing ones for a platform that supports millions of users globally.

Responsibilities

      • Maintain Alchemy’s batch pipelines that power our production serving systems
      • Set up frameworks and tools to help team members create and debug pipelines by themselves
      • Track data quality and latency, and set up monitors and alerts to ensure smooth operation
      • Build production DAG workflows for batch data processing and storage (see the sketch after this list)
      • Aggregate logs from multiple regions and multiple clouds
      • Design and implement our next generation data warehouse that aggregates internal and third-party data sources
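
For candidates less familiar with the terminology, below is a minimal sketch of the kind of daily batch DAG workflow described above, assuming Apache Airflow 2.x; the task names and pipeline steps are hypothetical placeholders, not Alchemy’s actual systems:

```python
# A minimal sketch of a daily batch DAG, assuming Apache Airflow 2.x.
# Task names and pipeline steps are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_logs(**context):
    # Placeholder: pull one day of logs from each region and cloud.
    print(f"extracting logs for {context['ds']}")


def load_warehouse(**context):
    # Placeholder: aggregate the extracted logs and load them into
    # the warehouse partition for that day.
    print(f"loading warehouse partition {context['ds']}")


with DAG(
    dag_id="daily_log_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # one run per day of data
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_logs", python_callable=extract_logs)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load  # load runs only after extraction succeeds
```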

What We’re Looking For

    • BS (or higher) degree in Computer Science or similar
    • 4+ years of experience in a software engineering discipline, with at least 2 years of experience in data engineering or data infrastructure
    • (Bonus) Experience in low-latency, streaming data architectures
    • (Bonus) Prior experience in Airflow/Spark/ELK Stack
    • Self-starter attitude and the ability to execute new ideas with autonomy
    • Ability to find the right balance between perfection and shipping quickly
    • Passion for blockchain technologies a plus
Apply for job

To apply for this job, please visit jobs.lever.co.
