# EasyLayer Bitcoin Crawler Documentation

Bitcoin Crawler is a self-hosted application that enables monitoring of the Bitcoin blockchain state both historically and in real time.
## Overview
Bitcoin Crawler is a powerful self-hosted application designed for monitoring and analyzing the Bitcoin blockchain. It provides developers with a flexible framework to track blockchain state both historically and in real-time, enabling them to build custom blockchain analytics and monitoring solutions.
The application is built on modern architectural patterns including CQRS (Command Query Responsibility Segregation) and Event Sourcing, ensuring reliable and consistent data processing. It offers multiple transport options (RPC, WebSocket, TCP, IPC) for accessing blockchain data and supports both SQLite and PostgreSQL for event storage.
## Key Features
- **Self-Hosted Architecture**: Full control over deployment and customization
- **Flexible Node Connectivity**: Works with your own Bitcoin node or providers like QuickNode
- **Real-time & Historical Processing**: Process blockchain data from any block height with automatic reorganization support
- **Custom Model Definition**: Define your own data models using TypeScript/JavaScript
- **Event-Based Processing**: Create and handle custom events to track blockchain state changes
- **Multiple Transport Options**: Access data through HTTP, WebSocket, TCP, or IPC protocols
- **Database Flexibility**: Choose between SQLite (managed) or PostgreSQL (self-configured)
## Performance (TODO)

Bitcoin Crawler is engineered for high-speed operation, but actual performance is primarily influenced by two factors: network latency when fetching blocks from the blockchain and the efficiency of inserting large datasets into the database, which in turn depends on your model structure.
## Setup

### Prerequisites

- Node.js version 17 or higher
- A self-hosted Bitcoin node or a QuickNode provider URL
### Installation

Install the package using your preferred package manager:

```shell
# Using npm
npm install @easylayer/bitcoin-crawler

# Using yarn
yarn add @easylayer/bitcoin-crawler
```
### Basic Usage

The @easylayer/bitcoin-crawler package exports a `bootstrap` function that initializes the crawler. Here's a basic setup:

```typescript
import { bootstrap } from '@easylayer/bitcoin-crawler';
import Model from './model';

bootstrap({
  Models: [Model],
  rpc: true,
});
```
### Creating a Custom Model

Define your custom model by extending the base `Model` class:

```typescript
import { BasicEvent, EventBasePayload, Model, Block } from '@easylayer/bitcoin-crawler';

// Define your custom event
export class YourCustomEvent<T extends EventBasePayload> extends BasicEvent<T> {}

// Create your model
export default class CustomModel extends Model {
  address: string = '1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa';
  balance: string = '0';

  constructor() {
    super('uniq-model-id'); // This ID will be used to fetch events and state
  }

  public async parseBlock({ block }: { block: Block }) {
    // Implement this method to process blocks
    // Create custom events using this.apply(new YourCustomEvent(data))
  }

  private onYourCustomEvent({ payload }: YourCustomEvent<EventBasePayload>) {
    // Handle your custom event
    // Update model state based on the event payload
    // See examples in the repository for detailed implementations
  }
}
```
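To make the role of `parseBlock` concrete, here is a standalone sketch of the kind of logic it might contain: scanning a block's transactions for outputs paying a watched address and totaling the received amount. The `SimpleBlock`, `SimpleTx`, and `TxOutput` shapes below are simplified assumptions for illustration only; the real `Block` type comes from @easylayer/bitcoin-crawler and may differ.

```typescript
// Simplified shapes for illustration only -- NOT the library's real Block type.
interface TxOutput {
  address?: string;
  value: number; // satoshis
}

interface SimpleTx {
  outputs: TxOutput[];
}

interface SimpleBlock {
  height: number;
  tx: SimpleTx[];
}

// Sum the satoshis sent to a watched address in one block. Inside a real
// model, parseBlock would compute something like this and then call
// this.apply(new YourCustomEvent(...)) with the result.
function receivedInBlock(block: SimpleBlock, address: string): number {
  let total = 0;
  for (const tx of block.tx) {
    for (const out of tx.outputs) {
      if (out.address === address) {
        total += out.value;
      }
    }
  }
  return total;
}

// Example
const exampleBlock: SimpleBlock = {
  height: 100,
  tx: [
    { outputs: [{ address: '1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa', value: 5000 }] },
    { outputs: [{ address: 'someOtherAddress', value: 1000 }] },
  ],
};
console.log(receivedInBlock(exampleBlock, '1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa')); // 5000
```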
### Bootstrap Configuration

The `bootstrap` function accepts the following configuration options:

```typescript
interface BootstrapOptions {
  Models: ModelType[]; // Array of your custom models
  rpc?: boolean;       // Enable RPC transport
  ws?: boolean;        // Enable WebSocket transport
  tcp?: boolean;       // Enable TCP transport
  ipc?: boolean;       // Enable IPC transport
}
```
You can enable multiple transports simultaneously and define multiple models for different business logic domains.
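For instance, a setup with two models and two transports might look like the following sketch. `WalletModel` and `FeeModel` are hypothetical model names for illustration; substitute your own.

```typescript
import { bootstrap } from '@easylayer/bitcoin-crawler';
// Hypothetical models, one per business logic domain
import WalletModel from './wallet-model';
import FeeModel from './fee-model';

bootstrap({
  Models: [WalletModel, FeeModel],
  rpc: true, // serve queries over HTTP RPC
  ws: true,  // also expose a WebSocket transport
});
```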
## Transport API Reference

This section describes how clients can interact with the application via the RPC, IPC, WebSocket, and TCP transports.
### 1. HTTP RPC (Queries Only)

The HTTP RPC transport allows clients to perform data retrieval queries using a standardized JSON-RPC-like protocol.

**Connection Details**

- Endpoint: `POST https://localhost:3000/`
- Content-Type: `application/json`
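A small helper can assemble the request envelope shared by all queries (the `requestId`/`action`/`payload` shape shown in the examples below). The helper name `buildQueryRequest` is illustrative, not part of the library; the endpoint URL is taken from the connection details above and should be adjusted to your deployment.

```typescript
// Build the JSON body for a query request in the envelope format this
// transport expects: { requestId, action: "query", payload: { constructorName, dto } }.
function buildQueryRequest(
  requestId: string,
  constructorName: string,
  dto: Record<string, unknown>,
): string {
  return JSON.stringify({
    requestId,
    action: 'query',
    payload: { constructorName, dto },
  });
}

// Usage against a running crawler started with rpc: true:
// const res = await fetch('https://localhost:3000/', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: buildQueryRequest('uuid-1001', 'GetModels', { modelIds: ['model1'] }),
// });
// const { payload } = await res.json();
console.log(buildQueryRequest('uuid-1001', 'GetModels', { modelIds: ['model1'] }));
```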
**Available Queries**

The application provides two main query types:

- `GetModels` - Retrieve model states at a specific block height
- `FetchEvents` - Retrieve events with pagination and filtering
#### GetModels Query

Retrieves the current state of one or more models at a specified block height.

**Request Format**

```json
{
  "requestId": "uuid-1001",
  "action": "query",
  "payload": {
    "constructorName": "GetModels",
    "dto": {
      "modelIds": ["model1", "model2"],
      "filter": {
        "blockHeight": 100
      }
    }
  }
}
```
**Parameters**

| Parameter | Type | Required | Description |
|---|---|---|---|
| modelIds | string[] | Yes | Array of model IDs to retrieve |
| filter.blockHeight | number | No | Block height to get state at (defaults to latest) |
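The optional-filter rule in the table can be captured in a small client-side builder. `getModelsDto` is an illustrative helper, not a library export: it omits `filter` entirely when no block height is given, so the server falls back to the latest state.

```typescript
// Assemble the dto portion of a GetModels query.
interface GetModelsDto {
  modelIds: string[];
  filter?: { blockHeight?: number };
}

function getModelsDto(modelIds: string[], blockHeight?: number): GetModelsDto {
  if (modelIds.length === 0) {
    throw new Error('modelIds is required and must be non-empty');
  }
  // Omit the filter when blockHeight is not provided (defaults to latest state)
  return blockHeight === undefined
    ? { modelIds }
    : { modelIds, filter: { blockHeight } };
}

console.log(JSON.stringify(getModelsDto(['model1', 'model2'], 100)));
console.log(JSON.stringify(getModelsDto(['model1'])));
```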
**Response Format**

```json
{
  "requestId": "uuid-1001",
  "action": "queryResponse",
  "payload": [
    {
      "aggregateId": "model1",
      "state": { /* model state */ }
    },
    {
      "aggregateId": "model2",
      "state": { /* model state */ }
    }
  ]
}
```
#### FetchEvents Query

Retrieves events for one or more models with pagination and filtering options.

**Request Format**

```json
{
  "requestId": "uuid-1002",
  "action": "query",
  "payload": {
    "constructorName": "FetchEvents",
    "dto": {
      "modelIds": ["model1"],
      "filter": {
        "blockHeight": 100
      },
      "paging": {
        "limit": 10,
        "offset": 0
      }
    }
  }
}
```
**Parameters**

| Parameter | Type | Required | Description |
|---|---|---|---|
| modelIds | string[] | Yes | Array of model IDs to fetch events for |
| filter.blockHeight | number | No | Filter events by block height |
| filter.version | number | No | Filter events by version |
| paging.limit | number | No | Number of events to return (default: 10) |
| paging.offset | number | No | Number of events to skip (default: 0) |
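The paging defaults in the table (limit 10, offset 0) can be made explicit in a client-side builder. As above, `fetchEventsDto` is an illustrative helper rather than a library export; it applies the documented defaults so the request sent over the wire is unambiguous.

```typescript
// Assemble the dto portion of a FetchEvents query, filling in the
// documented paging defaults (limit: 10, offset: 0) when not provided.
interface FetchEventsDto {
  modelIds: string[];
  filter?: { blockHeight?: number; version?: number };
  paging: { limit: number; offset: number };
}

function fetchEventsDto(
  modelIds: string[],
  filter?: { blockHeight?: number; version?: number },
  paging?: { limit?: number; offset?: number },
): FetchEventsDto {
  return {
    modelIds,
    ...(filter ? { filter } : {}),
    paging: { limit: paging?.limit ?? 10, offset: paging?.offset ?? 0 },
  };
}

const eventsDto = fetchEventsDto(['model1'], { blockHeight: 100 });
console.log(eventsDto.paging.limit);  // 10
console.log(eventsDto.paging.offset); // 0
```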