
EasyLayer Bitcoin Crawler Documentation

Bitcoin Crawler is a self-hosted application that enables monitoring of the blockchain state both historically and in real-time


Overview

Bitcoin Crawler is a powerful self-hosted application designed for monitoring and analyzing the Bitcoin blockchain. It provides developers with a flexible framework to track blockchain state both historically and in real-time, enabling them to build custom blockchain analytics and monitoring solutions.

The application is built on modern architectural patterns including CQRS (Command Query Responsibility Segregation) and Event Sourcing, ensuring reliable and consistent data processing. It offers multiple transport options (RPC, WebSocket, TCP, IPC) for accessing blockchain data and supports both SQLite and PostgreSQL for event storage.

Key Features

  • Self-Hosted Architecture: Full control over deployment and customization
  • Flexible Node Connectivity: Works with your own Bitcoin node or providers like QuickNode
  • Real-time & Historical Processing: Process blockchain data from any block height with automatic reorganization support
  • Custom Model Definition: Define your own data models using TypeScript/JavaScript
  • Event-Based Processing: Create and handle custom events to track blockchain state changes
  • Multiple Transport Options: Access data through HTTP, WebSocket, TCP, or IPC protocols
  • Database Flexibility: Choose between SQLite (managed) or PostgreSQL (self-configured)

Performance (TODO)

Bitcoin Crawler is engineered for high-speed operation, but actual performance is primarily influenced by two factors: network latency when fetching blocks from the blockchain and the efficiency of inserting large datasets into the database, which depends on your model structure.

Setup

Prerequisites

  • Node.js version 17 or higher
  • A URL for your own Bitcoin node or for a QuickNode provider

Installation

Install the package using your preferred package manager:

# Using npm
npm install @easylayer/bitcoin-crawler

# Using yarn
yarn add @easylayer/bitcoin-crawler

Basic Usage

The @easylayer/bitcoin-crawler package exports a bootstrap function that initializes the crawler. Here's a basic setup:

main.ts
import { bootstrap } from '@easylayer/bitcoin-crawler';
import Model from './model';

bootstrap({
  Models: [Model],
  rpc: true,
});

Creating a Custom Model

Define your custom model by extending the base Model class:

model.ts
import { BasicEvent, EventBasePayload, Model, Block } from '@easylayer/bitcoin-crawler';

// Define your custom event
export class YourCustomEvent<T extends EventBasePayload> extends BasicEvent<T> {}

// Create your model
export default class CustomModel extends Model {
  address: string = '1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa';
  balance: string = '0';

  constructor() {
    super('uniq-model-id'); // This ID will be used to fetch events and state
  }

  public async parseBlock({ block }: { block: Block }) {
    // Implement this method to process blocks
    // Create custom events using this.apply(new YourCustomEvent(data))
  }

  private onYourCustomEvent({ payload }: YourCustomEvent<EventBasePayload>) {
    // Handle your custom event
    // Update model state based on the event payload
    // See examples in the repository for detailed implementations
  }
}
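
As a rough illustration of how `parseBlock` and the event handler might be filled in, the sketch below scans each transaction's outputs for the tracked address and applies a custom event. The `block.tx`, `vout`, `scriptPubKey.address`, `value`, and `block.height` fields, as well as the event payload shape, are assumptions made for illustration only; consult the repository examples for the actual Block structure and event constructor.

// Inside CustomModel from the example above — field names on `block` and the
// event payload shape are assumptions, not the library's guaranteed API.
public async parseBlock({ block }: { block: Block }) {
  for (const tx of (block as any).tx ?? []) {
    for (const out of tx.vout ?? []) {
      // Assumed output fields: scriptPubKey.address and value
      if (out.scriptPubKey?.address === this.address) {
        this.apply(new YourCustomEvent({
          blockHeight: (block as any).height, // assumed field
          received: out.value,                // assumed field
        }));
      }
    }
  }
}

private onYourCustomEvent({ payload }: YourCustomEvent<any>) {
  // Accumulate the received amount into the tracked balance
  this.balance = (Number(this.balance) + Number((payload as any).received)).toString();
}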

Bootstrap Configuration

The bootstrap function accepts the following configuration options:

interface BootstrapOptions {
  Models: ModelType[]; // Array of your custom models
  rpc?: boolean;       // Enable RPC transport
  ws?: boolean;        // Enable WebSocket transport
  tcp?: boolean;       // Enable TCP transport
  ipc?: boolean;       // Enable IPC transport
}

You can enable multiple transports simultaneously and define multiple models for different business logic domains.
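
For example, a setup that enables HTTP RPC together with WebSocket streaming and registers two models might look like this (the model file names are placeholders):

import { bootstrap } from '@easylayer/bitcoin-crawler';
import BalanceModel from './balance-model'; // hypothetical model files
import FeesModel from './fees-model';

bootstrap({
  Models: [BalanceModel, FeesModel], // one model per business-logic domain
  rpc: true,                         // HTTP RPC for queries
  ws: true,                          // WebSocket for real-time event streaming
});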

Transport API Reference

This document describes how clients can interact with the application via the RPC, IPC, WebSocket (WS), and TCP transports.


1. HTTP RPC (Queries Only)

The HTTP RPC transport allows clients to perform data retrieval queries using a standardized JSON-RPC-like protocol.

Connection Details

POST `http://localhost:3000/`
Content-Type: application/json

Available Queries

The application provides two main query types:

  1. GetModels - Retrieve model states at a specific block height
  2. FetchEvents - Retrieve events with pagination and filtering

GetModels Query

Retrieves the current state of one or more models at a specified block height.

Request Format

{
  "requestId": "uuid-1001",
  "action": "query",
  "payload": {
    "constructorName": "GetModels",
    "dto": {
      "modelIds": ["model1", "model2"],
      "filter": {
        "blockHeight": 100
      }
    }
  }
}

Parameters

Parameter          | Type     | Required | Description
modelIds           | string[] | Yes      | Array of model IDs to retrieve
filter.blockHeight | number   | No       | Block height to get state at (defaults to latest)

Response Format

{
  "requestId": "uuid-1001",
  "action": "queryResponse",
  "payload": [
    {
      "aggregateId": "model1",
      "state": { /* model state */ }
    },
    {
      "aggregateId": "model2",
      "state": { /* model state */ }
    }
  ]
}
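
For reference, a client-side call over HTTP RPC might look like the sketch below. It assumes the default host and port and a global `fetch` (Node 18+ or a browser); the `rpcQuery` helper name is illustrative.

import { randomUUID } from 'node:crypto';

// Minimal sketch of an HTTP RPC query helper; request and response shapes
// follow the formats documented above.
async function rpcQuery(constructorName: string, dto: unknown) {
  const res = await fetch('http://localhost:3000/', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      requestId: randomUUID(),
      action: 'query',
      payload: { constructorName, dto },
    }),
  });
  const message = await res.json();
  if (message.action === 'error') {
    throw new Error(message.payload.error.message);
  }
  return message.payload;
}

// Usage: model states at block 100
// const states = await rpcQuery('GetModels', { modelIds: ['model1'], filter: { blockHeight: 100 } });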

FetchEvents Query

Retrieves events for one or more models with pagination and filtering options.

Request Format

{
  "requestId": "uuid-1002",
  "action": "query",
  "payload": {
    "constructorName": "FetchEvents",
    "dto": {
      "modelIds": ["model1"],
      "filter": {
        "blockHeight": 100
      },
      "paging": {
        "limit": 10,
        "offset": 0
      }
    }
  }
}

Parameters

Parameter          | Type     | Required | Description
modelIds           | string[] | Yes      | Array of model IDs to fetch events for
filter.blockHeight | number   | No       | Filter events by block height
filter.version     | number   | No       | Filter events by version
paging.limit       | number   | No       | Number of events to return (default: 10)
paging.offset      | number   | No       | Number of events to skip (default: 0)

Response Format

{
  "requestId": "uuid-1002",
  "action": "queryResponse",
  "payload": {
    "events": [
      {
        "aggregateId": "model1",
        "version": 5,
        "blockHeight": 100,
        "data": { /* event data */ }
      }
    ],
    "total": 100
  }
}
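
Because responses include a `total` count, a client can page through all events by repeating the query with an increasing offset. Below is a rough sketch that builds on the `rpcQuery` helper sketched in the GetModels section above:

// Assumes the rpcQuery helper sketched earlier in this document.
declare function rpcQuery(constructorName: string, dto: unknown): Promise<any>;

async function fetchAllEvents(modelId: string, limit = 10) {
  const events: unknown[] = [];
  for (let offset = 0; ; offset += limit) {
    const payload = await rpcQuery('FetchEvents', {
      modelIds: [modelId],
      paging: { limit, offset },
    });
    events.push(...payload.events);
    if (offset + limit >= payload.total) break; // stop once every page has been read
  }
  return events;
}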

Error Handling

Both queries return errors in the following format:

{
  "requestId": "uuid-1003",
  "action": "error",
  "payload": {
    "error": {
      "message": "Error description"
    }
  }
}

2. Event Streaming (WS, TCP, IPC)

The application supports real-time event streaming through multiple transport protocols. All transports implement the same event communication patterns and use the same query interfaces as HTTP RPC.

Event Communication Patterns

1. Outgoing Events (Application → Client)

Action        | Description        | Payload
event         | Single event       | { constructorName: string; dto: any }
batch         | Multiple events    | Array<{ constructorName: string; dto: any }>
ping          | Connection check   | undefined
error         | Error notification | { message: string }
queryResponse | Response to query  | Same as HTTP RPC responses

2. Incoming Events (Client → Application)

Action | Description      | Payload
pong   | Response to ping | undefined
query  | Query request    | Same as HTTP RPC requests

Available Queries

All transports support the same queries as HTTP RPC:

  1. GetModels Query

{
  "requestId": "uuid-1",
  "action": "query",
  "payload": {
    "constructorName": "GetModels",
    "dto": {
      "modelIds": ["model1", "model2"],
      "filter": {
        "blockHeight": 100
      }
    }
  }
}

  2. FetchEvents Query

{
  "requestId": "uuid-2",
  "action": "query",
  "payload": {
    "constructorName": "FetchEvents",
    "dto": {
      "modelIds": ["model1"],
      "filter": {
        "blockHeight": 100
      },
      "paging": {
        "limit": 10,
        "offset": 0
      }
    }
  }
}

Connection Lifecycle

  1. Client establishes connection
  2. Application sends ping events periodically
  3. Client must respond with pong to maintain connection
  4. After successful pong, application starts streaming events

Message Interfaces

// Outgoing messages (Application → Client)
interface OutgoingMessage<A extends string = string, P = any> {
  requestId?: string;
  action: A;
  payload?: P;
}

// Incoming messages (Client → Application)
interface IncomingMessage<A extends string = string, P = any> {
  requestId: string;
  action: A;
  payload?: P;
}

Transport-Specific Details


2.1 WebSocket

Connection URL

ws://localhost:3000/events
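
A hedged client sketch using the `ws` package is shown below. It assumes each WebSocket frame carries one JSON-encoded message in the OutgoingMessage/IncomingMessage shapes described above, and that replying with `pong` is enough to keep the stream alive; verify the framing against the actual server behavior.

import WebSocket from 'ws'; // npm install ws

const socket = new WebSocket('ws://localhost:3000/events');

socket.on('message', (raw) => {
  // Assumption: one JSON message per frame
  const msg = JSON.parse(raw.toString());

  switch (msg.action) {
    case 'ping':
      // Respond with pong so the application keeps streaming events
      socket.send(JSON.stringify({ requestId: msg.requestId ?? 'ping', action: 'pong' }));
      break;
    case 'event':
      console.log('event', msg.payload.constructorName, msg.payload.dto);
      break;
    case 'batch':
      for (const e of msg.payload) console.log('event', e.constructorName, e.dto);
      break;
    case 'queryResponse':
      console.log('query response', msg.payload);
      break;
    case 'error':
      console.error('server error:', msg.payload.message);
      break;
  }
});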

2.2 TCP

Connection Details

  • Host: localhost
  • Port: 4000

2.3 IPC

Connection Details

IPC transport is only available when the application runs as a child process. Communication happens through the Node.js child-process IPC channel.

import { fork } from 'node:child_process';

// Start the application as a child process
const child = fork('./easylayer.js', [], {
  stdio: ['inherit', 'inherit', 'inherit', 'ipc']
});
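
Continuing from the fork example above, the parent process exchanges the same action/payload messages over the IPC channel. A rough sketch of handling pings and sending a query, with message shapes following the interfaces documented earlier:

import { randomUUID } from 'node:crypto';

// Parent-side handling of messages from the child process
child.on('message', (msg: any) => {
  if (msg.action === 'ping') {
    // Reply with pong to keep the connection alive
    child.send({ requestId: msg.requestId ?? randomUUID(), action: 'pong' });
  } else if (msg.action === 'event' || msg.action === 'batch') {
    console.log('streamed events', msg.payload);
  } else if (msg.action === 'queryResponse') {
    console.log('query result', msg.payload);
  }
});

// Queries use the same format as HTTP RPC
child.send({
  requestId: randomUUID(),
  action: 'query',
  payload: {
    constructorName: 'GetModels',
    dto: { modelIds: ['model1'] },
  },
});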

Configuration Reference

AppConfig

Property  | Type   | Description      | Default       | Required
NODE_ENV  | string | Node environment | "development" | ✅
HTTP_HOST | string | HTTP server host |               |
HTTP_PORT | number | HTTP server port | 3000          | ✅
TCP_HOST  | string | TCP server host  |               |
TCP_PORT  | number | TCP server port  | 4000          | ✅

BlocksQueueConfig

Property                                               | Type   | Description                                        | Default                 | Required
BITCOIN_CRAWLER_BLOCKS_QUEUE_LOADER_STRATEGY_NAME      | string | Loader strategy name for the Bitcoin blocks queue  | "pull-network-provider" | ✅
BITCOIN_CRAWLER_BLOCKS_QUEUE_LOADER_CONCURRENCY_COUNT  | number | Concurrency count for block downloads              | 4                       | ✅

BusinessConfig

Property                           | Type   | Description                                                  | Default          | Required
BITCOIN_CRAWLER_MAX_BLOCK_HEIGHT   | number | Maximum block height to be processed (defaults to infinity)  | 9007199254740991 | ✅
BITCOIN_CRAWLER_START_BLOCK_HEIGHT | number | The block height from which processing begins                | 0                | ✅
BITCOIN_CRAWLER_ONE_BLOCK_SIZE     | number | Size of a single block, in bytes                             | 1048576          | ✅

EventStoreConfig

Property                                   | Type    | Description                                                                                                          | Default                              | Required
BITCOIN_CRAWLER_EVENTSTORE_DB_NAME         | string  | For SQLite: folder path where the database file will be created; for Postgres: name of the database to connect to   | resolve(process.cwd(), 'eventstore') | ✅
BITCOIN_CRAWLER_EVENTSTORE_DB_TYPE         | string  | Type of database for the eventstore                                                                                  | "sqlite"                             | ✅
BITCOIN_CRAWLER_EVENTSTORE_DB_SYNCHRONIZE  | boolean | Automatic synchronization that creates or updates tables and columns. Use with caution.                              | true                                 | ✅
BITCOIN_CRAWLER_EVENTSTORE_DB_HOST         | string  | Host for the eventstore database connection                                                                          |                                      |
BITCOIN_CRAWLER_EVENTSTORE_DB_PORT         | number  | Port for the eventstore database connection                                                                          |                                      |
BITCOIN_CRAWLER_EVENTSTORE_DB_USERNAME     | string  | Username for the eventstore database connection                                                                      |                                      |
BITCOIN_CRAWLER_EVENTSTORE_DB_PASSWORD     | string  | Password for the eventstore database connection                                                                      |                                      |

ProvidersConfig

Property                                          | Type   | Description                                                                      | Default | Required
BITCOIN_CRAWLER_NETWORK_PROVIDER_SELF_NODE_URL    | string | URL of the user's own Bitcoin node. Format: http://username:password@host:port  |         |
BITCOIN_CRAWLER_NETWORK_PROVIDER_QUICK_NODE_URLS  | array  | Multiple QuickNode URLs can be entered, separated by commas                     |         |
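
As an illustration, a configuration that switches the event store to PostgreSQL and points the crawler at a self-hosted node might look like the following environment file. All values are placeholders, and the exact value accepted by BITCOIN_CRAWLER_EVENTSTORE_DB_TYPE for Postgres (shown here as `postgres`) should be verified against the configuration reference.

# Placeholder values for illustration only
NODE_ENV=production
HTTP_PORT=3000

# Start processing from a specific height
BITCOIN_CRAWLER_START_BLOCK_HEIGHT=800000

# PostgreSQL event store (instead of the default SQLite)
BITCOIN_CRAWLER_EVENTSTORE_DB_TYPE=postgres
BITCOIN_CRAWLER_EVENTSTORE_DB_NAME=crawler_eventstore
BITCOIN_CRAWLER_EVENTSTORE_DB_HOST=localhost
BITCOIN_CRAWLER_EVENTSTORE_DB_PORT=5432
BITCOIN_CRAWLER_EVENTSTORE_DB_USERNAME=postgres
BITCOIN_CRAWLER_EVENTSTORE_DB_PASSWORD=change-me

# Self-hosted Bitcoin node
BITCOIN_CRAWLER_NETWORK_PROVIDER_SELF_NODE_URL=http://user:pass@127.0.0.1:8332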