DSRCH Node Documentation
Complete technical reference for running, configuring, and operating a DSRCH decentralized search node.
Overview
DSRCH (Decentralized Search Ranking Consensus Hypernetwork) is a purpose-built Layer 1 blockchain for decentralized web search. Each node crawls the web, builds a local search index, serves queries, and earns DSRCH tokens through Proof-of-Index consensus.
| Component | Technology |
|---|---|
| Language | Go 1.26+ |
| Blockchain | Cosmos SDK |
| Networking | libp2p (GossipSub, DHT, mDNS) |
| Search | Bluge full-text engine |
| Consensus | Proof-of-Index |
| Config | YAML |
| Deployment | Docker |
Architecture
| Port | Protocol | Description |
|---|---|---|
| 8080 | HTTP/HTTPS | API, Explorer, Dashboard, Search UI, Docs |
| 9090 | TCP/UDP | libp2p P2P networking |
Quick Start
Option 1: From Binary
# Build from source
cd dsrch && go build -o dsrchd ./cmd/dsrchd
# Initialize node
./dsrchd init --mode full --port 9090
# Start node
./dsrchd start --api-port 8080 --epoch-duration 30s
Option 2: Docker
# Build and run
docker build -t dsrch-node .
docker run -d --name dsrch \
-p 8080:8080 -p 9090:9090 \
-v dsrch-data:/home/dsrch/.dsrch \
--restart unless-stopped \
dsrch-node start --api-port 8080 --epoch-duration 30s
Option 3: Docker Compose (3-node testnet)
docker compose up -d
CLI Reference
dsrch init
Initialize a new DSRCH node. Generates Ed25519 keypair and default configuration.
| Flag | Default | Description |
|---|---|---|
| --mode | full | Node mode: light, full, validator |
| --port | 9090 | P2P listen port |
Generated files:
~/.dsrch/
├── config.yaml # Node configuration
└── node.key # Ed25519 private key (keep secure!)
dsrch start
Start the full DSRCH node with all subsystems.
| Flag | Default | Description |
|---|---|---|
| --api-port | from config | HTTP API port |
| --epoch-duration | 1h | Epoch duration (e.g. 30s for testnet) |
| --bootstrap | built-in seeds | Bootstrap peer multiaddrs |
| --crawl-seed | — | Seed URLs to auto-crawl on startup |
| --no-seeds | false | Disable built-in seed nodes |
| --external-ip | — | Public IP for Docker/NAT |
| --tls-cert | — | TLS certificate path |
| --tls-key | — | TLS private key path |
dsrch status
Display current node status and document count.
Configuration
Configuration is stored in ~/.dsrch/config.yaml:
node:
mode: "full" # light | full | validator
dataDir: "/home/user/.dsrch"
apiPort: 8080
p2p:
listenPort: 9090
bootstrapPeers: []
enableMDNS: true
enableDHT: true
externalIP: "" # Public IP for NAT/Docker
Node Modes
| Mode | Crawls | Indexes | Serves Queries | Earns Tokens |
|---|---|---|---|---|
| light | No | No | Relay only | No |
| full | Yes | Yes | Yes | Yes |
| validator | Yes | Yes | Yes | Yes (higher) |
Data Directory
~/.dsrch/
├── config.yaml # Configuration
├── node.key # Ed25519 private key
├── index/ # Bluge full-text index
├── state.json # Blockchain state
└── query.log # Search query log
API Reference
Base URL: https://api.dsrch.net · Full interactive docs: api.dsrch.net
Search
Search the decentralized index.
| Parameter | Type | Description |
|---|---|---|
| q | string | Search query (required) |
| limit | int | Max results (default: 10, max: 100) |
curl "https://api.dsrch.net/api/search?q=ukraine&limit=5"
Node Status
Returns node ID, status, document count, version, and chain ID.
{
"nodeId": "12D3KooW...",
"status": "running",
"documents": 2921,
"version": "329.11.81",
"chainId": "dsrch-mainnet-1"
}
Network
- List connected P2P peers.
- Aggregated network statistics across all peers.
- Connect to a specific peer by multiaddr.
Epoch & Consensus
- Current epoch number, activity tracker, and work report data.
- Token reward history for this node.
Staking
- Stake DSRCH tokens. Params: nodeId, amount
- Unstake tokens (subject to unbonding period). Params: nodeId, amount
Governance
- List all governance proposals.
- Submit a proposal. Params: title, description, type, proposer
- Vote on a proposal. Params: proposalId, voter, vote (yes/no/abstain/veto)
Crawler
Add URL to crawl queue. Params: url, depth, maxPages
curl -X POST "https://api.dsrch.net/api/crawl/add?url=https://example.com&depth=3&maxPages=200"
Crawler queue status and auto-seeder statistics.
Webmaster Tools
Submit URLs, verify domain ownership, view indexing statistics.
Other Endpoints
| Endpoint | Method | Description |
|---|---|---|
| /api/suggest | GET | Autocomplete suggestions |
| /api/analytics | GET | Search analytics |
| /api/faucet | POST | Request 100 testnet DSRCH tokens |
| /api/shards | GET | Shard distribution |
| /api/keys | GET/POST | API key management |
| /metrics | GET | Prometheus metrics |
| /openapi.json | GET | OpenAPI specification |
P2P Networking
DSRCH uses libp2p for all peer-to-peer communication.
Protocols
| Protocol | Description |
|---|---|
| /dsrch/search/1.0.0 | Distributed search queries |
| /dsrch/crawl/1.0.0 | Crawl task delegation |
| /dsrch/sync/1.0.0 | Index replication |
| GossipSub dsrch/v1 | Work report broadcasting |
| GossipSub dsrch/crawl/v1 | Crawl announcements |
Discovery
- DHT — Kademlia distributed hash table for global discovery
- mDNS — Local network discovery (development/LAN)
- Bootstrap peers — Hardcoded seed nodes for initial connectivity
- Periodic discovery — Runs every 30 seconds
Crawler Engine
Components
| Component | Description |
|---|---|
| Crawler | Core HTTP crawler with rate limiting |
| Frontier | Priority queue for URL scheduling |
| Parser | HTML parser (text, links, meta extraction) |
| DomainFilter | Allow/blocklist for domains |
| RobotsChecker | robots.txt compliance |
| SimHash | Near-duplicate detection |
| Scheduler | Job queue with auto-seeder integration |
| AutoSeeder | Continuous URL seed provider (56+ sources) |
| DomainAssigner | Consistent hashing for domain distribution |
| CrawlCoordinator | Distributed crawl task routing |
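The SimHash component's near-duplicate detection can be sketched as below. This is a minimal illustration, not the node's actual implementation: the whitespace tokenizer and the 64-bit FNV token hash are assumptions.

```go
package main

import (
	"fmt"
	"hash/fnv"
	"math/bits"
	"strings"
)

// simHash folds per-token 64-bit hashes into one fingerprint: each
// result bit is set when that bit was set in the majority of token
// hashes, so similar pages yield nearby fingerprints.
func simHash(text string) uint64 {
	var counts [64]int
	for _, tok := range strings.Fields(strings.ToLower(text)) {
		h := fnv.New64a()
		h.Write([]byte(tok))
		hv := h.Sum64()
		for i := 0; i < 64; i++ {
			if hv&(1<<uint(i)) != 0 {
				counts[i]++
			} else {
				counts[i]--
			}
		}
	}
	var fp uint64
	for i := 0; i < 64; i++ {
		if counts[i] > 0 {
			fp |= 1 << uint(i)
		}
	}
	return fp
}

// hammingDistance counts differing bits; a small distance marks the
// two pages as near-duplicates.
func hammingDistance(a, b uint64) int {
	return bits.OnesCount64(a ^ b)
}

func main() {
	d := simHash("the quick brown fox")
	fmt.Println(hammingDistance(d, d)) // identical input: prints 0
}
```

In practice a crawler would drop a page whose fingerprint is within a small threshold (commonly 3 bits) of an already-indexed one.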
Auto-Seeder
Ensures continuous indexing by providing new seed URLs when the queue is empty:
- 56+ curated seeds: Wikipedia (UK, EN, DE, FR, ES, PL, JA), BBC, DW, Reuters, Ukrainian media, tech sites, science journals, blockchain resources
- Link discovery: External links from crawled pages become new seeds automatically
- Domain deduplication: Each domain crawled once to prevent re-crawling
- Blocklist integration: Russian, Chinese, NK, Iranian, Belarus state media blocked
Crawl Parameters
| Parameter | Default | Description |
|---|---|---|
| Max depth | 2 | Link following depth from seed |
| Max pages/job | 50 | Pages per crawl session |
| Delay | 300ms | Per-domain rate limit |
| Timeout/job | 5 min | Max crawl duration |
| Max body | 5 MB | Per-page size limit |
| Max redirects | 3 | Redirect chain limit |
| User-Agent | DSRCHBot/1.0 | Crawler identification |
Search & Indexing
- Bluge — High-performance Go full-text search engine
- Scatter-Gather — Distributed search across shard nodes with consistent hash routing
- Index Replication — Manifest-based sync every 2 minutes + event-triggered sync
- Autocomplete — Seeded from indexed page titles
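The consistent-hash routing behind scatter-gather can be sketched as a small hash ring. The ring size, hash function, and virtual-node count below are illustrative assumptions, not the node's documented parameters.

```go
package main

import (
	"fmt"
	"hash/fnv"
	"sort"
)

// ring is a minimal consistent-hash ring mapping keys (shards or
// domains) to node IDs, so adding or removing a node only remaps a
// small fraction of keys.
type ring struct {
	points []uint32
	owner  map[uint32]string
}

func hash32(s string) uint32 {
	h := fnv.New32a()
	h.Write([]byte(s))
	return h.Sum32()
}

// newRing places vnodes virtual points per node on the ring to
// smooth out the key distribution.
func newRing(nodes []string, vnodes int) *ring {
	r := &ring{owner: make(map[uint32]string)}
	for _, n := range nodes {
		for v := 0; v < vnodes; v++ {
			p := hash32(fmt.Sprintf("%s#%d", n, v))
			r.points = append(r.points, p)
			r.owner[p] = n
		}
	}
	sort.Slice(r.points, func(i, j int) bool { return r.points[i] < r.points[j] })
	return r
}

// nodeFor walks clockwise from the key's hash to the next point.
func (r *ring) nodeFor(key string) string {
	h := hash32(key)
	i := sort.Search(len(r.points), func(i int) bool { return r.points[i] >= h })
	if i == len(r.points) {
		i = 0
	}
	return r.owner[r.points[i]]
}

func main() {
	r := newRing([]string{"12D3KooWnodeA", "12D3KooWnodeB", "12D3KooWnodeC"}, 16)
	// The same shard key always routes to the same node.
	fmt.Println(r.nodeFor("shard-7") == r.nodeFor("shard-7")) // prints true
}
```

A query coordinator would fan out to the owners of each shard key and merge (gather) the ranked results.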
Consensus: Proof-of-Index
DSRCH uses a custom Proof-of-Index consensus mechanism.
Epochs
Default duration: 1 hour (configurable, 30s for testnet). At epoch end, each node generates a Work Report:
{
"nodeId": "12D3KooW...",
"epoch": 42,
"pageCount": 150,
"queriesServed": 89,
"avgLatency": "15ms",
"shardsStored": 3,
"crawlMerkle": "a1b2c3d4..."
}
Validation Flow
- Node crawls pages and serves queries throughout the epoch
- Activity is tracked by ActivityTracker
- At epoch end, a work report is generated and broadcast via GossipSub
- DSRCHApp.ProcessEpochWithValidation validates all reports
- Valid nodes receive DSRCH token rewards
- State is persisted to disk
Blockchain Modules
| Module | Description |
|---|---|
| x/srchtoken | Native DSRCH token — minting, transfers, supply |
| x/noderegistry | Node registration, types, status tracking |
| x/rewards | Epoch reward calculation and distribution |
| x/slashing | Penalty mechanisms for misbehaving nodes |
| x/gov | On-chain governance — proposals, voting, tallying |
| x/indexregistry | Tracks what each node has indexed |
| x/crawlproof | Cryptographic proofs of crawl work |
| x/queryproof | Cryptographic proofs of query serving |
Governance
DSRCH has on-chain governance for protocol upgrades and parameter changes.
- Only staked nodes can vote
- Voting options: yes, no, abstain, veto
- Voting power proportional to stake
- Lifecycle: submitted → quorum → voting → tally → executed
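A stake-weighted tally over the four documented vote options might look like the sketch below. The 50% pass threshold and one-third veto threshold are illustrative assumptions, not the chain's published governance parameters.

```go
package main

import "fmt"

// Vote mirrors the documented yes/no/abstain/veto options, with
// voting power proportional to stake.
type Vote struct {
	Voter  string
	Option string  // "yes", "no", "abstain", "veto"
	Stake  float64
}

// tally sums stake per option. Veto above one third of all stake
// kills the proposal; otherwise yes must exceed half of the
// non-abstaining stake (both thresholds assumed for illustration).
func tally(votes []Vote) string {
	sums := map[string]float64{}
	total := 0.0
	for _, v := range votes {
		sums[v.Option] += v.Stake
		total += v.Stake
	}
	if total == 0 {
		return "rejected"
	}
	if sums["veto"]/total > 1.0/3.0 {
		return "vetoed"
	}
	if sums["yes"]/(total-sums["abstain"]) > 0.5 {
		return "passed"
	}
	return "rejected"
}

func main() {
	votes := []Vote{
		{"dsrch1aaa", "yes", 700},
		{"dsrch1bbb", "no", 200},
		{"dsrch1ccc", "abstain", 100},
	}
	fmt.Println(tally(votes)) // prints passed
}
```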
Staking & Rewards
Rewards are distributed based on:
- Pages indexed — crawl work
- Queries served — utility
- Uptime — reliability
- Shard storage — data availability
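A node's epoch reward presumably combines the four factors above; a weighted-sum sketch follows. The weights and scaling are purely illustrative, since the actual x/rewards formula is not documented here.

```go
package main

import "fmt"

// rewardScore combines the four documented factors with example
// weights. Uptime is a fraction in [0,1]; shards are scaled so one
// stored shard is worth roughly one point here.
func rewardScore(pages, queries int, uptime float64, shards int) float64 {
	const (
		wPages   = 0.4 // crawl work
		wQueries = 0.3 // utility
		wUptime  = 0.2 // reliability
		wShards  = 0.1 // data availability
	)
	return wPages*float64(pages) +
		wQueries*float64(queries) +
		wUptime*uptime*100 +
		wShards*float64(shards)*10
}

func main() {
	// Values from the sample work report: 150 pages, 89 queries,
	// full uptime, 3 shards stored.
	fmt.Printf("%.1f\n", rewardScore(150, 89, 1.0, 3)) // prints 109.7
}
```

Per-epoch token payouts would then split the epoch's reward pool in proportion to each valid node's score.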
curl -X POST "https://api.dsrch.net/api/faucet?address=dsrch1..." — get 100 free testnet DSRCH tokens.
Docker Deployment
# Single node
docker build -t dsrch-node .
docker run -d --name dsrch \
-p 8080:8080 -p 9090:9090 \
-v dsrch-data:/home/dsrch/.dsrch \
--restart unless-stopped \
dsrch-node start --api-port 8080 --external-ip YOUR_PUBLIC_IP
# 3-node testnet
docker compose up -d
The container healthcheck polls /api/status every 30 seconds.
Production Setup
Recommended
- Reverse proxy: Caddy or Nginx for TLS termination
- Firewall: Open ports 8080 (HTTP) and 9090 (P2P, TCP+UDP)
- Resources: 2+ vCPU, 4+ GB RAM, 50+ GB SSD
Caddy Config
explorer.dsrch.net {
reverse_proxy localhost:8080
}
api.dsrch.net {
reverse_proxy localhost:8080
}
docs.dsrch.net {
reverse_proxy localhost:8080
}
Monitoring
- Prometheus metrics at /metrics: request counts, latency histograms, index size
- Dashboard at /dashboard: node status, peers, epochs, rewards
- Explorer at /explorer: network overview, all nodes, search
Security
- Private key: ~/.dsrch/node.key (permissions: 0600) — never share
- Rate limiting: Built-in per-IP rate limiter on all endpoints
- CORS: Configurable origin whitelist
- CSP headers: Content Security Policy on all pages
- TLS: --tls-cert and --tls-key flags
- robots.txt: Crawler respects the Robots Exclusion Protocol
Back up node.key — it controls your node identity and staked tokens.
Troubleshooting
Node won't start
Run dsrch init first to generate config and keypair.
No peers connecting
- Check firewall: port 9090 must be open (TCP + UDP)
- If behind NAT/Docker: use --external-ip YOUR_PUBLIC_IP
- Verify bootstrap peers are reachable
Crawl queue empty
The auto-seeder provides URLs automatically. If the queue is still empty, seed it manually:
curl -X POST "https://api.dsrch.net/api/crawl/add?url=https://example.com"
Search returns no results
Check document count with curl https://api.dsrch.net/api/status. If 0, wait for crawler to index pages.