
Distributed Training

Subnet ID: 38

Coldkey: 5EFJ7mEVyjxT3iXqxuHyMMuGgJKFt85tb5s4vEvnCTpSoP3w

The Distributed Training subnet enables decentralized training of large language models, using the hivemind library for peer-to-peer gradient sharing and all-reduce operations. Miners contribute GPU resources to collaborative training runs, with model checkpoints stored on Cloudflare R2. The subnet supports multi-GPU configurations per miner, with automatic updates and WandB integration for monitoring training progress.
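As a rough illustration of how a peer might join such a collaborative run, the sketch below wraps a toy PyTorch optimizer with hivemind.Optimizer. The run id, batch sizes, and model are placeholders rather than the subnet's actual configuration, and a real miner would pass the run's initial_peers to the DHT instead of starting a fresh one.

import torch
import hivemind

# Toy stand-ins for the real language model and optimizer (placeholders).
model = torch.nn.Linear(512, 512)
base_opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Start a DHT node; actual miners would bootstrap via initial_peers from the run config.
dht = hivemind.DHT(start=True)

opt = hivemind.Optimizer(
    dht=dht,
    run_id="distributed-training-demo",  # hypothetical run name
    optimizer=base_opt,
    batch_size_per_step=8,               # samples this peer processes per local step
    target_batch_size=4096,              # global batch size that triggers averaging
    use_local_updates=True,
    verbose=True,
)

for _ in range(10):
    x = torch.randn(8, 512)
    loss = model(x).pow(2).mean()
    loss.backward()
    opt.step()                           # updates are synchronized with other peers
    model.zero_grad(set_to_none=True)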

Price: 0.0024 TAO
Validators: 3
Emission: 0.24%
Miners: 250
Immune from Deregistration: Protected for 114 more days
EMA Price: 0.000000 TAO
GitHub Contribution Activity: 263 contributions in the last year

Key Features

Hivemind Integration

Peer-to-peer gradient synchronization using the hivemind library, enabling distributed training across decentralized miners (a small sketch of the peer-to-peer layer follows)
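The sketch below illustrates only the peer-to-peer coordination layer this feature relies on, not the gradient all-reduce itself: two local hivemind DHT nodes are started, the second bootstraps from the first, and a value is exchanged between them. The key and value are arbitrary demo data.

import hivemind

# First peer starts a fresh DHT; the second bootstraps from its visible addresses.
dht_a = hivemind.DHT(start=True)
dht_b = hivemind.DHT(initial_peers=dht_a.get_visible_maddrs(), start=True)

# Peers coordinate (e.g. matchmaking for gradient averaging) through DHT records.
dht_a.store("demo_key", "hello from peer A",
            expiration_time=hivemind.get_dht_time() + 60)
print(dht_b.get("demo_key"))  # the value propagated peer-to-peer

dht_a.shutdown()
dht_b.shutdown()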

Multi-GPU Support

Configure multiple GPUs per miner node with the nproc_per_node parameter for maximum training throughput (see the per-process sketch below)
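A generic sketch of the per-process side of such a launch follows; the script name, GPU count, and model are placeholders, and the subnet's actual launch flags may differ. When started as torchrun --nproc_per_node=4 miner.py, torchrun spawns one process per GPU and sets LOCAL_RANK for each.

import os
import torch
import torch.distributed as dist

def main():
    # torchrun sets LOCAL_RANK, RANK, and WORLD_SIZE for every spawned process.
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)                  # one process per GPU
    dist.init_process_group(backend="nccl")            # env:// rendezvous via torchrun

    model = torch.nn.Linear(512, 512).to(local_rank)   # placeholder model
    model = torch.nn.parallel.DistributedDataParallel(model, device_ids=[local_rank])
    # ... local training loop; a hivemind optimizer could sit on top of this group ...

    dist.destroy_process_group()

if __name__ == "__main__":
    main()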

R2 Checkpoint Storage

Model states are saved to Cloudflare R2 buckets using the {model_name}-{uid} key format for reliable checkpointing (a minimal upload sketch follows)
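The sketch below shows one way to upload a checkpoint to Cloudflare R2 through its S3-compatible API with boto3; it is not the subnet's actual code. The bucket name, account ID, and credentials are placeholders, and only the {model_name}-{uid} naming comes from the description above.

import torch
import boto3

ACCOUNT_ID = "your-r2-account-id"       # placeholder
BUCKET = "subnet-38-checkpoints"        # hypothetical bucket name

s3 = boto3.client(
    "s3",
    endpoint_url=f"https://{ACCOUNT_ID}.r2.cloudflarestorage.com",
    aws_access_key_id="...",            # R2 access key (placeholder)
    aws_secret_access_key="...",
)

model_name, uid = "distributed-llm", 123
torch.save({"step": 1000, "model_state": {}}, "checkpoint.pt")  # toy checkpoint

# Object key follows the {model_name}-{uid} convention mentioned above.
s3.upload_file("checkpoint.pt", BUCKET, f"{model_name}-{uid}")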

Training Monitoring

WandB and HuggingFace integration for tracking training metrics, logging, and model publishing
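As a minimal sketch of this kind of monitoring and publishing, the snippet below logs metrics to Weights & Biases and pushes a local checkpoint directory to the Hugging Face Hub. The project, run, and repo names are placeholders, not the subnet's real identifiers, and publishing requires a valid Hub token.

import wandb
from huggingface_hub import HfApi

run = wandb.init(project="distributed-training-demo", config={"lr": 1e-4})
for step in range(10):
    wandb.log({"loss": 1.0 / (step + 1), "step": step})  # toy metrics
run.finish()

# Publish a local checkpoint directory to the Hub.
api = HfApi()
api.upload_folder(
    folder_path="./checkpoint",
    repo_id="your-org/distributed-llm-demo",  # hypothetical repo
    repo_type="model",
)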