
Distributed Training

Subnet ID: 38

Coldkey: 5EFJ7mEVyjxT3iXqxuHyMMuGgJKFt85tb5s4vEvnCTpSoP3w

Our proposed solution is a subnetwork that incentivises compute, bandwidth, and latency. Compute powers the training of each miner's local copy of the model, while bandwidth and latency power the averaging of each miner's local model weights through an operation called butterfly all-reduce. Once this process completes successfully, every miner holds a unified, globally averaged gradient with which to update its model weights.
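The averaging step described above can be sketched as a two-phase butterfly all-reduce: a reduce-scatter phase in which each miner averages one shard of the gradient, followed by an all-gather phase in which every miner collects all averaged shards. This is an illustrative single-process simulation, not the subnet's actual networked implementation; the function name and list-based gradients are assumptions for the sketch.

```python
def butterfly_all_reduce(local_grads):
    """Simulate averaging per-miner gradients via reduce-scatter + all-gather.

    local_grads: one flat gradient (list of floats) per miner.
    Returns the averaged gradient as held by each miner afterwards.
    """
    n = len(local_grads)
    size = len(local_grads[0])
    shard = size // n  # assume the gradient length divides evenly, for simplicity

    # Reduce-scatter: miner i receives shard i from every peer and averages it.
    reduced = [
        [sum(g[i * shard + j] for g in local_grads) / n for j in range(shard)]
        for i in range(n)
    ]

    # All-gather: every miner collects all averaged shards into the full gradient.
    global_grad = [x for part in reduced for x in part]
    return [list(global_grad) for _ in range(n)]

# Four miners, each holding a constant gradient equal to its index.
grads = [[float(i)] * 8 for i in range(4)]
result = butterfly_all_reduce(grads)
# Every miner now holds the mean gradient: (0 + 1 + 2 + 3) / 4 = 1.5 everywhere.
```

The shard-per-miner structure is what distinguishes butterfly all-reduce from naive all-to-all averaging: each miner only reduces 1/n of the gradient, so bandwidth per miner stays roughly constant as the network grows.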

Price: 0.007 TAO
Validators: 4
Emission: 0.49%
Miners: 224
GitHub Contribution Activity

424 contributions in the last year (contribution calendar, Sep–Aug)
