Covenant-72B, a 72-billion-parameter large language model incubated on the Bittensor decentralized network, has recently drawn public praise from renowned venture capitalist Chamath Palihapitiya and NVIDIA CEO Jensen Huang. The model was built collaboratively by more than 70 independent contributors, operating entirely without centralized server support. Following the news, Bittensor's native token, TAO, surged 24%, capturing market attention.
Covenant-72B: 72 Billion Parameters, Zero Centralized Infrastructure
Covenant-72B was built on Bittensor's Subnet 3 (codenamed Templar) and trained on approximately 1.1 trillion tokens. Its training brought together more than 70 contributors around the world, who pooled computing resources through the Bittensor protocol, with no centralized servers coordinating the process.
Its uniqueness lies not only in its scale but also in how it was produced. Unlike Meta, Google, and OpenAI, which rely on vast proprietary data centers for centralized training, Covenant-72B was trained across a distributed network of independent participants. Each participant contributed idle computing power, and no single entity controlled the entire training workflow.
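Templar's actual coordination mechanism is not detailed here; purely as an illustration of the general idea behind such runs, the toy sketch below shows independent contributors each computing a gradient on a local data shard, with the shared model advanced by the peer-averaged gradient rather than by a central server. All names and numbers are hypothetical.

```python
# Illustrative sketch only: toy decentralized gradient averaging,
# NOT Templar's actual training protocol. Each "contributor" computes a
# gradient on its own local data shard; the replicated model advances by
# the peer-averaged gradient, with no central coordinator holding the data.
import random

random.seed(0)

# Toy task: fit y = 3x with a one-parameter model, data split across peers.
NUM_PEERS = 5
shards = [[(x, 3.0 * x) for x in random.sample(range(100), 20)]
          for _ in range(NUM_PEERS)]

w = 0.0       # shared model parameter (replicated on every peer)
lr = 0.0001   # learning rate

def local_gradient(shard, w):
    """Mean-squared-error gradient on one contributor's local shard."""
    g = 0.0
    for x, y in shard:
        g += 2.0 * (w * x - y) * x
    return g / len(shard)

for step in range(200):
    # Every peer computes a gradient on data only it can see.
    grads = [local_gradient(shard, w) for shard in shards]
    # Peer-to-peer averaging (an all-reduce) stands in for a central server.
    avg_grad = sum(grads) / NUM_PEERS
    w -= lr * avg_grad

print(f"learned w ~= {w:.3f} (target 3.0)")
```

Real decentralized training runs must additionally handle unreliable peers, gradient compression, and incentive mechanisms, which this sketch deliberately omits.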

Chamath Calls It "Crazy," Huang Discusses Coexistence of Open and Closed AI
Venture capitalist Chamath Palihapitiya mentioned Bittensor on the All-In Podcast, praising its distributed training approach as a "pretty incredible technical achievement."
"They managed to train a 4-billion parameter LLaMA model fully distributed, with a bunch of people contributing their spare compute."
It is important to clarify that Chamath's quote referred to an earlier milestone: a 4-billion-parameter model trained within Bittensor. The widely publicized Covenant-72B, at 72 billion parameters, represents the project's latest and most significant training achievement.

NVIDIA CEO Jensen Huang's comments carry deeper strategic implications, positioning the relationship between open and closed AI as one of coexistence rather than competition. "It's not an either/or, it's a both/and, there's no question about that."
Huang further elaborated that AI models are "a technology, not a product." He pointed out that industries with deep domain expertise require open models to capture and control relevant knowledge. He also mentioned, "Every startup we're investing in today is going to start with open source and then eventually move to closed source."
It is worth noting that Huang's comments were not specifically about Bittensor. His remarks addressed the broader open-source and decentralized AI ecosystem, though the original article placed them in the context of Bittensor's milestone achievement.
TAO Soars 24% Amidst "Extreme Fear" Market, Trading Volume Hits $406 Million
Following the widespread circulation of video clips of Palihapitiya's and Huang's comments on social media, the price of the TAO token jumped 24%. As of press time, the token is trading at $284.73, with a market capitalization of approximately $2.73 billion, ranking 35th among cryptocurrencies.
Its 24-hour trading volume reached $405.95 million, resulting in a volume-to-market-cap ratio of approximately 15%.
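As a quick arithmetic check, the roughly 15% figure follows directly from the two numbers reported above (variable names below are purely illustrative):

```python
# Ratio check using the figures cited in the article, in USD millions.
volume_24h_musd = 405.95
market_cap_musd = 2_730.0   # ~$2.73 billion
print(f"volume / market cap = {volume_24h_musd / market_cap_musd:.1%}")  # -> 14.9%, i.e. about 15%
```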

