Nvidia & Crypto Titans: AI’s New Dawn!

A decentralized AI experiment once confined to crypto circles just earned a public nod from Nvidia CEO Jensen Huang, signaling that distributed model training may be inching closer to the mainstream.

Open Source AI Momentum Builds With Nvidia CEO Endorsement

Chamath Palihapitiya spotlighted Bittensor’s Covenant-72B during an episode of the All-In Podcast, framing it as a tangible example of decentralized artificial intelligence (AI) moving beyond theory. Bittensor operates as a decentralized, blockchain-driven network that establishes a peer-to-peer marketplace in which machine learning models and AI compute are exchanged and incentivized.

Palihapitiya described the effort in plain terms: a large-scale language model (LLM) trained without centralized infrastructure, powered instead by a network of independent contributors. “They managed to train a 4 billion parameter LLaMA model, totally distributed, with a bunch of people contributing excess compute,” he said, calling it “a pretty crazy technical accomplishment.”

The comparison landed with a familiar analogy. “There are random people, and each person gets a little share,” Palihapitiya added, referencing the early distributed computing project that harnessed idle hardware worldwide.

Huang did not dismiss the idea. Instead, he leaned into a broader framing of the AI market, suggesting that decentralized and proprietary approaches are not mutually exclusive. “These two things are not A or B; it’s A and B,” Huang said. “There is no question about it.”

According to the project, Covenant-72B represents [one of the largest](//simplytao.ai/blog/covenant-72b-the-largest-decentralized-llm-training-run) decentralized training runs to date, coordinating more than 70 contributors across standard internet connections without a central authority.

Technically, the model pushes boundaries. Built with 72 billion parameters and trained on roughly 1.1 trillion tokens, it leverages innovations such as compressed communication protocols and distributed data parallelism to make training viable outside traditional data centers.
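To make the communication idea concrete, here is a minimal sketch of top-k gradient sparsification, one common way to compress the traffic in distributed data-parallel training. All names and numbers here are illustrative assumptions, not Covenant-72B's actual protocol: each peer sends only its largest-magnitude gradient entries, and the receiver reconstructs and averages them.

```python
# Hypothetical sketch: top-k gradient compression for distributed data
# parallelism. Illustrative only -- not the project's actual protocol.
import numpy as np

def compress_topk(grad, k):
    """Keep the k largest-magnitude entries; return (indices, values)."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def decompress(idx, vals, shape):
    """Rebuild a dense gradient with zeros everywhere except the sent entries."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

def allreduce_mean(sparse_grads, shape):
    """Average the decompressed gradients from all peers (simulated all-reduce)."""
    dense = [decompress(i, v, shape) for i, v in sparse_grads]
    return np.mean(dense, axis=0)

rng = np.random.default_rng(0)
peers = [rng.normal(size=(4, 4)) for _ in range(3)]  # 3 peers' local gradients
k = 4  # each peer transmits 4 of 16 values: 4x less communication
sparse = [compress_topk(g, k) for g in peers]
avg = allreduce_mean(sparse, (4, 4))
print(avg.shape)
```

The trade-off is accuracy for bandwidth: the averaged update only approximates the true mean gradient, which is why such schemes are usually paired with error-feedback tricks in practice.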

Performance metrics suggest it is not merely experimental. Benchmark results place it in competition with established centralized models, a detail that helps explain why the project has drawn attention beyond crypto-native audiences.

The market noticed as well. The project's token, TAO, has risen 24% since the clip of Palihapitiya and Huang's exchange began circulating on social media.

Still, Huang’s comments suggest the real story is not disruption, but coexistence between the two. Proprietary AI systems will likely remain dominant for general users, while open and decentralized models carve out roles in specialized, cost-sensitive, or sovereignty-driven applications.

For startups, the Nvidia CEO outlined a pragmatic playbook: start open, then layer in proprietary advantages. “Every startup we’re investing in now is open source first, and then going to the proprietary model,” he said.

In other words, the future of AI may not belong to a single architecture or philosophy. It may belong to those who can navigate both, and who know when to use each.

FAQ 🔎

  • What is Bittensor’s Covenant-72B?
    A 72 billion-parameter language model trained through a decentralized network of contributors without centralized infrastructure.
  • What did Jensen Huang say about decentralized AI?
    He said open and proprietary AI models will coexist, describing the relationship as “A and B,” not a choice between them.
  • Why is this development important?
    It shows large-scale AI models can be trained outside traditional data centers, challenging assumptions about infrastructure needs.
  • How does this affect the AI industry?
    It supports a hybrid future where centralized platforms and decentralized models serve different roles across industries.

2026-03-20 02:57