Jacob Robert Steeves Unveils Bittensor's AI Mining Vision: Bridging Economics and Artificial Intelligence
In an exclusive conversation with blockchain industry observers, Jacob Robert Steeves, the creator of Bittensor, shared his perspective on applying Bitcoin-style mining mechanisms to artificial intelligence, a novel approach that is redefining how distributed computing resources are coordinated. This interview traces Steeves' journey from engineering at Google to pioneering one of crypto's most ambitious AI infrastructure projects.
Bittensor (TAO), an open-source protocol, has carved a distinctive niche by applying economic incentives to AI computation. The network operates approximately 128 subnets, each competing across domains including inference, training, reinforcement learning, and data services. According to the latest market data as of March 2026, TAO is trading at $182.60 with a circulating market capitalization of approximately $1.75 billion. The ecosystem has evolved significantly since its mainnet launch in 2021, attracting participation from developers and computing resources worldwide.
The Journey from Google to Decentralized AI Architecture
Jacob Robert Steeves' path to founding Bittensor began at Simon Fraser University in Vancouver, Canada, where he studied mathematics and computer science. His early career took him through work on brain-computer interface chips at a DARPA contractor firm, where a pivotal mentorship introduced him to Bitcoin and concepts of energy-based computation. “Since 2015, I have been deeply involved in both Bitcoin and AI,” Steeves explained. “These two fields are naturally compatible because the core of AI is the study of feedback loops—backpropagation, genetic algorithms, reinforcement learning—while Bitcoin represents the first programmable economic feedback loop.”
His tenure at Google proved transformative. Working as a machine learning engineer, Steeves witnessed the publication of the groundbreaking “Attention Is All You Need” paper introducing Transformers, which catalyzed the exponential development of large language models. He absorbed critical knowledge from frontline teams regarding distributed machine learning practices—parameter servers, model parallelism, and data parallelism techniques—that would later inform Bittensor's computational architecture.
Despite the prestige of working at a tech giant, Steeves chose to pursue his vision independently. Beginning in his spare time around 2015, he developed Bittensor's foundational concepts before committing full-time in 2018 and launching the mainnet in 2021. “The experience at Google taught me distributed systems and what it takes to run machine learning effectively at global scale,” he reflected. “But the fundamental difference between Bittensor and traditional enterprise AI is our philosophical approach to resource coordination.”
Bittensor’s Economic Model: Mining Meets Machine Learning
At its essence, Bittensor transforms how computing resources are organized and compensated. Unlike traditional aggregation platforms that simply “stack models together,” the protocol embeds programmable economic incentives directly into the AI learning process. “Whoever provides more useful inference, training, or tools receives more rewards,” Steeves emphasized. “This is completely different from basic model stacking.”
The breakthrough lies in recognizing what the past 15 years of AI advancement has proven: that adaptive learning through feedback mechanisms—whether backpropagation or reinforcement learning—drives progress. Bittensor operationalizes this principle by directly integrating currency and economic signals into the AI development cycle. Market forces continuously optimize supply quality and cost-effectiveness.
“The significance of decentralization,” Steeves clarified, “is permissionless entry and resistance to single points of failure. Any individual or team can launch a subnet and compete. Good supply is amplified by incentives; poor supply is naturally eliminated.” This competitive environment ensures continuous improvement without requiring centralized gatekeepers.
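The reward logic Steeves describes can be captured in a toy model: validators score each miner's output, scores are weighted by validator stake, and the emission is split in proportion to the resulting scores, so useful supply earns more and consistently low-scoring supply earns almost nothing. This is a minimal illustrative sketch, not Bittensor's actual Yuma Consensus implementation; the function name, scores, and stakes are invented for the example.

```python
# Toy sketch of proportional, stake-weighted reward distribution.
# NOT the real Bittensor/Yuma consensus code; numbers are illustrative.

def distribute_rewards(scores, validator_stakes, emission):
    """scores[v][m] = validator v's quality score for miner m (0..1).
    Returns each miner's share of `emission`, proportional to its
    stake-weighted average score."""
    total_stake = sum(validator_stakes)
    num_miners = len(scores[0])
    # Stake-weighted average score per miner.
    weighted = [
        sum(stake * s[m] for stake, s in zip(validator_stakes, scores)) / total_stake
        for m in range(num_miners)
    ]
    total = sum(weighted)
    if total == 0:
        return [0.0] * num_miners
    return [emission * w / total for w in weighted]

# Two validators (stakes 60 and 40) score three miners;
# miner 0 is the most useful and takes the largest share of 100 TAO.
rewards = distribute_rewards(
    scores=[[0.9, 0.5, 0.0], [0.8, 0.6, 0.1]],
    validator_stakes=[60, 40],
    emission=100.0,
)
print(rewards)
```

The “poor supply is naturally eliminated” dynamic falls out directly: miner 2's near-zero scores leave it with a reward too small to cover operating costs, so it exits while higher-scoring miners expand.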
From a user perspective, the model operates on multiple levels. Developers can initiate or join subnets, contribute models and computing power, and receive ongoing incentives tied to their performance. Demand-side participants purchase services such as inference, computing power, automated machine learning, and market-prediction signals. The platform transforms the traditional “miner—reward—consensus” paradigm into “useful AI supply—market reward—network consensus.”
Chinese Teams Lead Bittensor’s Subnet Innovation
When asked about his first visit to China specifically to discuss Bittensor, Steeves highlighted the strategic importance of Asian developer participation. “China is one of the fastest-growing, possibly the most powerful, countries in the global artificial intelligence field,” he noted. “When Bitcoin mining was legal, China accounted for over 50% of the computing power. Even today, the region produces 90% of the world's chips. I have great respect for China's technical strength.”
More significantly, Steeves observed a competitive pattern within Bittensor's ecosystem: “In Bittensor, there's a saying that once Chinese miners enter a subnet, competition immediately becomes much fiercer, to the point that many original participants exit. This is entirely expected—the intensity of competition in China is truly astonishing.” He views this positively, suggesting that China's rigorous university training culture and engineering excellence align naturally with Bittensor's competitive-merit framework.
The concrete evidence of Chinese contribution is visible in Bittensor’s top subnet projects. Affine, one of the largest subnets on the network, was built by Chinese developers and has become one of the most competitive mechanisms on the platform. Simultaneously, Lium, a GPU resource-focused subnet, has integrated significant computing power from Asian sources. Through these projects, Chinese miners are contributing processor resources to a global marketplace while simultaneously gaining access to international computing demand.
“The engineering level here is extremely high, almost second to none,” Steeves said of the Chinese developer community. “I want to facilitate more such teams joining, because their contribution to network capacity and competitive quality is invaluable.”
Decentralization’s Real Power: Beyond Aggregation
A common misconception frames Bittensor as merely an “AI model aggregator,” but Steeves was emphatic in correcting this characterization. “The core of Bittensor is embedding programmable incentives into the AI learning process—it's fundamentally different from just stacking models together,” he insisted.
The distinction between Bittensor and traditional platforms extends beyond architecture to philosophy. “The so-called Crypto + AI is just applying cryptocurrency to AI or applying AI to crypto, which doesn't touch the core of what we're doing,” Steeves explained. “What we're actually doing is using crypto-economic incentives to conduct artificial intelligence research. It's not decentralization for decentralization's sake—it's using market signals and competition to scale useful computation.”
The resilience of this approach became apparent when AWS experienced a large-scale outage in late 2024, causing many centralized AI services to fail. Bittensor's distributed architecture meant it continued operating without interruption. “This incident proves one of the values of decentralization—it provides resilience against single points of failure,” Steeves noted. “However, it also proved that many so-called decentralized ecosystems are not truly decentralized, as some projects could not recover after the outage. Bittensor's fundamental design, built on resource distribution and routing flexibility, gives us advantages in continuity and fault tolerance.”
TAO’s Market Position and Five-Year Roadmap
Since its exchange listing in March 2023, TAO has evolved into a significant player in the crypto-economic infrastructure space. The token ecosystem recently demonstrated strong institutional confidence with TAO Treasury completing an $11 million private fundraising round, attracting investors including strategic advisor James Altucher and Grayscale’s parent company DCG.
Regarding the 2025 halving cycle—Bittensor's first supply reduction event—Steeves offered a measured perspective: “The only impact of the halving on Bittensor is that supply will tighten. But this will not affect the network's fundamental incentive mechanism. There will still be huge economic incentives to encourage developers to build on the platform.” This signals his confidence that the economic model's sustainability rests on more than simple token scarcity.
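TAO's supply schedule deliberately mirrors Bitcoin's: a 21-million-token cap with emissions that halve as issuance milestones are reached. As a rough illustration of why supply "tightens" but never quite stops (the real on-chain emission mechanics differ in detail), cumulative issuance under a geometric halving schedule can be computed like this:

```python
# Illustrative geometric-halving schedule, Bitcoin-style, which TAO's
# 21M-cap emission is modeled on. Real on-chain parameters differ.

MAX_SUPPLY = 21_000_000

def supply_after_halvings(n, first_epoch_emission=MAX_SUPPLY / 2):
    """Cumulative tokens issued after n halving epochs, where each
    epoch emits half of what the previous one did."""
    issued = 0.0
    emission = first_epoch_emission
    for _ in range(n):
        issued += emission
        emission /= 2
    return issued

for n in (1, 2, 3, 10):
    print(n, supply_after_halvings(n))
```

Each halving leaves half the remaining gap to the cap unissued, so miner rewards shrink predictably while never reaching zero in finite time, which is why Steeves argues the incentive mechanism itself is unaffected.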
The protocol’s revenue streams are diversifying. Primary sources include selling inference services, computing power, automated machine learning services (AutoML), and market prediction signals. This multi-revenue approach mirrors traditional technology platforms while maintaining decentralized governance.
Regarding prediction markets specifically, Steeves highlighted their transformative potential. “I think Kalshi and Polymarket are among the real fintech applications and the first applications for mass consumers,” he said. “It's very meaningful and profoundly changes the way humans work. Bittensor's prediction market subnets represent the next frontier in decentralized decision-making infrastructure.”
The Five-Year Vision: Scaling to Millions
When envisioning Bittensor's future, Steeves articulated an ambitious yet measurable objective: bringing the technology to millions of users and genuinely providing open intelligent services globally while maintaining sustainable network operation.
“The headline I most want to see is: we have brought this technology to ‘millions’ of users and truly provided open intelligent services to the world, with the network continuously expanding,” he stated. With approximately 100,000 users currently utilizing Bittensor technology, the path to scale appears technically viable.
The economic advantage provides the primary growth driver. “Economically, we can beat centralized providers in many scenarios with cost advantages, especially in inference,” Steeves explained. Consider the competitive dynamics: a centralized AI product might charge a $1,000 subscription while delivering only $200 of actual value, whereas Bittensor can offer a $10 subscription with network costs of approximately $6, so delivery costs run about 60% of the price and leave a 40% margin.
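The subscription comparison above reduces to simple ratio arithmetic; a quick check of the quoted figures:

```python
# Margin arithmetic for the pricing comparison in the text.
centralized_price, centralized_value = 1_000, 200  # figures quoted above
bittensor_price, bittensor_cost = 10, 6

cost_ratio = bittensor_cost / bittensor_price            # 0.6: costs are 60% of price
margin = 1 - cost_ratio                                  # 0.4: 40% retained by the network
value_per_dollar_centralized = centralized_value / centralized_price  # 0.2

print(f"Bittensor cost ratio: {cost_ratio:.0%}, margin: {margin:.0%}")
print(f"Centralized value delivered per dollar charged: {value_per_dollar_centralized:.0%}")
```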
Ridges, a large subnet focused on coding agents, exemplifies this principle. Teams worldwide collectively optimize coding assistance through competitive incentives, driving prices downward while improving quality. The same economic mechanics apply across domains.
“Our goal is to serve billions of users worldwide,” Steeves said. “If centralized AI companies do not adopt these underlying technical primitives, it will be difficult for them to keep up in performance, speed, and cost in the long run. This is our fundamental wedge.”
The parallel to Bitcoin's success is deliberate. “The reason Bitcoin can outperform sovereign states or centralized systems at the network level is because it adopted the right technical primitives and mechanism design,” he noted. While acknowledging that Bittensor hasn't achieved this universally across all domains, Steeves emphasized that in specific areas—particularly GPU inference and prediction markets—the network already demonstrates this advantage.
Interestingly, Steeves noted that many users already benefit from Bittensor's infrastructure without direct awareness. “Many people actually use Bittensor in their daily lives without even knowing it,” he suggested, indicating that the technology is functioning as underlying infrastructure supporting applications and services at higher layers.
The cooperation potential with major AI institutions represents another dimension of growth. “Yes, it's possible,” Steeves said regarding collaboration with OpenAI or Chinese AI companies. “It depends on whether our philosophies align. Some centralized labs prefer to consolidate and control, while we emphasize openness and permissionlessness.” He expressed particular enthusiasm about collaborations with open-minded teams like DeepSeek, Kimi, and Moonshot. “If we can work with them to achieve truly decentralized training, we would very much welcome that. It's only a matter of time: either cooperate or adopt our decentralized training approach.”
This vision, of economic incentives driving global-scale artificial intelligence development through distributed networks, represents Jacob Robert Steeves' fundamental contribution to understanding how markets, competition, and economic mechanisms can scale beneficial technology to serve humanity.