Delphi Digital: Opportunities, Challenges, and Future of DeAI
Author: PonderingDurian, Researcher at Delphi Digital
Compiled by Pzai, Foresight News
Given that crypto is essentially open-source software with built-in economic incentives, and AI is disrupting the way software is written, AI will have a huge impact on the entire blockchain space.
AI x Crypto overall stack
DeAI: Opportunities and Challenges
In my opinion, the biggest challenge facing DeAI lies at the infrastructure layer: building foundation models demands enormous capital, and the returns to scale in data and compute are high.
Given scaling laws, the tech giants enjoy a natural advantage: having earned monopoly profits in the Web2 era by aggregating consumer demand, they reinvested those profits in cloud infrastructure through a decade of artificially low interest rates. Now the internet giants are trying to dominate the AI market by cornering data and compute, the key inputs to AI.
Comparison of token volumes across large models
Given the capital intensity and high bandwidth requirements of large-scale training, unified superclusters remain the best option, giving the tech giants the best-performing closed-source models, which they plan to rent out at monopoly margins while reinvesting the proceeds in each subsequent generation.
However, the moats in AI are proving shallower than Web2's network effects, with frontier models depreciating quickly relative to the field, especially as Meta pursues a “scorched earth” strategy and open-sources advanced models such as Llama 3.1, trained at a cost of billions of dollars yet achieving SOTA performance.
Llama 3 Model Rating
At this point, emerging research into low-communication, latency-tolerant distributed training methods may (partially) commoditize frontier models: as the price of intelligence falls, competition will shift (at least in part) from hardware superclusters (favoring the tech giants) to software innovation (slightly favoring open source and crypto).
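For intuition, here is a minimal, purely illustrative numpy sketch of the low-communication idea behind such methods (data, model, and hyperparameters are all invented): each worker takes many local gradient steps on its own shard, and only an averaged parameter delta crosses the network once per round.

```python
import numpy as np

# Toy sketch of low-communication ("local SGD") distributed training:
# workers run many gradient steps locally and synchronize only the
# averaged parameter delta once per round, cutting bandwidth needs.
# Real systems add outer optimizers, compression, and fault tolerance.

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])
X = rng.normal(size=(1024, 2))
y = X @ true_w + 0.1 * rng.normal(size=1024)

NUM_WORKERS, LOCAL_STEPS, ROUNDS, LR = 4, 50, 10, 0.05
shards = np.array_split(np.arange(1024), NUM_WORKERS)
w_global = np.zeros(2)

for _ in range(ROUNDS):
    deltas = []
    for idx in shards:                      # each worker trains locally...
        w = w_global.copy()
        for _ in range(LOCAL_STEPS):
            grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            w -= LR * grad
        deltas.append(w - w_global)
    w_global += np.mean(deltas, axis=0)     # ...and syncs one delta per round

print("recovered weights:", w_global)       # close to [2.0, -3.0]
```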
Capability index (quality) vs. training price
Given the computational efficiency of mixture-of-experts architectures and of model composition and routing, we are likely headed not for a world of three to five giant models, but for a world of millions of models with distinct cost/performance trade-offs: an interwoven network of intelligence, a hive mind.
This creates an enormous coordination problem, one that blockchains and crypto incentive mechanisms should be well suited to help solve.
Core DeAI investment areas
Software is eating the world. AI is eating software. And AI is basically data and computation.
Delphi is bullish on components across this stack:
Simplified AI x Crypto Stack
Infrastructure
Given that the power of AI comes from data and compute, DeAI infrastructure seeks to source data and compute as efficiently as possible, usually via crypto incentive mechanisms. As noted earlier, this is the most difficult layer to compete in, but given the size of the end market it may also be the most rewarding.
Compute
To date, distributed training protocols and GPU marketplaces have been constrained by latency, but they aim to coordinate latent heterogeneous hardware into cost-effective, on-demand compute for those priced out of the giants' integrated offerings. Projects such as Gensyn, Prime Intellect, and Neuromesh are pushing distributed training forward, while io.net, Akash, Aethir, and others enable low-cost inference closer to the edge.
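As a rough illustration of the scheduling problem such marketplaces solve (a toy sketch, not any project's actual API; all names and numbers are hypothetical), a router must send each job to the cheapest offer that satisfies its hardware and latency constraints:

```python
from dataclasses import dataclass

# Toy sketch of the scheduling problem a decentralized GPU marketplace
# must solve (hypothetical, not any project's actual API): route each
# job to the cheapest offer meeting its hardware and latency constraints.

@dataclass
class Offer:
    provider: str
    vram_gb: int
    latency_ms: float
    price_per_hour: float

@dataclass
class Job:
    name: str
    min_vram_gb: int
    max_latency_ms: float

def match(job: Job, offers: list[Offer]) -> Offer | None:
    eligible = [o for o in offers
                if o.vram_gb >= job.min_vram_gb
                and o.latency_ms <= job.max_latency_ms]
    return min(eligible, key=lambda o: o.price_per_hour, default=None)

offers = [
    Offer("datacenter-a100", 80, 20.0, 2.10),
    Offer("edge-4090",       24, 60.0, 0.45),
    Offer("edge-3090",       24, 90.0, 0.30),
]
job = Job("llama-8b-inference", min_vram_gb=24, max_latency_ms=80.0)
print(match(job, offers))  # cheapest eligible offer: edge-4090
```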
Ecosystem positioning of projects based on aggregated supply
Data
In a world of ubiquitous intelligence built on smaller, more specialized models, data assets become ever more valuable and monetizable.
To date, DePIN has been praised mainly for building hardware networks at lower cost than capital-intensive incumbents (such as telecoms). But DePIN's largest potential market may lie in collecting novel datasets that flow into on-chain intelligent systems: the agent protocols (discussed below).
In this world, labor, the largest addressable market of all, is being replaced by data and compute. DeAI infrastructure offers non-technical people a way to own the means of production and contribute to the coming network economy.
Middleware
The ultimate goal of DeAI is effective composability. Like the money legos of DeFi, DeAI makes up for its deficit in absolute performance with permissionless composability: an open, incentivized ecosystem of software and compute primitives that compounds over time and may eventually overtake the incumbents.
If Google represents the “integrated” extreme, DeAI represents the “modular” extreme. As Clayton Christensen reminds us, integrated approaches tend to lead in emerging industries by reducing friction across the value chain, but as a field matures, modularized value chains gain ground through greater competition and cost efficiency at every layer of the stack:
Integrated vs Modular AI
We are very optimistic about several categories that are crucial to realizing this modular vision:
Routing
In a world of fragmented intelligence, how do you pick the right model at the right time for the best price? Demand-side aggregators have always captured value (see Aggregation Theory), and routing is what optimizes the Pareto frontier between performance and cost in a networked world of intelligence.
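A toy sketch of such a routing decision, with entirely made-up model names, costs, and quality scores: prune dominated options down to the Pareto frontier, then pick the cheapest model that clears the caller's quality bar.

```python
# Illustrative demand-side routing over a hypothetical model catalog:
# keep only the Pareto frontier of (cost, quality), then choose the
# cheapest model that satisfies the required quality threshold.

models = {
    "frontier-large": {"cost": 15.00, "quality": 0.95},
    "mid-moe":        {"cost":  1.50, "quality": 0.88},
    "small-local":    {"cost":  0.05, "quality": 0.70},
    "overpriced":     {"cost":  5.00, "quality": 0.80},  # dominated
}

def pareto_frontier(models):
    # a model survives if no other model is at least as good on one
    # axis and strictly better on the other
    return {
        name: m for name, m in models.items()
        if not any((o["cost"] <= m["cost"] and o["quality"] > m["quality"])
                   or (o["cost"] < m["cost"] and o["quality"] >= m["quality"])
                   for o in models.values())
    }

def route(models, min_quality):
    frontier = pareto_frontier(models)
    ok = {n: m for n, m in frontier.items() if m["quality"] >= min_quality}
    return min(ok, key=lambda n: ok[n]["cost"]) if ok else None

print(route(models, min_quality=0.85))  # mid-moe: cheapest adequate model
```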
Bittensor has been at the forefront here as a first-generation product, but a wave of specialized competitors is emerging.
Allora runs competitions between models across different “topics”, in a way that is context-aware and self-improving over time, informing future predictions based on each model's historical accuracy under specific conditions.
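A minimal sketch of that mechanism under invented regimes and error histories: weight each forecaster by its historical accuracy in similar conditions, so the combined prediction leans toward whichever model has earned trust in the current regime.

```python
# Toy sketch of context-aware model weighting in the spirit of what
# Allora describes (regimes, models, and error numbers are invented):
# forecasters are combined by inverse historical error *within the
# current regime*, so each model dominates where it has been accurate.

history = {  # regime -> model -> past mean absolute error
    "high-volatility": {"model_a": 0.9, "model_b": 0.3},
    "low-volatility":  {"model_a": 0.2, "model_b": 0.6},
}

def combine(predictions: dict[str, float], regime: str) -> float:
    errors = history[regime]
    weights = {m: 1.0 / errors[m] for m in predictions}
    total = sum(weights.values())
    return sum(predictions[m] * w for m, w in weights.items()) / total

preds = {"model_a": 105.0, "model_b": 95.0}
print(combine(preds, "high-volatility"))  # pulled toward model_b (~97.5)
print(combine(preds, "low-volatility"))   # pulled toward model_a (~102.5)
```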
Morpheus aims to become the “demand-side router” for Web3 use cases: essentially an open-source local agent that holds the user's relevant context and can route queries efficiently through DeFi or the emerging “composable compute” infrastructure of Web3, a sort of open-source “Apple Intelligence”.
Agent interoperability protocols such as Theoriq and Autonolas aim to push modular routing to its limit, turning composable, compounding ecosystems of flexible agents and components into fully-fledged on-chain services.
In short, in a world of rapidly fragmenting intelligence, aggregators of supply and demand will be extraordinarily powerful. If Google, which indexes the world's information, is a $2 trillion company, then the winner in demand-side routing, whether Apple, Google, or a Web3 solution, the agent that indexes the world's intelligence, should operate at even greater scale.
Coprocessor
Because of their decentralization, blockchains are severely constrained in both data and computation. How do you bring the compute- and data-hungry AI applications users want on-chain? Through coprocessors!
Coprocessor in the Application Layer of Crypto
These all offer different techniques to “verify” the underlying data or model being used as an “oracle”, minimizing new trust assumptions on-chain while greatly expanding what chains can do. So far, projects have employed zkML, opML, TeeML, and cryptoeconomic approaches, each with its own advantages and disadvantages:
Coprocessor comparison
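To make one row of that comparison concrete, here is a minimal sketch of the optimistic (opML-style) commit-and-challenge shape, with stand-in function names and data: a prover posts a commitment to (model, input, output), and anyone can recompute and expose a forged claim during the dispute window. Real systems add bisection games, bonds, and slashing.

```python
import hashlib

# Minimal sketch of the optimistic (opML-style) trust model: a prover
# posts a commitment to (model, input, output) on-chain; anyone can
# recompute the inference off-chain and challenge a forged claim during
# the dispute window. Function names and data here are stand-ins.

def commitment(model_id: str, inp: str, out: str) -> str:
    return hashlib.sha256(f"{model_id}|{inp}|{out}".encode()).hexdigest()

def run_model(model_id: str, inp: str) -> str:
    return f"label({inp})"   # stand-in for deterministic off-chain inference

honest_out = run_model("fraud-detector-v1", "tx-graph-42")
forged_out = "label(tx-graph-43)"   # a dishonest prover's claim

honest_commit = commitment("fraud-detector-v1", "tx-graph-42", honest_out)
forged_commit = commitment("fraud-detector-v1", "tx-graph-42", forged_out)

# a challenger recomputes during the dispute window and compares
recheck = commitment("fraud-detector-v1", "tx-graph-42",
                     run_model("fraud-detector-v1", "tx-graph-42"))
print("honest claim stands:", recheck == honest_commit)   # True
print("forged claim caught:", recheck != forged_commit)   # True
```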
At a higher level, coprocessors are essential to making smart contracts actually smart: they provide “data warehouse”-like solutions for querying toward more personalized on-chain experiences, and they verify that a given inference was performed correctly.
TEE (Trusted Execution Environment) networks such as Super, Phala, and Marlin have grown in popularity recently thanks to their practicality and their readiness to host applications at scale.
Overall, coprocessors are crucial for bridging blockchains, which are highly deterministic but low-performance, with intelligent agents, which are high-performance but probabilistic. Without coprocessors, AI would be absent from this generation of blockchains.
Developer Incentives
One of the biggest problems in open-source AI development is the lack of incentive mechanisms to make it sustainable. AI development is extremely capital-intensive, and the opportunity cost of both compute and AI knowledge work is very high. Without proper incentives to reward open-source contributions, the field will inevitably lose out to the hyper-capitalized superclusters of big tech.
From Sentient to Pluralis, Sahara AI, and Mira, these projects aim to launch networks in which individuals can contribute to a network's intelligence and be properly compensated for it.
With compensation built into the business model, open source should compound faster, giving developers and AI researchers worldwide an alternative to the big tech companies and the prospect of generous rewards based on the value they create.
This is very hard to pull off, and competition is growing ever fiercer, but the potential market is enormous.
GNN Model
Large language models parse patterns in huge text corpora and learn to predict the next word; graph neural networks (GNNs) process, analyze, and learn from graph-structured data. Since on-chain data consists mostly of complex interactions among users and smart contracts, in other words, a graph, GNNs are a natural fit for on-chain AI use cases.
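For intuition, here is a minimal numpy sketch of the message-passing step at the heart of a GNN, over a toy “on-chain” graph where nodes are wallets or contracts and edges are interactions (the graph, features, and weights are invented):

```python
import numpy as np

# Minimal GCN-style message passing on a toy on-chain graph: each layer
# mixes every node's features with its neighbors', so embeddings come to
# reflect a node's multi-hop neighborhood. Data here is illustrative.

# adjacency: wallet 0 <-> wallet 1 <-> contract 2 <-> contract 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],   # e.g. normalized [tx_count, volume] per node
              [0.5, 0.2],
              [0.1, 0.9],
              [0.0, 1.0]])

def gnn_layer(A, H, W):
    A_hat = A + np.eye(len(A))                 # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))   # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0)  # ReLU

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(2, 4)), rng.normal(size=(4, 2))
H = gnn_layer(A, gnn_layer(A, X, W1), W2)      # two rounds of mixing
print(H)   # node embeddings now reflect 2-hop neighborhoods
```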
Projects such as Pond and RPS are trying to build foundation models for Web3, with potential applications in trading, DeFi, and even social use cases, for example:
AI finance: integration with existing DeFi applications, more advanced yield strategies and liquidity utilization, better risk management and governance
On-chain marketing: more targeted airdrops and positioning, recommendation engines driven by on-chain behavior
These models will lean heavily on data warehouse solutions such as Space and Time, Subsquid, Covalent, and Hyperline, on which I am also very bullish.
GNNs may prove to be the “large models” of blockchain data, complementing Web3 data warehouses to bring OLAP (Online Analytical Processing) capabilities to Web3.
Application
In my view, on-chain agents may be the key to solving crypto's notorious user experience problem. But more important: over the past decade we have poured billions of dollars into Web3 infrastructure, yet demand-side utilization of it remains remarkably low.
Don’t worry, Agents are here…
AI test scores rising across various dimensions of human performance
It also seems logical that these agents would leverage open, permissionless infrastructure, spanning payments and composable compute, to pursue ever more complex end goals. In the coming networked intelligence economy, economic flows may no longer run B2B2C but user -> agent -> compute network -> agent -> user. The endpoint of that flow is the agent protocol: application- or service-based businesses with minimal overhead that run primarily on on-chain resources and can serve end users (or one another) in composable networks at far lower cost than traditional firms. Just as Web2's application layer captured most of the value, I am a believer in a “fat agent protocols” thesis for DeAI: value capture should shift toward the upper layers of the stack over time.
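A schematic sketch of that flow, with hypothetical names, prices, and margins: the agent decomposes a request, buys inference from a compute network, and keeps only a thin spread.

```python
# Schematic sketch of the user -> agent -> compute network -> agent -> user
# flow described above: an agent decomposes a request, buys inference from
# a compute market, settles payment, and keeps only a thin margin. All
# names, prices, and steps are hypothetical.

COMPUTE_PRICE = 0.02   # price per subtask, paid to the compute network
AGENT_MARGIN = 0.10    # thin take rate the agent adds on top of raw cost

def compute_network(subtask: str) -> str:
    return f"result({subtask})"          # stand-in for on-demand inference

def agent(request: str) -> tuple[str, float, float]:
    subtasks = [f"{request}/step{i}" for i in range(3)]
    results = [compute_network(s) for s in subtasks]
    compute_cost = COMPUTE_PRICE * len(subtasks)
    user_price = compute_cost * (1 + AGENT_MARGIN)
    return " + ".join(results), user_price, compute_cost

answer, user_price, compute_cost = agent("rebalance-portfolio")
print(answer)
print(f"user pays {user_price:.3f}: {compute_cost:.3f} to compute, "
      f"{user_price - compute_cost:.3f} to the agent")
```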
Value accumulation in Generative AI
The next Google, Facebook, or BlackRock may well be agent protocols, and the components needed to build them are emerging now.
DeAI endgame
AI will change our economies. Today the market expects that value to accrue to a handful of large companies on the West Coast of North America. DeAI represents a different vision: an open, composable network of intelligence that rewards and compensates even the smallest contributions, with more collective ownership and governance.
While some claims around DeAI are overblown, and many projects trade well above what their current traction justifies, the scale of the opportunity is considerable. For the patient and far-sighted, DeAI's endgame of truly composable intelligence may prove the justification for blockchains themselves.