On March 9, the Mina team spoke with Steve and Phil Kelly of o1 Labs, the incubator behind Mina, starting from what they saw and heard at ETHDenver and the trends the event revealed. The conversation covered cutting-edge industry news from several angles: the convergence of AI and the crypto industry, use cases for ZK's verifiability, how recursive proofs work, the feasibility of replacing the VM, and future trends.

The following transcript was compiled by Odaily Planet Daily.
**Mina: Please introduce yourselves, tell us about your favorite side events at this year's ETHDenver, and share which project you think performed best.**
Steve: Hi everyone, I'm Steve, the product lead at o1 Labs. I entered the Web3 industry very early and specialize in the ZK field. Before joining o1 Labs, I worked at Cloudflare for five years, and that experience gave me some insight into ecosystem building.
My personal favorite side event is Mina's. Mina's events always draw a wide variety of people, from hardcore ZK cryptographers to privacy-minded and libertarian blockchain enthusiasts.
Apart from Mina's, I especially liked zkSync's events. Their book “Explain ZK to Me in the Language of a Five-Year-Old” uses storytelling to give a lively, entertaining explanation of the protocol engine. That style of giveaway is right up my alley, and I'm excited to share it with you all.
Phil: Hello everyone, I am Phil Kelly, business development at o1 Labs. The scope of my abilities is in financial services and computer technology. I have worked at companies like ConsenSys and dabbled in Ethereum. Moving from Ethereum to the ZK world is a very steep learning curve, which makes me both excited and fulfilled.
Phil: Personally, I like to explore and understand each event in depth, so every event becomes a valuable learning opportunity as I talk with people. Apart from Mina's event, I was drawn to Warm KX Mag, who invited me to their event for a deep dive into ZK and new proof systems. I'm excited to share that with you today.
**Mina: The combination of artificial intelligence (AI) and cryptocurrency (crypto) is attracting real attention in today's technology industry, touching two very hot areas. Combining them can create many new applications and solutions: for example, using AI to optimize cryptocurrency trading strategies and improve returns, or using blockchain technology to strengthen the security and transparency of AI models and protect data privacy. In your view, what are the biggest advantages of combining AI and cryptocurrency? What new applications and solutions can they bring?**
Steve: Combining artificial intelligence and cryptocurrency is indeed a technology trend that has attracted a lot of attention in the technology circle. This combination provides us with a completely new way to think about and implement technological innovation, while also attracting the attention of many investors and enthusiasts.
The characteristics of cryptocurrency allow people to use digital tokens to represent various assets or entities and conduct speculative transactions, which to a certain extent increases market activity and investor interest. Moreover, one of the characteristics of the cryptocurrency market is its borderless nature, which means that no matter which country you are in, as long as you have an Internet connection, you can participate, which further expands the size and appeal of the market.
Of course, beyond speculation and trading, the combination of artificial intelligence and cryptocurrency has many practical applications: for example, using smart contracts to execute automated trading strategies, or using blockchain technology to ensure data security and transparency. These applications can bring revolutionary changes to finance and have a significant impact in other industries as well.
Therefore, the combination of artificial intelligence and cryptocurrency is not only a hot technology trend, but also an area full of potential that deserves our in-depth discussion and research.
Phil: People often talk about artificial intelligence and cryptocurrency in a vague way because it is a very broad topic that touches on many different aspects. Even if we are able to pin down the topic, there are still many questions that need to be answered, such as how quickly we can implement cryptographic machine learning (CKML) that can solve the challenges posed by emerging large-scale natural language models. This is definitely a feasibility challenge. On the other hand, I think what some people are talking about might be ZK Machine Learning (ZKML).
When we dig into the details of what they're talking about, it's actually verifying the output of running a machine learning model. The model itself can be quite complex, but the piece being proven might just be a linear equation, and with current technology you can achieve that in a zero-knowledge proof system. I think that's worth paying attention to. I want to reiterate that privacy has always been a focus of our discussions around ZK and verifiable computing.
Steve: Building on Phil's point, I'd add that my attitude toward ZK machine learning (ZKML) has become more optimistic. At first I shared Phil's view that using ZK would add a lot of overhead; after all, training models and running inference are not cheap. So why combine the two? Because ZK performance keeps improving by an order of magnitude and overhead keeps falling. As ZK becomes more efficient, the amount of computation we can prove will grow accordingly. That trend seems inevitable.
As a product person, I like to illustrate with product examples. Consider image provenance: the raw footage from a camera needs to be signed, so we need device manufacturers on board, each holding signing keys. Then, when that signed content is blurred, sharpened, cropped, or otherwise edited, those operations can be performed inside a zero-knowledge proof. A browser able to follow a chain like Mina could then say: “Here's this image. It has a certificate proving it came from a real camera, only these operations were applied, and we know roughly what the original looks like.”
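The pipeline Steve sketches can be illustrated with a toy provenance chain. Everything below is a hypothetical Python sketch: the names (`DEVICE_KEY`, `apply_edit`) are made up, the HMAC stands in for a device signature, and a real system would replace the verifier's recomputation with a zero-knowledge proof so the raw image never has to be revealed.

```python
import hashlib
import hmac

# Hypothetical device key; in practice this would live in the camera's
# secure hardware, provisioned by the manufacturer.
DEVICE_KEY = b"camera-secret-key"

def sign_raw_image(raw: bytes) -> bytes:
    """The camera signs the raw capture at the source."""
    return hmac.new(DEVICE_KEY, raw, hashlib.sha256).digest()

def apply_edit(image: bytes, op: str) -> bytes:
    """Stand-in for blur/sharpen/crop: just derives new bytes from old."""
    return hashlib.sha256(op.encode() + image).digest()

def edit_with_provenance(raw: bytes, ops: list):
    """Apply edits while keeping the original signature and the op list.
    A ZK proof would attest to this chain without revealing `raw`."""
    signature = sign_raw_image(raw)
    image = raw
    for op in ops:
        image = apply_edit(image, op)
    return image, signature, ops

final_image, signature, ops = edit_with_provenance(b"raw-pixels", ["crop", "blur"])
# Naive verification: recompute from the signed original. This check is
# what the zero-knowledge proof replaces, so the verifier never sees raw.
assert hmac.compare_digest(signature, sign_raw_image(b"raw-pixels"))
```

The point of the sketch is only the shape of the data: a source signature plus a declared list of operations, with the proof binding the two together.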
**Mina: In the crypto world, when talking about ZK, the first thing that most people think of is scalability, and the second thing is probably privacy, but I think verifiability has been ignored to some extent, what do you think?**
Steve: When I talk about ZK, I always emphasize scalability and privacy. When I test other people's views, I find they often default to privacy, because it's the easiest to explain. But when you actually look at the market, investment in scalability far outweighs privacy, with all the ZK rollups and Mina. You make a good point, though: verifiability is distinct from those two. They're related, in that scaling only works because you generate a proof and verify it, but verifiability is the primary use case. As we just discussed, if I'm looking at an image, I care less about scalability or privacy than about whether it's verifiable, and whether I can trust that verification.
**Mina: In the blockchain world, verifiability is often equated with full disclosure. What we provide is verifiability even without fully disclosing the underlying data, which touches on privacy. I think verifiability is very important when we face an Internet full of misinformation and dubious sources. You can think of ZK as programmable truth, a topic we've been discussing and exploring. I see a lot of synergy between ZK and AI; while everyone is talking about cryptocurrency and AI, I feel that without ZK, the convergence of the two cannot fully avoid a dystopia. Let's move on to the next topic. I see a lot of discussion around proof aggregation. First, can one of you give a high-level overview: for those who may not know much, what is proof aggregation?**
**Steve:** The idea behind proof aggregation is that verifying a single proof takes a certain constant amount of time. Repeat that 1,000 times and you pay 1,000 times that constant, so the cost keeps growing, and if there's a gas charge or any other fee attached, you pay it 1,000 times as well. With proof aggregation, you pass the first proof as input to the second, the second as input to the third, and so on, folding those 1,000 proofs into a single proof that only needs to be verified once. Thanks to the magic of cryptography, you still get the same cryptographic guarantee: if the one proof passes, then each of those 1,000 proofs would independently pass. It's a way to further increase scale and further reduce costs, and as we'll discuss, it's one of the core technologies driving the Mina protocol.
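As a rough illustration of the cost model Steve describes, here is a toy Python sketch. The hash chain stands in for a real recursive SNARK, in which each new proof's circuit verifies the previous proof; the cost units are arbitrary.

```python
import hashlib

VERIFY_COST = 1  # arbitrary unit: the constant cost of verifying one proof

def prove(statement: bytes, prev_proof: bytes = b"") -> bytes:
    """Toy 'proof': in a real recursive SNARK, the new proof's circuit
    verifies prev_proof, so validity chains through every statement."""
    return hashlib.sha256(prev_proof + statement).digest()

statements = [f"tx-{i}".encode() for i in range(1000)]

# Naive: verify each of the 1,000 proofs independently -> linear cost.
naive_cost = len(statements) * VERIFY_COST

# Aggregated: fold each proof into the next, then verify only the last one.
proof = b""
for s in statements:
    proof = prove(s, proof)
aggregated_cost = 1 * VERIFY_COST

assert aggregated_cost < naive_cost  # constant vs. linear verification
```

The trade-off Phil raises next shows up here too: the folding loop is extra work for the prover, paid once, in exchange for the verifier's cost dropping from 1,000 checks to one.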
Phil: There are trade-offs in doing more proofs: you spend more computation generating proofs and merging them with additional proofs, which raises proving cost but lowers verification cost; or you can simply send more proofs to the verification environment and pay more in verification fees. It's a good example of the flexibility ZK offers. The sudden market discussion about aggregation means many people are starting to consider ZK's practicality. Aggregation used to be just an idea, but now people are actively building it, shipping it, and thinking through the work and cost of verifying it. I think that's a really good sign.
**Mina: At ETHDenver, everyone was discussing how to replace the VM. Steve, could you first introduce the basic concept of a VM, then explain the advantages of replacing it and the obstacles involved?**
**Steve:** Speaking of VMs, let's start with the CPU. Over time, people realized it isn't just about having more chips and more CPUs: you can run a virtual CPU on top of the physical CPU, and that virtual CPU could be called the first VM. In a VM you can reuse existing instruction sets or define new instructions, then execute them on any hardware. In our industry, the most famous version is the EVM, which became the core of Ethereum and achieved great success.
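To make the "define instructions, then execute them on any hardware" idea concrete, here is a minimal stack-machine interpreter in Python with a made-up four-instruction set; it is not the EVM, just a sketch of what any VM fundamentally does.

```python
# A made-up four-instruction stack VM: programs are data, so the same
# program runs unchanged on any host that has this interpreter.
def run(program):
    stack = []
    for instr in program:
        op = instr[0]
        if op == "PUSH":
            stack.append(instr[1])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "STOP":
            break
        else:
            raise ValueError(f"unknown instruction: {op}")
    return stack[-1]

# (2 + 3) * 4
program = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",), ("STOP",)]
assert run(program) == 20
```

The ZK-rollup idea Steve describes next replaces "every node re-runs `run(program)`" with "one prover runs it and proves the result", which is why the VM's instruction set matters so much for proof cost.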
Replacing VMs was a hot topic at the event, and I spoke on one of the panels. Should we continue to invest in and improve older VMs, making them faster while keeping compatibility? It's becoming clear that there is a better way to reach consensus than having every node re-execute the same instructions in the VM: generate a proof that the computation occurred, so all a verifier needs to do is check that proof. This is exactly the ZK rollup approach.
**Phil:** To me, these industry discussions feel like the early stages of a technology, when people don't yet realize how promising the field will become or how many areas of specialization will emerge.
I think alternative VMs beyond Ethereum's all have a huge role to play: they can perform specific tasks and occupy an important place in the Web3 architecture. I believe some of these tasks are best accomplished outside Ethereum's virtual machine, possibly coexisting with other specialized virtual machines.
**Mina: At ETHDenver, everyone mentioned the important topic of EIP-4844. Steve, can you briefly describe what it is and why it's a big deal?**
Steve: Let me explain with concrete numbers. As the Ethereum ecosystem became more popular, fees for transfers or smart contract execution started being measured in dollars; in 2022 they rose to $50 or even more than $100, taking contract deployment as an example. That is obviously unsustainable.
The ecosystem responded with layer-2 solutions, which grew, attracted a lot of activity, and were cheaper: fees range from about 15 cents to around $1 for transfers, and from 50 cents up to $3 for a basic token swap.
Layer-2 fees have also been pushed up by activity, though. Ethereum is making changes: EIP-4844 specifically recognizes that transactions from layer 2, which are just bundles of data representing transactions, can be processed differently and consume less gas. So the layer-2 fees I just quoted, roughly 15 cents at the minimum to $1 at the maximum, will drop significantly. I don't know the exact numbers, but on layer 2 everything becomes cheaper, which is good for the market. That said, there's a highway analogy worth noting: when a congested highway gets a few more lanes, does traffic decrease? Of course not; more vehicles take the road and the problem remains.
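Steve's point about layer-2 data being "processed differently" can be put in back-of-envelope numbers. In the sketch below, the 16-gas-per-nonzero-byte calldata rate and the one-blob-gas-per-byte blob rate come from EIP-2028 and EIP-4844 respectively; the gas prices are illustrative assumptions, not live figures.

```python
# Rough cost model for posting 100 KB of rollup data to Ethereum.
# Real rates: 16 gas per nonzero calldata byte (EIP-2028) and
# 1 blob gas per byte (EIP-4844). The gas *prices* are assumptions.
DATA_BYTES = 100_000
CALLDATA_GAS_PER_BYTE = 16        # EIP-2028 rate for nonzero bytes
EXEC_GAS_PRICE_GWEI = 30          # assumed execution-gas price
BLOB_GAS_PER_BYTE = 1             # EIP-4844 blob gas accounting
BLOB_GAS_PRICE_GWEI = 1           # assumed; blobs have a separate fee market

calldata_cost_gwei = DATA_BYTES * CALLDATA_GAS_PER_BYTE * EXEC_GAS_PRICE_GWEI
blob_cost_gwei = DATA_BYTES * BLOB_GAS_PER_BYTE * BLOB_GAS_PRICE_GWEI

# Under these assumed prices, blob data comes out 480x cheaper.
assert calldata_cost_gwei / blob_cost_gwei == 480.0
```

The exact ratio depends entirely on the two fee markets at any given moment; the takeaway is only that blob data is priced on a separate, much cheaper scale than calldata.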
This is what happens when the model is a network of nodes reaching consensus: Ethereum's roughly 1.5 million nodes all re-execute the same transactions to agree. Things will improve over time, and further improvements are under way, but ultimately the way to achieve true scalability is zero-knowledge proofs. We think of it as client-side proving: you generate the proof on the client, and the verification cost is low and constant-time. That benefits users and reduces costs.
**Mina: Beyond the hot topics above, did you make any new discoveries during the event that you can share with everyone?**
**Phil:** DePIN is one of the trends I see in the future. For those who don't know, DePIN stands for decentralized physical infrastructure. Helium is the best example from the last big cycle, and I think it was one of the most successful projects before the bull market hit, but more similar projects are brewing. For example, some people may know DIMO, which lets devices in your car interact with the chain. You'll find many projects involving custom physical devices emerging, and these networks are being deployed. There is also WeatherXM, which lets citizens collect weather data, achieving decentralized weather-data collection. There are also alternatives that can extend the capabilities of traditional telecom service providers.
So DePIN is definitely an emerging field. And what I like about it is that when you’re processing a lot of data at the edge of the network, you want to be able to do calculations on the data and then send it back instead of transmitting all the data. In fact, at a DePIN event, someone was demonstrating a thermometer at the edge of the network. Obviously, it’s constantly measuring the temperature, but instead of sending a reading every time, it could be programmed to monitor the temperature range and then periodically send a message like “The temperature here has been in the range of 10° to 20°.”
Here, you get a succinct expression summarizing a batch of data points instead of transmitting a huge amount of raw data. It's a very good example, because when precious cargo is shipped around the world, temperature is often a critical factor. Today there are more traditional ways of capturing this data, such as devices that record whenever readings go out of range, to ensure a drug was delivered correctly. But a sensor that succinctly reports the temperature range during or at the end of the trip would be very useful.
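Phil's thermometer idea, emitting one range summary per window instead of every reading, can be sketched in a few lines of Python; the function name and message format are invented for illustration.

```python
# Toy edge summarizer: keep only min/max per window and emit one
# message per window instead of streaming every reading.
def summarize(readings, window):
    messages = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        messages.append(
            f"range {min(chunk)}-{max(chunk)} C over {len(chunk)} readings"
        )
    return messages

readings = [12.0, 14.5, 11.0, 19.5, 13.0, 18.0]
messages = summarize(readings, 3)
assert messages == [
    "range 11.0-14.5 C over 3 readings",
    "range 13.0-19.5 C over 3 readings",
]
```

The ZK angle is that the device could also attach a proof that the summary really covers every reading in the window, so the receiver can trust the range without seeing the raw data.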