Markets running hot: the Shanghai Composite squeezes shorts and surges toward 4,000 points; Zhipu's GLM-5.1 punches above its class, while DeepSeek overhauls its mode design.
The three major A-share indices showed strong momentum today, with the Shanghai Composite gaining more than 100 points and closing in on 4,000. At the close, the Shanghai Composite was up 2.70% at 3,995.00 points, the Shenzhen Component Index was up 4.79% at 14,042.50 points, and the ChiNext Index was up 5.91% at 3,347.61 points. Combined turnover across the Shanghai, Shenzhen, and Beijing markets reached 2.45 trillion yuan, up sharply by 827.2 billion yuan from the previous session. Nearly 5,200 stocks advanced, with more than 130 hitting the daily limit. The AI industry chain surged across the board.
According to an April 8 report by the Science and Technology Innovation Board Daily, Zhipu officially released its new-generation open-source model, GLM-5.1. The company describes it as the only open-source model able to sustain operation for 8 hours. On SWE-bench Pro, a benchmark that closely mirrors real software development, GLM-5.1 marked the first time a domestic model surpassed Opus 4.6. OpenRouter data show that, alongside the release, the price of Zhipu's GLM has been raised by another 10%. After the adjustment, GLM-5.1's cache-hit token price in coding scenarios approaches that of Anthropic's Claude Sonnet 4.6, the first time a domestic large model has aligned its pricing with leading overseas vendors in a core scenario.
According to an April 8 report by Yicai, DeepSeek has rolled out a significant update. In the latest version, "Quick Mode" and "Expert Mode" now appear above the DeepSeek input box, the first time since DeepSeek's rise to popularity that the product has introduced a tiered mode design at the product level. Quick Mode is suited to everyday conversation, responds instantly, and supports text recognition from images and files. Expert Mode excels at complex problems and supports deep thinking and intelligent search, though it does not yet support file uploads or multimodal input. DeepSeek also notes that during peak periods this mode may require users to wait.
Other reports indicate that staged (gray-release) testing of DeepSeek V4.0 has begun. Unlike prior silent upgrades, the user interface now has a clear mode-switching entry point: three options, Quick Mode (default), Expert Mode, and Vision Mode, are displayed side by side at the top of the chat interface. This is the most significant change to DeepSeek's product form since staged testing of million-token-level context began on February 11.
CITIC Securities previously said in a note that DeepSeek’s next-generation model (such as V4.0) is expected to integrate the Engram module into the mature DSA+MoE architecture. By storing key information in layers, the computational load of attention layers can be reduced exponentially, thereby enabling ultra-long-context processing and improving efficiency. It is expected to further refine coding and Agent capabilities, and address multimodal shortcomings to continue the high cost-performance route. DeepSeek-V3.2 previously reduced the pricing of input and output tokens by 60% and 75%, respectively.
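To put the reported V3.2 price cuts in concrete terms, here is a minimal sketch of the arithmetic. The 60% (input) and 75% (output) reductions come from the report above; the base prices used below are hypothetical placeholders, not DeepSeek's actual list prices.

```python
def discounted_price(base: float, cut: float) -> float:
    """Return the per-token price after applying a fractional discount."""
    return base * (1 - cut)

# Hypothetical base prices in $/1M tokens, for illustration only.
base_input, base_output = 1.0, 2.0

new_input = discounted_price(base_input, 0.60)    # 60% cut on input tokens
new_output = discounted_price(base_output, 0.75)  # 75% cut on output tokens

print(f"input: {new_input:.2f}, output: {new_output:.2f}")
```

Under these placeholder prices, input tokens fall to 0.40 and output tokens to 0.50, illustrating why output-heavy workloads benefit most from the steeper output-side cut.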
Market-related sector overview
Computing power infrastructure sector: Computing power is the "water and electricity" of AI development. Against the backdrop of exponential growth in model parameter counts, demand for computing power is surging. This sector mainly covers sub-segments such as AI servers, optical modules, switches, and liquid-cooling equipment. With intelligent computing center construction accelerating nationwide and leading internet giants continuing to expand capital expenditure, order fulfillment for upstream computing hardware has improved markedly. In the optical module segment in particular, which underpins high-speed data transmission, domestic companies hold a dominant share of the global market, and the volume-and-price growth logic is clear as the technology iterates toward 800G and 1.6T. Meanwhile, the heat-dissipation pain points of high-density computing are pushing liquid cooling from "optional" to "required," opening new incremental market space for thermal management companies. In terms of industry structure, before a major application boom arrives, computing infrastructure is typically the segment with the most certain earnings realization.
AI large models and software application sector: As domestic foundational large-model capabilities mature and API calling costs decline, the "battle of a hundred models" is gradually giving way to "application deployment." This sector covers general-purpose software, office software, financial IT, and education and healthcare informatization. The industry logic is that AI is not merely a replacement for traditional software: it reconstructs human-computer interaction, significantly raising the usage frequency of software products and ARPU (average revenue per user). Many A-share software leaders have already embedded AI large models deep in their core product lines and launched innovative features such as AI assistants and intelligent generation. From a business-model perspective, this helps SaaS companies improve customer stickiness and break through the growth bottlenecks of earlier digital transformation. Although AI's revenue contribution at some application-side companies is still early-stage, market expectations of rising penetration have produced relatively high valuation premiums.
AI terminals and edge computing sector: As large models trend toward lightweight versions, AI's move from the cloud to the device side is inevitable. This sector focuses on hardware terminals such as AI PCs, AI smartphones, smart wearables, and edge computing chips. Industry institutions forecast that 2024 to 2025 will be the key window for AI terminal penetration to rise rapidly. For the consumer electronics industry chain, the addition of AI is expected to break the sluggish upgrade cycles that smartphones and personal computers have seen in recent years, sparking a new wave of "AI upgrade" replacement demand. Domestic industry chains have strong manufacturing advantages and supporting capabilities in acoustics, optics, structural components, and edge-side SoC chip design. The rise of edge computing not only relieves pressure on cloud-side computing power but also, thanks to low latency and data-privacy advantages, opens up large local application scenarios such as smart security and the smart home, giving related electronics manufacturers a twofold opportunity: business-performance recovery and valuation re-rating.
Risk warning: The industry information and company developments mentioned in this article are for general reference only and do not constitute investment advice. Corporate operations and market movements are subject to uncertainty; please be mindful of the related risks.