Search results for "GPT"
2026-04-02
02:29

Gate has delisted the trading markets of 29 tokens and issued buybacks to eligible users.

Gate News: According to Gate’s official announcement, Gate has delisted the trading markets of 29 tokens, including POWERLOOM, ZAP, NVIR, PBX, MNRY, HAT, EARNM, VICE, FST, LBLOCK, ELIX, ELIZA, GET, VIA, SGC, SKAI, 1, GPT, DPET, EQ, EMPIRE, ELDE, CNNS, MYTH, HARD, TROY, MICHI, OVPP, and ROOST. After re-evaluation, these tokens no longer meet the platform’s standards for trading. For users who submitted the form and meet the requirements, Gate has credited the buyback amount to their accounts. After delisting, users can still use the Gate wallet to store these assets; the specific timing and operational details for the related wallet functions will be announced separately.
10:00

Gate has delisted the trading markets for 29 tokens, including POWERLOOM and ZAP, and conducted buybacks for eligible users.

Gate News: According to Gate’s official announcement, Gate has delisted the trading markets for 29 tokens, including POWERLOOM, ZAP, NVIR, PBX, MNRY, HAT, EARNM, VICE, FST, LBLOCK, ELIX, ELIZA, GET, VIA, SGC, SKAI, 1, GPT, DPET, EQ, EMPIRE, ELDE, CNNS, MYTH, HARD, TROY, MICHI, OVPP, and ROOST. After assessment, these tokens do not meet the standards for trading. Gate has conducted a buyback for holders of these tokens; for users who meet the requirements, the buyback amount has been deposited into their Gate accounts. After delisting, users can still use Gate as a wallet to store these coins. The specific time for disabling wallet functionality and the operational details will be communicated in a separate official announcement.
05:07

A 20B small model catches up with GPT-5 and Opus in search: open-source vector database Chroma releases agentic search model Context-1

The open-source vector database Chroma has released Context-1, a 20-billion-parameter search model for multi-turn retrieval. The model uses a self-editing context technique and was trained across multiple tasks with reinforcement learning and a curriculum mechanism. It performs strongly in the web, finance, and legal domains, and also shows cross-domain search capability in the email domain.
04:22

Is the "GPT Moment" for Embodied AI Drawing Near? Axis Robotics Announces End of Testing, Set to Launch on Base Chain

Article source: Axis. Axis Robotics is rebuilding data diversity and scaled production methods for embodied AI through a simulation-first strategy. In 2025, multiple technological pathways in the robotics industry are rapidly converging: the commoditization of embodied-hardware supply chains has made it possible for expensive prototypes to reach scaled deployment for the first time; Vision-Language-Action (VLA) models give robots a "brain" for semantic understanding, reasoning, and planning; and a multi-layered data pyramid, from video priors to simulation synthesis, supplies steady fuel for the continuing evolution of embodied AI. However, the industry still faces one critical bottleneck: data. Compared to large language models and autonomous driving, embodied AI still faces a massive data
01:39

AI programming tool Cursor releases Composer 2 model, with performance surpassing Opus 4.6 and the price cut to 14% of the previous generation's

Cursor released Composer 2, its third-generation programming model, on March 20, with pricing cut sharply to $0.50 per million input tokens and $2.50 per million output tokens, alongside a faster variant. Composer 2 outperforms its predecessor on multiple benchmarks but still falls short of GPT-5.4; the gains come primarily from continued pre-training and reinforcement learning on the base model. The model is available exclusively within Cursor, whose parent company Anysphere is valued at $29.3 billion.
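To make the reported rates concrete, here is a minimal sketch of the per-request cost arithmetic, using only the $0.50-per-million-input-token and $2.50-per-million-output-token prices stated above (the token counts in the example are hypothetical):

```python
# Per-million-token prices reported for Composer 2 (USD).
INPUT_PRICE_PER_M = 0.50
OUTPUT_PRICE_PER_M = 2.50

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """USD cost of one request at the reported Composer 2 rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Hypothetical example: a 20k-token prompt producing a 2k-token completion.
print(round(request_cost(20_000, 2_000), 4))  # 0.015
```

At these rates, a fairly large request costs on the order of a cent or two, which is what a cut to 14% of the previous generation's price implies in practice.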
15:32

Rakuten Group Releases Japanese Large Language Model Rakuten AI 3.0, Sparking Controversy Over Suspected DeepSeek Foundation

Rakuten Group launched Rakuten AI 3.0, a Japanese-specialized large language model, on March 17, claiming it surpasses GPT-4o on multiple Japanese-language tests. However, users discovered that the model may be built on DeepSeek and questioned both its perceived pro-China stance and the authenticity of its claimed independently developed technology, sparking discussion.
08:53

Rakuten announces Rakuten AI 3.0 model; configuration files indicate its underlying architecture is DeepSeek V3

Rakuten Group announced the release of Rakuten AI 3.0 on March 17, a high-performance, 671-billion-parameter AI model optimized for Japanese, which the company claims outperforms GPT-4o on multiple benchmark tests. The model is open-sourced under the Apache 2.0 license, is fine-tuned from the DeepSeek V3 model, and received compute support from the Japanese government for training.