SoftBank Plans AI Data Center Battery Plant in Osaka, Targeting Production Within Five Years

Gate News, April 24 — SoftBank Corp., the mobile unit of Japan's SoftBank Group, plans to convert part of its factory in Sakai, Osaka, into a large battery production line for AI data centers. CEO Junichi Miyakawa is expected to announce the project in May as part of a new five-year plan, with production targeted to begin within five years. The facility will initially supply batteries to SoftBank's own data centers at the former Sharp LCD factory, which the company acquired for 100 billion yen (approximately $627 million) in 2025.

The battery plant could produce several gigawatt-hours of batteries, which would make it one of Japan's largest battery production lines. The broader Sakai campus spans 750,000 square meters and is designed to deliver more than 150 megawatts of power for AI computing, with the AI data center expected to become operational by the end of 2026.

SoftBank’s move reflects a wider industry trend of tech companies securing energy infrastructure to power AI expansion. SoftBank Group CEO Masayoshi Son has also discussed an Ohio project featuring a 9.2-gigawatt gas-fired power plant to support 10 gigawatts of data center capacity. Additionally, OpenAI and SoftBank Group invested $1 billion in SB Energy, a renewable energy developer, to support planned AI data center expansion. Global AI infrastructure is expanding rapidly, with approximately 22.8 gigawatts of data center IT capacity currently under construction.

