China Focus: China unveils brain-inspired AI for next-gen efficient computing

Source: Xinhua

Editor: huaxia

2025-09-08 18:30:45

BEIJING, Sept. 8 (Xinhua) -- Breaking from models like ChatGPT, a team of Chinese researchers has developed a novel AI system that mimics brain neurons, charting a new course for next-gen energy-efficient computing and hardware.

Scientists from the Institute of Automation under the Chinese Academy of Sciences introduced "SpikingBrain-1.0," a large-scale model trained and run for inference entirely on domestically developed GPU computing platforms.

Unlike mainstream generative AI systems that rely on the resource-intensive Transformer architecture -- where intelligence grows with ever-larger networks, computing budgets and datasets -- the novel model pursues a different path, allowing intelligence to emerge from spiking neurons.
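As a rough illustration of what a spiking neuron is, the following Python sketch implements a generic leaky integrate-and-fire neuron; this is a textbook model, not the specific neuron design used in SpikingBrain, and all parameter names here are illustrative. It shows how such a neuron integrates input over time and emits sparse binary spike events rather than dense floating-point activations:

```python
import numpy as np

def lif_spikes(inputs, tau=20.0, v_threshold=1.0, v_reset=0.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron over a sequence of input currents.

    The membrane potential leaks toward rest, integrates incoming current,
    and emits a binary spike (1) whenever it crosses the threshold, after
    which it is reset. The output is a sparse train of 0/1 events.
    (Generic illustration only; parameters and dynamics are assumptions.)
    """
    v = v_reset
    spikes = np.zeros(len(inputs), dtype=np.int8)
    for t, current in enumerate(inputs):
        # Leak toward rest, then integrate the input current.
        v += (-(v - v_reset) / tau) * dt + current * dt
        if v >= v_threshold:
            spikes[t] = 1      # event: the neuron fires
            v = v_reset        # reset after the spike
    return spikes

# A weak constant drive produces only occasional spikes, so downstream
# computation is triggered sparsely, event by event.
print(lif_spikes(np.full(50, 0.08)))
```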

This model enables highly efficient training on extremely low data volumes. Using only about 2 percent of the pre-training data required by mainstream large models, it achieves performance comparable to multiple open-source models on language understanding and reasoning challenges, according to the team.

By harnessing event-driven spiking neurons at the inference stage, one SpikingBrain variant is shown to deliver a 26.5-fold speed-up over Transformer architectures when generating the first token from a one-million-token context.
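The general idea behind event-driven efficiency can be sketched as follows: when inputs are binary spikes, a layer only needs to accumulate the weight columns of the neurons that actually fired, so the cost scales with the number of events rather than the full input width. The Python sketch below is a hypothetical illustration of that principle, not SpikingBrain's actual inference kernel:

```python
import numpy as np

def event_driven_layer(spike_indices, weights, bias):
    """Compute a layer's pre-activation using only the inputs that spiked.

    With binary spikes, a dense matrix-vector product reduces to summing
    the weight columns of the active inputs, so work is proportional to
    the number of spike events. (Illustrative sketch; names are assumed.)
    """
    out = bias.copy()
    for i in spike_indices:          # iterate only over active inputs
        out += weights[:, i]         # accumulate that input's weight column
    return out

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 1000))
b = np.zeros(4)

# Suppose only 3 of 1000 input neurons spiked at this timestep.
active = [12, 407, 960]
print(event_driven_layer(active, W, b))

# Dense equivalent for comparison: same result, but touches all 1000 columns.
x = np.zeros(1000)
x[active] = 1.0
print(W @ x + b)
```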

The model's ability to handle ultra-long sequences offers clear efficiency gains for tasks such as legal or medical document analysis, high-energy particle-physics experiments and DNA sequence modeling.

The research team has open-sourced the SpikingBrain model, launched a public test page, and released a bilingual technical report detailing the large-scale, industry-validated system.

"This large model opens up a non-Transformer technical path for the new generation of AI development," said Xu Bo, director of the Institute of Automation. "It might inspire the design of next-generation neuromorphic chips with lower power consumption."

In research reported last year in Nature Communications, scientists from the institute, working with Swiss counterparts, developed an energy-efficient sensing-computing neuromorphic chip that mimics the neurons and synapses of the human brain.

The chip, dubbed "Speck," boasts an impressively low resting power consumption of just 0.42 milliwatts, meaning it consumes almost no energy when there is no input.

The human brain, capable of processing incredibly intricate and expansive neural networks, operates with a total power consumption of merely 20 watts, significantly lower than that of current AI systems.