Le Lézard

Shanghai Stonehill Technology Unveils the First Non-Attention-Based Large Model in China: Faster, Stronger, More Economical


On January 24th, at the "New Architecture of Large Language Model" launch event, Rock AI (a subsidiary of Shanghai Stonehill Technology Co., Ltd.) officially unveiled the Yan Model, the first domestic general-purpose large language model built without an Attention mechanism. It is also one of the few large models in the industry that does not rely on the Transformer architecture. Compared with Transformer models of equivalent parameter count, the Yan Model is said to offer 7 times the training efficiency, 5 times the inference throughput, and 3 times the memory capacity. It also supports lossless operation on CPUs, exhibits reduced hallucination, and fully supports private-deployment applications.

At the event, Liu Fanping, CEO of Rock AI, said: "We hope the Yan architecture can serve as infrastructure for the artificial intelligence field and help establish a developer ecosystem in the AI domain. Ultimately, we aim to enable anyone to use general-purpose large models on any device, providing more economical, convenient, and secure AI services, and to promote an inclusive artificial intelligence future."

The Transformer, the foundational architecture behind large models such as ChatGPT, has achieved remarkable success, but it still has notable shortcomings: high compute consumption, heavy memory usage, high cost, and difficulty processing long sequences. To address these issues, the Yan Model replaces the Transformer with Rock AI's newly developed generative "Yan Architecture." According to the company, this architecture enables lossless inference over arbitrarily long sequences on consumer-grade CPUs, achieves performance comparable to a model with hundreds of billions of parameters using only tens of billions, and meets enterprises' practical need for low-cost, easily deployed large models.
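The long-sequence difficulty mentioned above stems from self-attention's pairwise scoring: every token attends to every other token, so cost grows quadratically with sequence length, whereas a single recurrent pass with fixed state grows only linearly. The release does not disclose the Yan architecture's internals, so the sketch below is purely illustrative of that complexity gap, not a description of Yan itself:

```python
# Illustrative complexity comparison, not Rock AI's actual method.

def attention_score_count(n: int) -> int:
    """Pairwise scores a full self-attention layer computes over n tokens."""
    return n * n  # quadratic growth in sequence length

def recurrent_step_count(n: int) -> int:
    """State updates a fixed-state recurrent pass performs over n tokens."""
    return n  # linear growth in sequence length

for n in (1_000, 10_000, 100_000):
    ratio = attention_score_count(n) // recurrent_step_count(n)
    print(f"n={n}: attention does {ratio}x more work per layer")
```

At 100,000 tokens the quadratic term dominates by five orders of magnitude per layer, which is why attention-free, linear-cost designs are attractive for long-context and CPU-only inference.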

At the press conference, the research team presented extensive empirical comparisons between the Yan Model and a Transformer model of the same parameter scale. Under identical resource conditions, the reported data showed the Yan architecture achieving 7 times the training efficiency, 5 times the inference throughput, and 3 times the memory capacity of the Transformer architecture. On the long-sequence challenge facing the Transformer, the Yan Model also performed well, in theory supporting inference over sequences of unlimited length.

Additionally, the research team has devised an associative feature function and a memory operator that, combined with linear computation methods, reduce the complexity of the model's internal structure. The new Yan architecture aims to open up the previously "uninterpretable black box" of natural language processing, aiding the adoption of large models in high-stakes areas such as healthcare, finance, and law. At the same time, the Yan Model's hardware advantage, running on mainstream consumer-grade CPUs without compression or pruning, significantly broadens the possibilities for deploying large models across industries.
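Rock AI gives no equations for its associative feature function or memory operator, but the general family of linear-computation alternatives to attention can be sketched with a simple fixed-size recurrent memory: each token updates a d-dimensional state in O(d^2) time, so processing n tokens costs O(n·d^2) and memory stays constant regardless of sequence length. Everything below (the recurrence form, the `tanh` nonlinearity, the weight shapes) is a hypothetical illustration, not the Yan Model's actual operator:

```python
import numpy as np

def linear_memory_pass(tokens: np.ndarray, W_mem: np.ndarray,
                       W_in: np.ndarray) -> np.ndarray:
    """Toy linear-time memory operator: one O(d^2) update per token.

    The state vector has fixed size d, so memory use does not grow
    with sequence length -- the property that permits, in principle,
    inference over arbitrarily long inputs on modest hardware.
    """
    d = W_mem.shape[0]
    state = np.zeros(d)                 # fixed-size memory
    outputs = []
    for x in tokens:                    # single linear pass over n tokens
        state = np.tanh(W_mem @ state + W_in @ x)
        outputs.append(state.copy())
    return np.stack(outputs)            # shape (n, d)

rng = rng = np.random.default_rng(0)
d = 8
toks = rng.normal(size=(100, d))
out = linear_memory_pass(toks,
                         rng.normal(size=(d, d)) * 0.1,
                         rng.normal(size=(d, d)) * 0.1)
print(out.shape)  # (100, 8): one bounded state per token
```

Because the per-token cost is independent of how many tokens came before, runtime scales linearly with sequence length, in contrast to attention's quadratic pairwise scoring.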

Liu Fanping stated, "In the next phase, Rock AI aims to create a full-modality, real-time human-computer interaction system, achieve on-device training, and unify training and inference. We plan to fully connect perception, cognition, decision-making, and action to construct an intelligent loop for general artificial intelligence. This will provide more options for the foundational platform of large models in research areas such as general-purpose robots and embodied intelligence."

