Rakuten Receives Open Innovation Field IT Award from Japan Institute of Information Technology’s 42nd IT Awards
- Rakuten recognized for the development of Rakuten AI 7B, a high-performance, open large language model optimized for Japanese
Tokyo, December 9, 2024 - Rakuten Group, Inc. today announced that it has received the Information Technology Award (Open Innovation Field) at the Japan Institute of Information Technology’s 42nd IT Awards for its development of Rakuten AI 7B, a large language model (LLM) optimized for the Japanese language.
The IT Awards are presented by the Japan Institute of Information Technology to enterprises, organizations, groups and individuals whose outstanding efforts in driving business innovation have produced results through advanced IT applications. Award recipients have demonstrated success in creating new business opportunities, establishing effective business models and improving productivity across the operations of Japan’s industry and government institutions.
Under its AI-nization initiative, Rakuten is integrating AI into every aspect of its business to augment human creativity, accelerate productivity and drive further growth. Going forward, Rakuten aims to continue leveraging rich data and cutting-edge AI technology to add value and enrich the lives of people around the world.
Overview of the Award-Winning Project:
Development of Rakuten AI 7B – Building a High-Performance, Japanese-Optimized Open LLM
Project details: Rakuten has developed Rakuten AI 7B, a 7-billion-parameter foundation model optimized specifically for the Japanese language. Rakuten AI 7B Instruct is an instruct model fine-tuned from the Rakuten AI 7B foundation model, and Rakuten AI 7B Chat is a chat model fine-tuned from the instruct model on chat data for conversational text generation. All three models were released as open models on March 21, 2024.
Rakuten AI 7B was developed through continued training of Mistral-7B-v0.1, an open LLM from the France-based AI company Mistral AI. The foundation and instruct models achieved the top average performance scores among open Japanese LLMs, as measured by the LM Evaluation Harness, and Rakuten AI 7B had been downloaded 74,000 times as of December 2024.
Reasons for award: Rakuten was recognized for its contributions in developing high-quality LLMs by leveraging years of experience and data from e-commerce-specific text processing, including a proprietary tokenizer and refined LLM training datasets. The lightweight architecture of Rakuten AI 7B, at 7 billion parameters, was also noted as addressing recent concerns about the significant energy consumption of AI operations.
global.rakuten.com/corp/news/update/2024/1209_01.html