Research

Shisa.AI Develops Multilingual LLM with Industry-Leading Performance

Releasing an open-source 405B parameter model that surpasses GPT-4 in Japanese language tasks.

Shisa.AI is pleased to announce the official release of "Llama 3.1 Shisa V2 405B," a multilingual LLM that sets a new standard for domestically produced models in Japan (press release). The model demonstrates Japanese language performance that surpasses GPT-4, showing that Japanese AI research is highly competitive on the global stage.

Leveraging our extensive expertise in LLM development, we created new, high-quality Japanese training datasets refined through rigorous experimentation. These datasets, along with the models themselves, are now available for download on Hugging Face under open-source licenses such as Apache 2.0, which permit commercial use.

This release underscores Japan’s world-class capability in developing advanced language models. Shisa.AI remains committed to advancing LLM performance and efficiency, contributing to a robust, Japanese-centered multilingual AI infrastructure.
