The NEC Corporation has revealed its new artificial intelligence (AI) business strategy, centered around the enhancement and expansion of its lightweight large language model (LLM). The firm aims to provide an environment optimised for the usage of generative AI, designed to be individually customised for each customer’s specific business needs, using its industry knowledge and business expertise.
The application of these services is anticipated to considerably broaden the scope for operational transformation across numerous sectors, such as healthcare, finance, local government, and manufacturing. In addition, NEC will concentrate its efforts on building specialist models designed to stimulate business transformation and encourage the adoption of generative AI across entire industries via managed application programming interface (API) services.
By doubling the quantity of high-quality training data, NEC has improved its LLM's performance, which a comparative assessment of Japanese dialogue capabilities (the Rakuda benchmark) confirmed surpasses that of leading domestic and international LLMs. Furthermore, the LLM can process up to 300,000 Japanese characters, roughly 150 times the length that third-party LLMs can handle, making it well suited to operations that involve large volumes of documents, such as internal and external business manuals.
NEC is also working on a "new architecture" that will create innovative AI models by flexibly combining models according to the input data and tasks involved. The goal of this architecture is to develop a scalable foundation model that can enlarge its parameters and extend its capabilities. This model will be able to connect with various AI models, including specialised legal or medical AI and models from other businesses and partners, without compromising its performance. Its small size and low power consumption allow for easy installation on edge devices. Moreover, combining NEC's globally recognised image recognition, audio processing, and sensing technologies allows the LLMs to process a multitude of real-world events accurately and autonomously.
In tandem with this, NEC has also begun developing a large-scale model with 100 billion parameters, a substantial leap from its conventional 13 billion. With these initiatives, NEC's primary goal is to achieve sales of approximately 50 billion yen from its generative AI-related business over the next three years.
Despite the rapid acceleration of generative AI's development and usage, many challenges remain surrounding its utilisation. Among these are prompt engineering to instruct AI accurately, security concerns such as data leakage and vulnerabilities, and the coordination of business data during implementation and operation. To address these, since July 2023 NEC has been utilising the NEC Inzai Data Centre, which provides a low-latency, secure LLM environment and supports the construction of customer-specific "individual company models" and "business-specific models" built on NEC-developed LLMs.
Going forward, NEC aims to provide the best solutions across a multitude of industries by offering an LLM built on a scalable foundation model, together with an optimal environment for using generative AI tailored to each customer's business. Alongside this, NEC continues to work with its customers to create added value, including expanded services and functions, and to provide safe and secure generative AI services and solutions that resolve customer problems.