Editor’s note: This post is part of the AI Decoded series, which demystifies AI by making the technology more accessible, and showcases new hardware, software, tools and accelerations for RTX PC and workstation users.
At Gamescom this week, NVIDIA announced that NVIDIA ACE — a suite of technologies for bringing digital humans to life with generative AI — now includes the company’s first on-device small language model (SLM), powered locally by RTX AI.
The model, called Nemotron-4 4B Instruct, improves role-play, retrieval-augmented generation and function-calling, so game characters can more intuitively understand player instructions, respond to gamers and perform more accurate and relevant actions.
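To make the function-calling idea concrete, here's a hedged sketch of the kind of tool schema a developer might hand to such a model. The action name and parameters are hypothetical illustrations, not part of the ACE API.

```python
# Hypothetical tool schema for an NPC merchant; the function name and
# parameters are invented for illustration, not taken from NVIDIA ACE.
tools = [
    {
        "type": "function",
        "function": {
            "name": "open_shop_menu",  # hypothetical game action
            "description": "Open the merchant's shop UI for the player.",
            "parameters": {
                "type": "object",
                "properties": {
                    "category": {
                        "type": "string",
                        "enum": ["weapons", "armor", "potions"],
                        "description": "Which shop tab to open first.",
                    }
                },
                "required": ["category"],
            },
        },
    }
]

# Given this schema alongside the conversation, a player request like
# "show me your healing potions" can come back as a structured call such as
#   {"name": "open_shop_menu", "arguments": {"category": "potions"}}
# which the game engine then executes directly.
```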
The SLM Advantage
An AI model’s accuracy and performance depend on the size and quality of the dataset used for training. Large language models are trained on vast amounts of data, but are typically general-purpose and contain excess information for most uses.
SLMs, on the other hand, focus on specific use cases. Because their training data is narrower but more relevant, they can deliver more accurate responses with lower latency, both critical for conversing naturally with digital humans.
Nemotron-4 4B was first distilled from the larger Nemotron-4 15B LLM. In this process, the smaller model, called the “student,” learns to mimic the outputs of the larger model, appropriately called the “teacher.” Noncritical weights of the student model are then pruned, or removed, to shrink its parameter count. Finally, the SLM is quantized, which reduces the numerical precision of its remaining weights.
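As a rough illustration of those steps, here is a minimal sketch, assuming PyTorch, of a standard distillation loss and a simple symmetric INT8 weight quantizer. This is the generic textbook version of both techniques, not NVIDIA's actual pipeline or hyperparameters.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # The student is trained to match the teacher's output distribution,
    # softened by a temperature, via KL divergence.
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * t * t

def quantize_int8(weights):
    # Symmetric per-tensor INT8 quantization: store 8-bit integers plus
    # one FP32 scale factor, cutting weight memory roughly 4x vs. FP32.
    scale = weights.abs().max() / 127.0
    q = torch.clamp((weights / scale).round(), -127, 127).to(torch.int8)
    return q, scale

# Toy demonstration with random tensors standing in for real model state.
student = torch.randn(8, 32000)   # batch x vocabulary logits
teacher = torch.randn(8, 32000)
print("distillation loss:", distillation_loss(student, teacher).item())

w = torch.randn(1024, 1024)       # a stand-in weight matrix
q, scale = quantize_int8(w)
print("max reconstruction error:", (w - q.float() * scale).abs().max().item())
```

In practice, optimized inference kernels consume the quantized weights directly rather than dequantizing them back to FP32, which is where the memory and speed savings come from.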
ACEs Up
ACE NIM microservices allow developers to deploy state-of-the-art generative AI models through the cloud or on RTX AI PCs and workstations to bring AI to their games and applications. With ACE NIM microservices, non-playable characters (NPCs) can dynamically interact and converse with players in the game in real time.
ACE consists of key AI models for speech-to-text, language, text-to-speech and facial animation. It’s also modular, letting developers choose only the NIM microservices they need for each element of their pipeline.
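NIM microservices are typically reached over an OpenAI-compatible HTTP API, so a minimal call to a locally deployed language NIM might look like the sketch below. The endpoint URL, API key handling and model identifier are assumptions for illustration; the documentation for each microservice has the specifics.

```python
# Minimal sketch of querying a language NIM microservice running locally.
# The base URL, API key and model name below are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",       # hypothetical local NIM endpoint
    api_key="not-needed-for-local",            # local deployments may not check this
)

response = client.chat.completions.create(
    model="nvidia/nemotron-mini-4b-instruct",  # placeholder model identifier
    messages=[
        {"role": "system", "content": "You are Kai, a gruff blacksmith NPC."},
        {"role": "user", "content": "Can you reforge this broken sword?"},
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

In principle, the same request shape works whether the model runs in the cloud or on an RTX AI PC, which is what lets developers move between deployment targets without rewriting their integration.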
To Infinity and Beyond
Digital humans go far beyond NPCs in games. At last month’s SIGGRAPH conference, NVIDIA previewed “James,” an interactive digital human that can connect with people using emotions, humor and more. James is based on a customer-service workflow using ACE.
Decades of evolution in how people communicate with technology eventually led to the creation of digital humans. The future of the human-computer interface will have a friendly face and require no physical inputs.
Digital humans drive more engaging and natural interactions. According to Gartner, 80% of conversational offerings will embed generative AI by 2025, and 75% of customer-facing applications will have conversational AI with emotion. Digital humans will transform multiple industries and use cases beyond gaming, including customer service, healthcare, retail, telepresence and robotics.
Users can get a glimpse of this future now by interacting with James in real time at ai.nvidia.com.
Generative AI is transforming gaming, videoconferencing and interactive experiences of all kinds. Make sense of what’s new and what’s next by subscribing to the AI Decoded newsletter.