Can AI Make Video Games More Immersive? Some Studios Turn to AI-Fueled NPCs for More Interaction

The AI (Artificial Intelligence) letters and a robot hand are placed on a computer motherboard in this illustration taken June 23, 2023. (Reuters)

For decades, video games have relied on scripted, stilted interactions with non-player characters to help shepherd gamers in their journeys. But as artificial intelligence technology improves, game studios are experimenting with generative AI to help build environments, assist game writers in crafting NPC dialogue and lend video games the improvisational spontaneity once reserved for tabletop role-playing games.

In the multiplayer game “Retail Mage,” players help run a magical furniture store and assist customers in hopes of earning a five-star review. As a salesperson — and wizard — they can pick up and examine items or tell the system what they'd like to do with a product, such as deconstruct chairs for parts or tear a page from a book to write a note to a shopper.

A player’s interactions with the shop and the NPCs around them — from gameplay mechanics to content and dialogue creation — are driven by AI rather than a predetermined script, creating more options for chatting and for using objects in the shop.

“We believe generative AI can unlock a new kind of gameplay where the world is more responsive and more able to meet players at their creativity and the things that they come up with and the stories they want to tell inside a fantasy setting that we create for them,” said Michael Yichao, cofounder of Jam & Tea Studios, which created “Retail Mage.”

The typical NPC experience often leaves something to be desired. Pre-scripted interactions with someone meant to pass along a quest typically come with a handful of chatting options that lead to the same conclusion: players get the information they need and continue on. Game developers and AI companies say that by using generative AI tech, they aim to create a richer experience that allows for more nuanced relationships with the people and worlds that designers build.

Generative AI could also give players more opportunities to go off-script and create their own stories, if designers can craft environments that feel more alive and react to players’ choices in real time.

Tech companies continue to develop AI for games, even as developers debate how, and whether, they’ll use AI in their products. Nvidia created its ACE technologies to bring so-called “digital humans” to life with generative AI. Inworld AI provides developers with a platform for generative NPC behavior and dialogue. Gaming company Ubisoft said last year that it uses Ghostwriter, an in-house AI tool, to help write some NPC dialogue without replacing the video game writer.

A report released by the Game Developers Conference in January found that nearly half of developers surveyed said generative AI tools are currently being used in their workplace, with 31% saying they personally use those tools. Developers at indie studios were most likely to use generative AI, with 37% reporting using the tech.

Still, roughly four out of five developers said they worry about the ethical use of AI. Carl Kwoh, Jam & Tea's CEO, said AI should be used responsibly alongside creators to elevate stories — not to replace them.

“That’s always been the goal: How can we use this tool to create an experience that makes players more connected to each other?” said Kwoh, who is also one of the company’s founders. “They can tell stories that they couldn’t tell before.”

Using AI to give NPCs endless things to say is “definitely a perk,” Yichao said, but “content without meaning is just endless noise.” That’s why Jam & Tea uses AI — through Google’s Gemma 2 model and the studio’s own servers hosted on Amazon — to let NPCs do more than respond, he said. They can look for objects as they shop or react to other NPCs, adding “more life and reactivity than a typically scripted encounter.”

“I’ve watched players turn our shopping experience into a bit of a dating sim as they flirt with customers and then NPCs come up with very realistic responses,” he said. “It’s been really fun to see the game react dynamically to what players bring to the table.”
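
The pattern Yichao describes can be pictured as a simple loop: send the current world state and the player’s words to a language model, then parse back both dialogue and a structured action the game engine can execute. The toy example below illustrates that loop under stated assumptions; the `llm_complete` stub, the JSON schema and the object names are invented for illustration and are not Jam & Tea’s actual code or Gemma 2’s API.

```python
import json

def llm_complete(prompt: str) -> str:
    """Stub standing in for a call to a hosted model such as Gemma 2.
    A real version would call the studio's inference servers instead."""
    return ('{"say": "Let me take a closer look at that chair.", '
            '"action": "inspect_object", "target": "oak chair"}')

def npc_turn(npc_name: str, visible_objects: list, player_utterance: str) -> dict:
    """Ask the model what the NPC says and does next, given the world state."""
    prompt = (
        f"You are {npc_name}, a customer in a magical furniture store.\n"
        f"Objects you can see: {', '.join(visible_objects)}\n"
        f'The player just said: "{player_utterance}"\n'
        'Reply as JSON with keys "say", "action" '
        '(inspect_object | pick_up | wander | none) and "target".'
    )
    raw = llm_complete(prompt)
    try:
        return json.loads(raw)  # a structured action the game engine can run
    except json.JSONDecodeError:
        # If the model strays from the schema, degrade to dialogue-only.
        return {"say": raw, "action": "none", "target": None}

print(npc_turn("Mira", ["oak chair", "spellbook"], "Anything catch your eye?"))
```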

Demonstrating a conversation with an NPC in the game “Mecha BREAK,” in which players battle war machines, Ike Nnole said Nvidia has made its AI “humans” respond faster than they previously could by using small language models. Using Nvidia’s AI, players can interact with the mechanic, Martel, by asking her to do things like customize the color of a mech.

“Typically, a gamer would go through menus to do all this,” said Nnole, a senior product marketing manager at Nvidia. “Now it could be a much more interactive, much quicker experience.”
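
Swapping menus for conversation generally means constraining the model’s output to verbs the game already exposes. Below is a minimal sketch of that idea, assuming a made-up `MechGarage` class, command list and “verb argument” output format; none of it reflects Nvidia ACE’s actual interface.

```python
class MechGarage:
    """Toy stand-in for the game's existing customization API."""
    def set_color(self, color: str) -> None:
        print(f"Mech repainted {color}.")
    def repair(self, part: str) -> None:
        print(f"Repaired {part}.")

ALLOWED = {"set_color", "repair"}  # only verbs the game actually supports

def run_npc_command(model_output: str, garage: MechGarage) -> str:
    """Parse 'verb argument' text from the language model and execute it."""
    verb, _, arg = model_output.strip().partition(" ")
    if verb not in ALLOWED:
        return "Sorry, that's not something I can do."  # keeps the model in bounds
    getattr(garage, verb)(arg)
    return f"Done: {verb} {arg}"

# e.g. the model might turn "paint my mech blue" into the command below
print(run_npc_command("set_color blue", MechGarage()))
```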

Artificial Agency, a Canadian AI company, built an engine that allows developers to bring AI into any part of their game — not only NPCs, but also companions and “overseer agents” that can steer a player toward content they’re missing. The AI can also generate tutorials that teach players skills they lack, so they can have more fun in the game, the company said.

“One way we like to put it is putting a game designer on the shoulder of everyone as they’re playing the game,” said Alex Kearney, cofounder of Artificial Agency. The company’s AI engine can be integrated at any stage of the game development cycle, she said.

Brian Tanner, Artificial Agency's CEO, said scripting every possible outcome of a game can be tedious and difficult to test. Their system allows designers to act more like directors, he said, by telling characters more about their motivation and background.

"These characters can improvise on the spot depending on what’s actually happening in the game,” Tanner said.

It's easy to run into a game's guardrails, Tanner said, where NPCs keep repeating the same phrase regardless of how players interact with them. But as AI continues to evolve, that will change, he added.

“It is truly going to feel like the world’s alive and like everything really reacts to exactly what’s happening,” he said. “That’s going to add tremendous realism.”



Meta Unveils Cheaper VR Headset, AI Updates and Shows off Prototype for Holographic AR Glasses

Meta CEO Mark Zuckerberg wearing glasses (Reuters).

Meta unveiled updates to its virtual reality headset and Ray-Ban smart glasses on Wednesday as it tries to demonstrate its artificial intelligence prowess and stake out the next generation of computing platforms beyond smartphones and computers.
CEO Mark Zuckerberg also showed off Orion, a prototype he called “the most advanced glasses the world has ever seen.”
“The technical challenges to make them are insane,” Zuckerberg told a crowd of developers and journalists at Meta’s Menlo Park, California, headquarters. The holographic augmented reality glasses, for one, needed to be true glasses, not a bulky headset: no wires, and a weight under 100 grams (3.5 ounces), among other constraints. And beyond interacting through voice, typing or hand gestures, Orion has a “wrist-based neural interface” — a wristband translates nerve signals into digital commands, letting the wearer send a signal from the brain to the device.
There is no release date for Orion — Zuckerberg called it a “glimpse of the future.”
Seemingly in his element speaking to a cheering crowd, Zuckerberg said Meta is working to “bring the future to everyone” with its headsets, glasses and AI system. As part of an update to its Llama model, people will now be able to interact with Meta AI by speaking, with voices from celebrities such as John Cena, Judi Dench and Awkwafina.
“We are trying to build a future that is more open, more accessible, more natural, and more about human connection,” Zuckerberg said. “This is the continuation of the values and ideas that we have brought to the apps and technology that we have built over Meta’s first 20 years.”
An AI update aimed at influencers allows them to craft AI versions of themselves to interact with fans. On the keynote stage, an AI version of creator Don Allen Stevenson III appeared on screen and answered a few questions just as the actual creator would. When Zuckerberg asked the AI creator about cattle ranching, it responded that its “expertise lies in technology and design, not agriculture.” An earlier version of the tool was text-only.
Other AI updates include live translation, which Zuckerberg demonstrated on stage. Wearing the smart glasses, Zuckerberg spoke in English with Mexican mixed martial artist Brandon Moreno, who replied in Spanish, and the conversation was translated in real time. People can also dub their videos into another language so that it looks like they are speaking it natively — even going so far as to change their lip movements to match.
Meta AI now has 500 million users, the company said. Jeremy Goldman of the research firm Emarketer called the number “jaw-dropping.”
“Meta has transformed from just a social media company into an AI powerhouse. Zuckerberg’s move to celebrity voices is not just for fun — it’s a direct challenge to OpenAI, with an emphasis on real-world utility,” Goldman said.
Meta, which introduced the Quest 3 last year, also showed off a cheaper version of the VR goggles — the Quest 3S — that will cost $299. The regular Quest 3 costs $499. The 3S will start shipping on Oct. 15.
“Meta is aggressively undercutting Apple’s Vision Pro to dominate the middle-tier AR/VR market,” Goldman said. The Vision Pro, which came out earlier this year after much anticipation, costs $3,500.
While the VR goggles have grabbed more headlines, the Ray-Ban smart glasses turned out to be a sleeper hit for Meta. The company hasn’t disclosed sales numbers, but Zuckerberg said during Meta’s July earnings call that the glasses “continue to be a bigger hit sooner than we expected — thanks in part to AI.” Zuckerberg said on Wednesday that Meta seems to have gotten past the supply issues that plagued the Ray-Bans a few months ago amid high demand.
“They are kind of the perfect form factor for AI,” Zuckerberg said. The glasses, he added, let an AI assistant “see what you see, hear what you hear” and help you go about your day.
For instance, you can ask the glasses to remind you where you parked or to pick up groceries, look at a pile of fruit and come up with a smoothie recipe, or help you pick out a party outfit.
Meta, which renamed itself from Facebook in 2021, still makes nearly all of its money from advertising. In its most recent quarter, 98% of its more than $39 billion in revenue came from ads. At the same time, the company is investing heavily in AI and what Zuckerberg sees as the next generation of computing platforms such as VR headsets and AR glasses.
“VR headsets, despite Meta’s assertion, will not go mainstream,” said Forrester research director Mike Proulx. “They’re too cumbersome, and people can only tolerate them in short bursts.”
Glasses, on the other hand, “put computing power directly into a common and familiar form factor. As the smart tech behind these glasses matures, they have the potential to disrupt everyday consumers’ interactions with brands.”
Proulx said the Orion prototype “sets the stage for a future where a revolutionary 3D computing platform is within reach and can actually be useful to the everyday consumer.”