In the previous edition of this newsletter we discussed some widely-shared examples of AI tech implemented atop existing games.
This week we’re double-clicking on one specific area of AI tech in games that’s made enormous strides in the last year: AI companions and agentic non-player characters (NPCs).
The rudimentary, script-based companions and NPCs that have long accompanied us on our gaming journeys are beginning to show their age, and a new era of fully interactive, intelligent non-player agents is dawning.
To understand the nuances of this coming change, we spoke with three developers on the cutting edge of companion and in-game agent tech: Jan Schnyder of nunu.ai, Nick Zak of Proxima, and Brian Cox of Inworld AI.
That story is below. But first, this week’s news from the future.
News From the Future
👑 The Game Awards announces Game of the Year 2024 nominees (Polygon)
The six nominees for the coveted Game of the Year award are the PlayStation critical darling Astro Bot, indie card game sensation Balatro, Chinese breakout megahit Black Myth: Wukong, Square Enix’s long-awaited Final Fantasy 7 Rebirth, the dark horse RPG release Metaphor: ReFantazio, and—in a surprise and somewhat controversial twist—Elden Ring’s newest DLC, titled Shadow of the Erdtree.
👪 Roblox Changes Safety Systems and Parental Controls (GamesIndustry.biz)
Among the new changes: Parents and caregivers will be able to remotely view and manage their child's account, including by setting limits on spending and screen time, accessing their child's friend list, and monitoring their child's access to specific chat features. Kids under age 13 will no longer be able to use direct messaging on platform chats, and will be limited in what messages they can send in games and experiences.
📲 Final Fantasy XIV Is Coming To Mobile Devices (Gamespot)
Square Enix announced a partnership with Lightspeed Studios to adapt the popular MMORPG Final Fantasy XIV for mobile devices. The game doesn’t appear to be cross-compatible with the PC/console version, with Square Enix pitching it as a "sister" to the original game, recreating the story and combat mechanics on mobile devices. The game will first launch in China, with a global release planned shortly afterward.
👾 The State of Indie Games in 2024 and Beyond (SUPERJUMP)
Andrew Johnston at SUPERJUMP Magazine interviewed ~30 independent game developers about their perception of the state of the business, and found something interesting: “A majority said that the market was good for the kind of games that they make, but a similar majority said that the market was overall bad and getting worse.” A possible perception-reality gap? The post that follows digs into the data.
💰 Investing in Promise
Games and entertainment are undergoing a profound transformation driven by rapid advancements in generative AI. At the forefront of this revolution is Promise, a new studio pioneering the use of generative AI to produce high-quality films and series, and to develop innovative new formats.
Colin Campbell and Andrew Chen explain more about our investment in Promise in our latest blog.
Since the birth of the video game industry, game developers have worked to create ever-more convincing AI companions and enemies for players to interact with.
Space Invaders surprised players with waves of alien opponents that moved in unpredictable ways. Each of Pac-Man’s ghostly pursuers featured different movement logic, leading to thrilling chases.
By the 80s, game developers began wrapping particularly difficult enemy AIs in human likenesses, like Mike Tyson as the final boss in the 1987 NES game Mike Tyson's Punch-Out!!
In 2014 Mike Tyson attempted to defeat himself in Mike Tyson's Punch-Out!! live on The Tonight Show. He didn’t quite manage to make it eight rounds in this one.
As the industry evolved, AI grew more sophisticated. In Halo (2001) players charged into combat alongside AI-powered soldiers who could drive vehicles, take cover, and shout out their plans. In Half-Life 2 (2004), Alyx Vance won players’ hearts not just with her dialogue, but with her believable and dynamic reactions to both the player’s actions and to events occurring in the surrounding environment.
But even the most convincing in-game enemies, companions, and neutral NPCs have always been limited. No matter how sophisticated characters became, they were ultimately beholden to pre-written scripts, behavior trees, and triggers.
“These systems are rule-based,” says Brian Cox, Director of AI Gameplay Engineering at Inworld AI, “which means they rely on predefined patterns that make interactions feel consistent but can sometimes limit their depth.”
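To make “rule-based” concrete, here is a toy sketch of the kind of behavior-tree logic Cox is describing. The node names, conditions, and priorities below are hypothetical and not taken from any shipped game or engine.

```python
# A toy behavior tree: the companion checks rules in priority order and runs the
# first one whose condition holds. Node names and conditions are hypothetical.

class Selector:
    """Ticks children in order; stops at the first one that succeeds."""
    def __init__(self, *children):
        self.children = children
    def tick(self, npc):
        return any(child.tick(npc) for child in self.children)

class Action:
    """A leaf node: if its condition holds, run its effect."""
    def __init__(self, name, condition, effect):
        self.name, self.condition, self.effect = name, condition, effect
    def tick(self, npc):
        if self.condition(npc):
            self.effect(npc)
            return True
        return False

companion_brain = Selector(
    Action("take_cover", lambda npc: npc["under_fire"], lambda npc: print("Taking cover!")),
    Action("follow",     lambda npc: npc["player_far"], lambda npc: print("Wait up!")),
    Action("idle",       lambda npc: True,              lambda npc: print("...")),
)

companion_brain.tick({"under_fire": True, "player_far": False})  # prints "Taking cover!"
```

However elaborate the tree gets, the companion can only ever do what a designer anticipated and wrote a branch for.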
But now, thanks to advancements in generative AI and LLMs, there’s an opportunity to make AI companions that are much more intelligent and interactive.
This is a trend we on the A16Z GAMES team have become fascinated by in the past year—the first ever edition of this Substack introduced the PhDs at Altera and their next-gen Minecraft AIs. It seems we’re headed for a future where digital worlds are populated by convincingly human-like characters.
And we’re already beginning to get glimpses of that future in games like Cygnus Enterprises, a recent NetEase Games title that gives players a sassy, hilarious LLM-powered robot companion named PEA.
PEA is capable of responding to voice-driven commands from players, chatting casually about in-game lore, and even observing the environment to inform or warn players about the game state.
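For a rough sense of how a companion like this can be wired up, here is a minimal sketch of one plausible loop: the current game state and the player’s transcribed voice line go into an LLM prompt, and the reply is split into dialogue plus an optional in-game action. Every name here is hypothetical; this is not code from Cygnus Enterprises or Inworld’s SDK.

```python
# Hypothetical companion loop: serialize game state, include the player's
# transcribed voice line, and parse the model's reply into speech + action.
import json

PERSONA = (
    "You are a sassy robot companion. Reply as JSON with a 'say' field "
    "and an optional 'action' field."
)

def companion_turn(llm, game_state, player_utterance):
    prompt = (
        f"{PERSONA}\n"
        f"Current game state: {json.dumps(game_state)}\n"
        f"The player said: {player_utterance!r}"
    )
    return json.loads(llm(prompt))   # any chat-completion call works here

# Example with a stubbed model:
fake_llm = lambda _: '{"say": "Two hostiles behind that crate. Try not to die.", "action": "mark_enemies"}'
print(companion_turn(fake_llm, {"player_health": 20, "enemies_nearby": 2}, "anything around?"))
```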
“I love working with robot companions because they allow us to explore quirky, exaggerated personalities while still staying within the realm of believability,” says Cox, who was the Lead Programmer on Cygnus Enterprises.
Cygnus Enterprises was an impressive start, but Cox has higher hopes for his future work at Inworld AI.
“True AI companions should feel like they have their own goals and intentions, not just as extensions of the player,” Cox says. “Right now, a lot of scripting goes into creating the illusion of autonomy, but with generative AI, I hope we’ll see companions that genuinely surprise players by taking unique approaches to problems or even challenging the player’s actions.”
The Next Level: Generalizable AI Agents
If the next milestone for next-gen AI agents in games is convincing characters, the level beyond that would be an AI agent that can learn to play any game, just as a human does.
This is one of the ambitious goals currently being pursued by Jan Schnyder, co-founder at nunu.ai (disclosure: nunu.ai is an A16Z GAMES SPEEDRUN company).
The goal, as Schnyder tells it, is to build a framework made up of multiple models that can read a text instruction like “go to the nearest tree and chop it” and convert it into action in any game.
This sounds pretty difficult, and Schnyder admits it’s a challenge.
“Since our agents play the games like humans would (by looking at the screen and pressing keyboard and mouse buttons), we're limited in the actions [they] can take,” Schnyder says. For instance, he says, the models currently struggle with spatial reasoning, like understanding relative positions of objects or reading maps.
“Our current agent is like our little Frankenstein’s monster, consisting of various domain-specific sub-agents for each task,” Schnyder says. “In the future, as foundational models get better, we believe (hope) that AI will go end-to-end, resulting in a much simpler architecture.”
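Here is an illustrative sketch of the observe/plan/act loop Schnyder describes, with a planner model delegating to domain-specific sub-agents that ultimately press keys and move the mouse. The names are invented for illustration and are not nunu.ai’s actual API.

```python
# Illustrative observe/plan/act loop: a planner model looks at the current
# frame, and domain-specific sub-agents turn each step of the plan into
# keyboard/mouse actions. All names are hypothetical, not nunu.ai's API.

def run_agent(instruction, planner, sub_agents, screen, controls, max_steps=50):
    for _ in range(max_steps):
        frame = screen.capture()                          # the agent only sees pixels, like a human
        plan = planner(instruction=instruction, frame=frame)
        if plan["done"]:                                  # planner decides the instruction is satisfied
            return True
        skill = sub_agents[plan["skill"]]                 # e.g. "navigate", "chop", "inventory"
        for action in skill(plan["target"], frame):       # low-level keyboard/mouse events
            controls.send(action)
    return False

# Usage, with whatever planner/sub-agent/IO implementations you supply:
# run_agent("go to the nearest tree and chop it", planner, sub_agents, screen, controls)
```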
Schnyder says that his team often encounters surprising forms of dysfunction while training AIs to play games. “Agent behavior HEAVILY depends on the underlying models,” he says. “One of the funniest examples we've seen was when we were trying to get a Claude-based agent to play Minecraft: One task was to kill a sheep to obtain wool, which it would straight-out refuse. It always insisted on using shears instead, as it found killing the sheep for wool unethical.”
But other experiments have gone better.
For instance, nunu.ai’s agents function surprisingly well in physical, robot bodies after being trained in game worlds. (See this video of Rob the Robot.)
“Starting in gaming gives us a massive head start,” Schnyder says. “Game environments allow us to build agents and perform actions that we would still have to wait several years to do in the real world. This is because we can ‘tweak’ physics and the environment in ways that help the agents out in areas where they’re simply not good enough yet.”
Schnyder hopes that with this virtual-to-real training process, more progress will soon be made in areas of robotics like complex object manipulation.
“Picking up an object in a game is very easy—you literally just press the interact button,” Schnyder says, “but picking up an object in real life is very hard. There literally does not exist a robust enough low-level manipulation policy yet that can pick up any kind of object for robotics. This is a massive bottleneck for so many trajectories in robotics.”
Sidebar: Suck Up! and the Improv Vampires
When in-game NPCs are connected to large language models, they’re freed from the restrictive limitations of old-school dialogue trees, and more improvisational possibilities emerge.
There’s perhaps no better example of this in action than Suck Up!, a game where you’re a vampire trying to trick innocent LLM-powered townsfolk into welcoming you into their homes. The game uses the player’s voice as the input method, and there are no real guardrails on what you can say; you’ve just got to make it up as you go. The result has been a ton of hilarious YouTube videos from creators playing the game.
“I’ve personally watched or joined over a thousand streams, and I’m always struck by how distinct each one feels,” says Nick Zak, Head of Creative and Game Design at Proxima, which created Suck Up! “Whether it’s their approach to costume choices, their initial lines, or the quirky personalities of the in-game neighbors they encounter, it’s this ‘yes-and’ improv element that brings out the innate humor of the game and makes every experience feel fresh.”
Another area where the nunu.ai team has made progress is AI interpretability—the practice of making the “thought process” of an AI agent more legible to players or viewers.
For human observers, next-gen AI behavior can feel confusing, or even unnerving. Why do the agents take the actions they do? What’s going on under the hood?
This is an area that Schnyder is hopeful about. What if instead of simply explaining their reasoning, agents were able to act out different character personas and communicate emulated emotions appropriately?
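As a purely hypothetical sketch of what that could look like, an agent might return not just its next action but a short, in-character explanation and a coarse emotional read on the moment:

```python
# Toy, purely hypothetical sketch: the agent reports each step with an
# in-character line and a rough emotional label, so observers can follow along.
import json

def explain_step(llm, persona, action, observation):
    prompt = (
        f"You are playing a game in character as: {persona}\n"
        f"You just decided to: {action}\n"
        f"Because you observed: {observation}\n"
        'Reply as JSON with keys "say" (one in-character sentence) and '
        '"feeling" (scary, frustrating, or fun).'
    )
    return json.loads(llm(prompt))

# With a stubbed model, one step might come back as:
stub = lambda _: '{"say": "That corridor is too quiet... I am circling around.", "feeling": "scary"}'
print(explain_step(stub, "a cautious scout", "retreat to cover", "footsteps nearby, no enemies visible"))
```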
In the future, Schnyder says, “the agents will be able to tell you whether parts are scary, frustrating or fun. Games are about feelings.”
That’s the hope, at least: that the better AI tech gets, the more capable games will be of introducing us to deeper-feeling, less robotic characters.
That’s it for this week. Want more pieces like this one, or have questions that you’d like to see A16Z GAMES tackle?
Write in to us at games-content@a16z.com
▶️ A Brief History of the NPC
Interested in learning more about agentic AI? Our own Lester Chen dove deeper on the topic in this short video on the A16Z GAMES channel.
💼 Jobs Jobs Jobs
There are currently over 100 open job listings across both the A16Z GAMES portfolio and our SPEEDRUN portfolio. For the freshest games industry job postings, be sure to follow our own Caitlin Cooke and Jordan Mazer on LinkedIn.
Join our talent network for more opportunities. If we see a fit for you, we'll intro you to relevant founders in the portfolio.