With the advent of ChatGPT, many have wondered how artificial intelligence could be integrated into video games. The idea of holding a natural conversation with an NPC in Skyrim sounds appealing, especially considering how limited interactions are in modern RPGs. Today NVIDIA took a step in that direction with ACE for Games, a service that integrates the power of generative AI into next-generation games.

At Computex, the tech giant introduced the Avatar Cloud Engine (ACE), a service that combines three AI models to bring video game characters to life. ACE for Games lets players engage in natural language voice conversations with an NPC (non-player character), who will respond briefly. This new technology promises to improve immersion and open up new interactions for users.

NVIDIA demonstrated ACE for Games using Unreal Engine 5 and MetaHumans from Epic Games. Kairos is a first-person demo set in a futuristic restaurant where the player (Kai) walks up to the counter to talk to Jin, the owner of the establishment. Kai asks a series of questions in natural language, which the NPC answers coherently before assigning a quest.

The demo uses three AI models optimized for voice, conversation, and character animation; the sketch after the list shows, conceptually, how these pieces could fit together.

  • NeMo is a technology for building and deploying custom language models using proprietary data.
  • Riva is a speech recognition and text-to-speech AI for real-time conversation.
  • Omniverse Audio2Face creates character facial animation and matches it to the voice track.
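NVIDIA has not published a public API for ACE for Games, so the following Python sketch is purely conceptual: every function is a hypothetical placeholder stub, and only the overall data flow (speech recognition → language model → text-to-speech → facial animation) reflects what the article describes.

```python
# Conceptual sketch of the Kairos demo pipeline described above.
# All functions are hypothetical placeholder stubs, not NVIDIA APIs.

def transcribe_speech(audio: bytes) -> str:                  # Riva ASR (placeholder)
    return "What do you recommend tonight, Jin?"

def generate_reply(player_text: str, persona: str) -> str:   # NeMo LLM (placeholder)
    return "Try the spicy ramen, and keep an ear out for trouble in the district."

def synthesize_speech(text: str, voice: str) -> bytes:       # Riva TTS (placeholder)
    return text.encode("utf-8")                              # stand-in for a waveform

def animate_face(voice_audio: bytes) -> list[float]:         # Audio2Face (placeholder)
    return [0.0] * len(voice_audio)                          # stand-in for blendshape curves

def handle_player_utterance(audio: bytes) -> tuple[str, bytes, list[float]]:
    """Player speech in -> NPC reply text, voice track, and facial animation out."""
    player_text = transcribe_speech(audio)                                     # 1. speech recognition
    npc_reply = generate_reply(player_text, persona="Jin, restaurant owner")   # 2. language model
    npc_audio = synthesize_speech(npc_reply, voice="jin")                      # 3. text-to-speech
    face_curves = animate_face(npc_audio)                                      # 4. facial animation
    return npc_reply, npc_audio, face_curves
```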

NVIDIA says generative AI could revolutionize video games


The ACE for Games demo is impressive and gives an idea of what we could experience in next-generation games. “Generative AI has the potential to revolutionize interactivity and dramatically increase game immersion,” said John Spitzer, NVIDIA's vice president of developer and performance technology.

One of the features of this technology is that it can be integrated into popular game engines such as Unreal Engine 5 or Unity. According to its creators, the toolkit is optimized for latency, so delayed responses won't break immersion. The tech giant added that its neural networks can be tuned for different scenarios.

The million-dollar question is how this would work in a real scenario. A demo like Kairos was developed in a controlled environment where the interactions are predefined. The average player will inevitably ask plenty of questions that have nothing to do with the game, and some will try to get around the safety rules to confuse the AI characters, as we saw with Bing.

The danger of unwanted conversations

Given this, NVIDIA says that developers can secure interactions with NeMo Guardrails. This open-source software adds guardrails to AI models with rules and patterns so that conversations stay on topic. Ultimately, creators will be responsible for determining how far players can go from a predetermined baseline; a minimal sketch of what such a rail could look like is shown below.
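NeMo Guardrails is already available on GitHub, so it is possible to illustrate the idea. The snippet below is a minimal sketch rather than anything shipped with ACE: it assumes an OpenAI backend and invents a simple off-topic rule in Guardrails' Colang dialect so a restaurant NPC refuses to discuss politics.

```python
# Minimal NeMo Guardrails sketch (illustrative only, not part of ACE).
# Assumes `pip install nemoguardrails` and an OpenAI API key in the environment.
from nemoguardrails import LLMRails, RailsConfig

colang = """
define user ask about politics
  "what do you think about the government?"
  "which party should I vote for?"

define bot refuse politics
  "I'm just a ramen cook -- ask me about the menu or the neighborhood."

define flow keep npc on topic
  user ask about politics
  bot refuse politics
"""

yaml = """
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-instruct
"""

config = RailsConfig.from_content(colang_content=colang, yaml_content=yaml)
rails = LLMRails(config)

# Questions matching the off-topic pattern trigger the canned refusal from the flow above.
print(rails.generate(messages=[{"role": "user", "content": "Who should I vote for?"}]))
```

In a game, the developer would define many such flows around a predetermined baseline persona, which is exactly the "how far players can go" decision the article describes.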

There is still a long way to go before games offer interactions like NVIDIA's demo. ACE for Games requires either local processing (a PC with a GeForce RTX graphics card) or a cloud connection, so the experience will not be the same for everyone. In addition, AI regulation is coming in many countries, which will shape the scale and adoption of this technology in the future.

Source: Hiper Textual

