NVIDIA’s AI has already changed gaming, and it’s still just heating up


Since the introduction of GeForce RTX 20-Series GPUs and their onboard Tensor Cores, NVIDIA has been pushing the capabilities of AI in gaming forward. More features and more games are taking advantage of AI to build their game worlds and improve graphics quality. And with the latest NVIDIA ACE technology, it's clear NVIDIA plans to make AI even more integral to gaming than before.

The push into AI kicked off with NVIDIA DLSS (Deep Learning Super Sampling), which lets players enjoy high-fidelity visuals without the frame-rate hit that typically comes alongside them. NVIDIA achieved this by using high-quality rendered frames as AI training data, so the Tensor Cores could take low-resolution gameplay and learn how to bridge the gap. Now, DLSS is multiplying both resolution and frame rate with AI.
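The training idea described above can be sketched in a few lines. DLSS's actual network, training data, and pipeline are proprietary, so this is only a toy illustration of the general supervised-upscaling approach: high-quality frames serve as targets, and downsampled copies of those same frames serve as inputs the model learns to reconstruct.

```python
# Toy illustration of the supervised-upscaling idea behind DLSS-style training.
# This is NOT NVIDIA's method; it only shows how (low-res, high-res) training
# pairs can be derived from high-quality frames.

def downsample(frame, factor):
    """Nearest-neighbor downsample of a 2D frame (a list of pixel rows)."""
    return [row[::factor] for row in frame[::factor]]

def make_training_pair(high_res_frame, factor=2):
    """The high-quality frame is the target; its downsampled copy is the input."""
    return downsample(high_res_frame, factor), high_res_frame

# A 4x4 grid of pixel intensities stands in for a rendered frame.
frame = [[(x + y) % 256 for x in range(4)] for y in range(4)]
low, high = make_training_pair(frame)
# A model trained on many such pairs learns to map "low" back to "high".
```

In practice the input would be a fully rendered low-resolution frame (plus motion vectors and depth in modern DLSS versions), not a naive downsample, but the pairing of cheap input with high-quality target is the core of the idea.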


By the same token, NVIDIA Freestyle and RTX Remix provide tools for improving the visuals of games with AI. Freestyle lets users apply filters in real time, or even upconvert the rendering of non-HDR games into HDR, to enhance their experience. Meanwhile, RTX Remix can do for in-game assets what DLSS did for output resolution, letting modders take a game's asset files and intelligently convert them into higher-quality versions to produce quick remasters of classic games.

While DLSS, NVIDIA Freestyle, and RTX Remix all alter the visual experience of games, NVIDIA ACE is set to change the way people interact with games themselves.

NVIDIA ACE is putting an AI toolset behind in-game NPCs (non-player characters). Just as the power of large language models has brought chatbots to life in profound ways, NVIDIA ACE can provide a new level of dynamism for NPCs, letting gamers find more engagement throughout their games.


NVIDIA ACE’s Riva speech and Audio2Face animation models allow seamless interaction between players and NPCs. Riva takes spoken prompts from players and transcribes them for the game. AI-powered NPCs respond in natural language that can be processed by Audio2Face to sync facial animations with the speech of the in-game character. Developers can run this whole pipeline from user input to NPC response in the cloud or even locally on the user’s PC. 
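The speech-to-response flow above can be sketched as a three-stage pipeline. Every function body here is a hypothetical stand-in, not NVIDIA's API: the real Riva ASR and Audio2Face services run as SDKs or microservices, and the real language model would be a full LLM.

```python
# Hypothetical sketch of the ACE-style NPC pipeline: speech -> text (Riva-like
# ASR), text -> reply (language model), reply -> lip-sync data (Audio2Face-like
# animation). Each stage is a stub standing in for the real service.

def transcribe_speech(audio: bytes) -> str:
    """Stand-in for Riva ASR: turn the player's spoken prompt into text."""
    return audio.decode("utf-8")  # pretend the audio is already text

def generate_reply(prompt: str) -> str:
    """Stand-in for the NPC's language model."""
    return f"You asked about '{prompt}'. My swords are the finest in town."

def animate_face(reply: str) -> list[str]:
    """Stand-in for Audio2Face: map each word to a mouth shape (viseme)."""
    return [word[:1].lower() for word in reply.split()]

def npc_interaction(audio: bytes) -> tuple[str, list[str]]:
    """Run the full pipeline from player audio to NPC reply plus animation."""
    prompt = transcribe_speech(audio)
    reply = generate_reply(prompt)
    return reply, animate_face(reply)

reply, visemes = npc_interaction(b"do you sell swords?")
```

Because each stage only consumes the previous stage's output, a developer can place the whole chain in the cloud, run it locally, or split it, which matches the deployment flexibility described above.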

NVIDIA NeMo provides pretrained models and frameworks that developers can use to build their own language models, which in turn can serve as the brains behind NPCs. Developers can choose from pretrained language models called Nemotron and create guardrails (the safety systems that keep a language model on topic and appropriate) using NeMo Guardrails. The frameworks make it easy to optimize, fine-tune, and deploy these models on NVIDIA GPUs in the cloud.


As complicated as it might seem, AI has a big role to play in gaming going forward, and NVIDIA ACE is a part of that. To help you make sense of all the latest developments, NVIDIA has introduced the AI Decoded series, which you can check out to stay up to speed on what’s ahead at the intersection of gaming and AI. 
