
Feature

Raytracing, machine learning and social - The future of PC games according to Nvidia


Nvidia has been incredibly busy.

The hardware firm has been working hard to push the envelope of what's possible in games development.

At GDC this year, the GPU specialist was showing off tech that allows for real-time raytracing. Though raytracing has long been used in the film industry, rendering this way in real time is pretty much the Holy Grail of video game graphics.

Raytracing allows developers to simulate how light actually behaves in real life. While that sounds simple, computing individual rays of light requires a massive amount of processing power. Until now, game makers have relied on approximations built on rasterisation to get the job done. Going forward, studios won't have to lean on those workarounds to get realistic-looking shadows, to pick one example.
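To make the contrast concrete, here's a heavily simplified sketch - a toy in Python, not Nvidia's RTX pipeline or any shipping engine - of how a raytraced shadow test works: from a point on a surface, fire a 'shadow ray' towards the light and check whether any geometry blocks it. No shadow maps or other rasterisation-era workarounds are involved.

```python
# Toy shadow-ray test - an illustration of the idea, not Nvidia's RTX code.
import math

def sphere_hit_distance(origin, direction, centre, radius):
    """Distance along a ray to the nearest intersection with a sphere, or None."""
    ox, oy, oz = (origin[i] - centre[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                              # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)       # nearest intersection
    return t if t > 1e-4 else None               # ignore hits behind the origin

def in_shadow(point, light_pos, occluders):
    """True if any occluding sphere sits between the surface point and the light."""
    to_light = [light_pos[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(d * d for d in to_light))
    direction = [d / dist for d in to_light]
    for centre, radius in occluders:
        t = sphere_hit_distance(point, direction, centre, radius)
        if t is not None and t < dist:           # blocker sits before the light
            return True
    return False

# One sphere hanging between a floor point and an overhead light.
occluders = [((0.0, 1.0, 0.0), 0.5)]             # (centre, radius) pairs
print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), occluders))  # True: in shadow
print(in_shadow((2.0, 0.0, 0.0), (0.0, 3.0, 0.0), occluders))  # False: lit
```

A real renderer fires millions of these rays per frame against far more complex geometry, which is exactly where that massive amount of processing power comes in.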

"We've been moving towards real-time which has been about the amount of performance available to do it," European developer relations chief Phil Scott tells PCGamesInsider.biz.

"We're reaching a point where the performance is there to do it. There's a lot of understanding about how to do it well and a lot of movies made - high-quality bar films - that have been shown it's possible. It's been the next step. Plus, raytracing actually solves one of the big problems. Typically, rendering is a bit of a hack."

Predictably, the reaction from the development community has been enthusiastic. This is a massive leap in what's possible with graphics.

"Moving from rasterisation to raytracing is quite the jump," Scott explains.

"Games take quite a long time. Done are the days when games took three months to make. Now it's three or four years. For people to be able to make real decisions about how something is made - pre-production, production, post-production - then they're also looking at the extended life of a game with DLC and everything else, everybody is looking at how to get ahead of the competition, what can they do that gives them a leg up or makes a difference. People are looking at it and working on it."

In order to demo real-time raytracing, Epic used Captain 'Shiniest Thing in a galaxy far, far away' Phasma from Star Wars

Raytracing isn't the only new bit of tech that's going to make developers' lives much, much easier. Nvidia is also pushing machine learning, which has broader applications in games development - it can be applied to just about anything.

"It's by far the most exciting time since I joined Nvidia. Raytracing is coming. People are excited about that, as well as machine learning and what that enables," developer relations director Mike Smith says.

"Machine learning is not new, either. It's been around a long time but things have changed - access to data and GPU performance that enables the rapid training that's necessary. Really there are two aspects of machine learning that I think are interesting. There's the development side of building games - that's going to help developers build games faster. They can spend more time on new gameplay features or the graphical fidelity. The faster that developers can make games, the cheaper they can make them the more great games will be around to be enjoyed.

"Then there's the real-time usage for machine learning in games. That's anything from facial animation, text-to-speech, dynamic quest generation - there's a million different things that developers think about and use cases for it. It's still very early but it's a super exciting time because people are thinking of all these ways that machine learning can help them enable gameplay features real-time in a game."

Machine learning is going to have a massive impact on the way games are made and will, in theory, allow smaller studios to compete at a higher level. This is because machine learning AIs can do work much faster than humans. For example, in a presentation at GDC, Ubisoft claimed that it had trained an AI to clean up motion capture data. A session that would have taken a human four hours to tidy took the machine learning algorithm just four minutes.
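Ubisoft hasn't published the details of that system, but a minimal sketch gives a flavour of the general approach - training a small model to map noisy capture samples to clean ones. The Python below uses synthetic data standing in for real mocap, and the window size, network shape and training settings are illustrative assumptions, not anyone's production setup.

```python
# A heavily simplified sketch of ML-assisted mocap clean-up - illustrative only.
# A tiny network learns to map a noisy window of joint samples to the clean
# centre frame, with a synthetic curve standing in for a real capture.
import torch
import torch.nn as nn

WINDOW = 9          # frames of context fed to the model (illustrative choice)
torch.manual_seed(0)

# Synthetic "clean" joint trajectory (a smooth curve) plus sensor-style noise.
t = torch.linspace(0, 12.6, 2000)
clean = torch.sin(t)                       # stand-in for one joint coordinate
noisy = clean + 0.1 * torch.randn_like(clean)

# Build (noisy window -> clean centre frame) training pairs.
X = noisy.unfold(0, WINDOW, 1)             # shape: (num_windows, WINDOW)
y = clean[WINDOW // 2 : WINDOW // 2 + X.shape[0]].unsqueeze(1)

model = nn.Sequential(nn.Linear(WINDOW, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):                   # a few seconds on CPU
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Once trained, the model cleans new noisy windows in a fraction of the time
# a human would spend hand-correcting the same frames.
with torch.no_grad():
    print("denoising error:", loss_fn(model(X), y).item())
```

The point is less this specific model than the workflow it illustrates: once trained, running the model over a new take is seconds of GPU time rather than hours of an animator keyframing by hand.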

"It'll allow them to make more games - it'll allow smaller developers to make games that look even better," Scott says.

"It just helps the industry grow. As consumers, we get to enjoy all that great content. This will definitely help smaller developers. Facial animation is one example - as a small studio, you have to make trade-offs. Maybe you won't do facial animation because you don't have the resources to pay someone to do it. If there's a system that helps turn it from weeks of effort to hours of effort, that changes. You might want to do facial animation. Then there's things like text-to-speech, translation into different languages. There are are a lot of different areas that are going help all developers. It's going to help them raise the bar."

Hellblade is one of the titles that can use Nvidia's Ansel tech

The final piece of tech Nvidia is pushing is an addition to its GeForce Experience offering. This package allows users to record gameplay footage, highlights and so on. Now the GPU giant has launched a sort-of 'Instagram for games' in Ansel, which lets players pause the game, frame the perfect shot, add filters and share the result online.
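For a rough sense of what 'adding a filter' means in image terms, here's a toy example using plain Pillow on a saved screenshot. This is purely illustrative and is not Ansel's SDK, which hooks into the game itself to capture and grade frames in-engine; the file names below are hypothetical.

```python
# A toy 'vintage' filter on a saved screenshot - illustration only, not Ansel.
from PIL import Image, ImageEnhance

shot = Image.open("screenshot.png").convert("RGB")    # hypothetical capture file

graded = ImageEnhance.Color(shot).enhance(0.7)        # soften the saturation
graded = ImageEnhance.Contrast(graded).enhance(1.15)  # slightly punchier contrast

# Warm the tint by nudging the red channel up and the blue channel down.
r, g, b = graded.split()
r = r.point(lambda v: min(255, int(v * 1.08)))
b = b.point(lambda v: int(v * 0.92))
Image.merge("RGB", (r, g, b)).save("screenshot_filtered.png")
```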

"Most people have some kind of creative itch that they want to scratch. If they can't make games, the success of something like Minecraft where people were able to build things showed that," Scott says.

"Being able to take photographs of something that people do - and a lot of people get quite arsty about it. Just look at Instagram; people like to apply filters, they like to play with the images. That's one of the things that we've brought into Ansel. People are able to pause the world, something you normally can't do, frame the perfect shot. You can literally be a world class photographer. Some of the work you see that people do with games photography is stunning. That's real game content, it's not fake. It really just shows off the amount of effort that people put into games."

At the moment, Ansel is available in just a handful of games, such as Star Wars: Battlefront II and Ninja Theory's Hellblade. Asked what Nvidia's pitch to developers is, the team says it's more often a case of studios approaching them to ask how to get the tech into their games.

Using this functionality, Scott says, is also a great way to ensure that as many people as possible know about your game - both on launch and after.

"There's a whole aspect of it where studios see it as a way to engage the users beyond a game's initial life," he explains.

"If people are continually posting content from games - all of a sudden of DLC comes out, they're posting content about it. It builds a social platform almost of content that's going here, there and everywhere. One of the newest features was gif export. That just opens up a whole world of Twitch and Reddit where people love gifs. It was probably the most requested feature by far. It was almost like an avalanche of requests."

PCGamesInsider Contributing Editor

Alex Calvin is a freelance journalist who writes about the business of games. He started out at UK trade paper MCV in 2013 and left as deputy editor over three years later. In June 2017, he joined Steel Media as the editor for new site PCGamesInsider.biz. In October 2019 he left this full-time position at the company but still contributes to the site on a daily basis. He has also written for GamesIndustry.biz, VGC, Games London, The Observer/Guardian and Esquire UK.