{"id":11924,"date":"2022-12-20T15:15:01","date_gmt":"2022-12-20T15:15:01","guid":{"rendered":"https:\/\/blog.bwgamespot.com\/index.php\/2022\/12\/20\/nvidias-ai-generated-photo-based-graphics-will-blow-your-tiny-mind\/"},"modified":"2022-12-20T15:15:01","modified_gmt":"2022-12-20T15:15:01","slug":"nvidias-ai-generated-photo-based-graphics-will-blow-your-tiny-mind","status":"publish","type":"post","link":"https:\/\/blog.bwgamespot.com\/index.php\/2022\/12\/20\/nvidias-ai-generated-photo-based-graphics-will-blow-your-tiny-mind\/","title":{"rendered":"Nvidia&#8217;s AI-generated photo-based graphics will blow your tiny mind"},"content":{"rendered":"<div class=\"youtube-video\">\n<div class=\"video-aspect-box\"><\/div>\n<\/div>\n<p>If you thought Nvidia&#8217;s latest <a href=\"https:\/\/www.pcgamer.com\/nvidia-dlss-3-performance\/\" target=\"_blank\" rel=\"noopener\">DLSS 3<\/a> frame-rate speedifyin&#8217; technology was pretty trick, you ain&#8217;t, as they say, seen nothin&#8217;. May we introduce you to<a href=\"https:\/\/www.youtube.com\/watch?v=aQctoORQwLE\" target=\"_blank\" rel=\"noopener\"> NeRFs or Neural Radiance Fields,<\/a> a terrifyingly clever AI-accelerated method for generating full 3D scenes from a handful of photos.<\/p>\n<p>The idea is conceptually simple. Take a few 2D images of something. Run it through some AI algorithms. Render the thing or things \u2014 or an entire complex scene \u2014 in full 3D. And in real time.<\/p>\n<p>It&#8217;s similar to photo scanning to create 3D scenes but with added AI goodness. That&#8217;s the sales pitch, at least. NeRFs aren&#8217;t unique to Nvidia. But the company is doing some particularly interesting things with NeRFs that are relevant to PC gaming as opposed to movie making, which is where a lot of the noise around NeRFs has been so far.<\/p>\n<p>You can check out this in-depth video from <a href=\"https:\/\/www.youtube.com\/watch?v=YX5AoaWrowY\" target=\"_blank\" rel=\"noopener\">Corridor Crew<\/a> earlier this year showing just how incredible NeRF technology is for creating photorealistic video from just a few photos. But a word of warning. Once you&#8217;ve seen it, it&#8217;ll have you doubting if anything you see in video or film is real or re-rendered with NeRFs. It is <em>spooky<\/em>.<\/p>\n<p>Indeed, there are even NeRF-based apps available for Android and Apple smartphones, such as Luma AI, so you can try the technology out for yourself.<\/p>\n<p>Getting back to the PC, apart from the whole process being freakishly clever, it offers a number of immediate and obvious advantages. Instead of needing detailed 3D models and lots of bandwidth-heavy high-res textures, all it takes to render a complex scene is a few photos or images.<\/p>\n<div class=\"image-full-width-wrapper\">\n<div class=\"image-widthsetter\">\n<p class=\"vanilla-image-block\">\n<\/p><\/div>\n<\/div>\n<p><span class=\"caption-text\">Nvidia is implementing NeRFs with dramatically lower data usage. <\/span><span class=\"credit\">(Image credit: Nvidia)<\/span><\/p>\n<p>Nvidia is taking that to new extremes so-called &#8220;compact&#8221; NeRFs that require in the region of <a href=\"https:\/\/nv-tlabs.github.io\/vqad\/\">100 times less data to render a scene<\/a>. The benefits that can have in terms of bandwidth and storage are obvious enough.<\/p>\n<div class=\"fancy-box\">\n<div class=\"fancy_box-title\">Your next upgrade<\/div>\n<div class=\"fancy_box_body\">\n<div class=\"image-full-width-wrapper\">\n<div class=\"image-widthsetter\">\n<p class=\"vanilla-image-block\">\n<\/p><\/div>\n<\/div>\n<p><span class=\"credit\">(Image credit: Future)<\/span><\/p>\n<p><a href=\"https:\/\/www.pcgamer.com\/best-cpu-for-gaming\/\" target=\"_blank\" rel=\"noopener\"><strong>Best CPU for gaming<\/strong><\/a>: The top chips from Intel and AMD<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/best-gaming-motherboards\/\" target=\"_blank\" rel=\"noopener\"><strong>Best gaming motherboard<\/strong><\/a>: The right boards<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/the-best-graphics-cards\/\" target=\"_blank\" rel=\"noopener\"><strong>Best graphics card<\/strong><\/a>: Your perfect pixel-pusher awaits<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/best-ssd-for-gaming\/\" target=\"_blank\" rel=\"noopener\"><strong>Best SSD for gaming<\/strong><\/a>: Get into the game ahead of the rest<\/p>\n<\/div>\n<\/div>\n<p>However, previous implementations of NeRF technology required significant time to train and process the AI model. Nvidia&#8217;s &#8220;instant&#8221; NeRFs, by contrast, can do the whole process in seconds.<\/p>\n<p>All of this is obviously pointing in a pretty tantalising direction, namely using NeRFs for real-time game rendering. Imagine the potential bandwidth and performance replacing all that model, texture and lighting data for a game scene with a few images. And then add the kicker that the results look more realistic than conventional game engines and art.<\/p>\n<p>Listen, we&#8217;re not saying this stuff is just around the corner. But NeRF technology is evolving incredibly fast. The concept only emerged around two years ago and already it&#8217;s generating incredible results. Whether it will revolutionise PC gaming, well, time will tell. While we wait, you can deep dive into Nvidia&#8217;s research paper on the subject <a href=\"https:\/\/nv-tlabs.github.io\/vqad\/\" target=\"_blank\" rel=\"noopener\">here<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>[#item_image]Nvidia&#8217;s AI-generated photo-based graphics will blow your tiny mind<!-- wp:html --><\/p>\n<div class=\"youtube-video\">\n<div class=\"video-aspect-box\"><\/div>\n<\/div>\n<p>If you thought Nvidia&#8217;s latest <a href=\"https:\/\/www.pcgamer.com\/nvidia-dlss-3-performance\/\" target=\"_blank\" rel=\"noopener\">DLSS 3<\/a> frame-rate speedifyin&#8217; technology was pretty trick, you ain&#8217;t, as they say, seen nothin&#8217;. May we introduce you to<a href=\"https:\/\/www.youtube.com\/watch?v=aQctoORQwLE\" target=\"_blank\" rel=\"noopener\"> NeRFs or Neural Radiance Fields,<\/a> a terrifyingly clever AI-accelerated method for generating full 3D scenes from a handful of photos.<\/p>\n<p>The idea is conceptually simple. Take a few 2D images of something. Run it through some AI algorithms. Render the thing or things \u2014 or an entire complex scene \u2014 in full 3D. And in real time.<\/p>\n<p>It&#8217;s similar to photo scanning to create 3D scenes but with added AI goodness. That&#8217;s the sales pitch, at least. NeRFs aren&#8217;t unique to Nvidia. But the company is doing some particularly interesting things with NeRFs that are relevant to PC gaming as opposed to movie making, which is where a lot of the noise around NeRFs has been so far.<\/p>\n<p>You can check out this in-depth video from <a href=\"https:\/\/www.youtube.com\/watch?v=YX5AoaWrowY\" target=\"_blank\" rel=\"noopener\">Corridor Crew<\/a> earlier this year showing just how incredible NeRF technology is for creating photorealistic video from just a few photos. But a word of warning. Once you&#8217;ve seen it, it&#8217;ll have you doubting if anything you see in video or film is real or re-rendered with NeRFs. It is <em>spooky<\/em>.<\/p>\n<p>Indeed, there are even NeRF-based apps available for Android and Apple smartphones, such as Luma AI, so you can try the technology out for yourself.<\/p>\n<p>Getting back to the PC, apart from the whole process being freakishly clever, it offers a number of immediate and obvious advantages. Instead of needing detailed 3D models and lots of bandwidth-heavy high-res textures, all it takes to render a complex scene is a few photos or images.<\/p>\n<div class=\"image-full-width-wrapper\">\n<div class=\"image-widthsetter\">\n<p class=\"vanilla-image-block\">\n<\/div>\n<\/div>\n<p><span class=\"caption-text\">Nvidia is implementing NeRFs with dramatically lower data usage. <\/span><span class=\"credit\">(Image credit: Nvidia)<\/span><\/p>\n<p>Nvidia is taking that to new extremes so-called &#8220;compact&#8221; NeRFs that require in the region of <a href=\"https:\/\/nv-tlabs.github.io\/vqad\/\">100 times less data to render a scene<\/a>. The benefits that can have in terms of bandwidth and storage are obvious enough.<\/p>\n<div class=\"fancy-box\">\n<div class=\"fancy_box-title\">Your next upgrade<\/div>\n<div class=\"fancy_box_body\">\n<div class=\"image-full-width-wrapper\">\n<div class=\"image-widthsetter\">\n<p class=\"vanilla-image-block\">\n<\/div>\n<\/div>\n<p><span class=\"credit\">(Image credit: Future)<\/span><\/p>\n<p><a href=\"https:\/\/www.pcgamer.com\/best-cpu-for-gaming\/\" target=\"_blank\" rel=\"noopener\"><strong>Best CPU for gaming<\/strong><\/a>: The top chips from Intel and AMD<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/best-gaming-motherboards\/\" target=\"_blank\" rel=\"noopener\"><strong>Best gaming motherboard<\/strong><\/a>: The right boards<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/the-best-graphics-cards\/\" target=\"_blank\" rel=\"noopener\"><strong>Best graphics card<\/strong><\/a>: Your perfect pixel-pusher awaits<br \/>\n<a href=\"https:\/\/www.pcgamer.com\/best-ssd-for-gaming\/\" target=\"_blank\" rel=\"noopener\"><strong>Best SSD for gaming<\/strong><\/a>: Get into the game ahead of the rest<\/p>\n<\/div>\n<\/div>\n<p>However, previous implementations of NeRF technology required significant time to train and process the AI model. Nvidia&#8217;s &#8220;instant&#8221; NeRFs, by contrast, can do the whole process in seconds.<\/p>\n<p>All of this is obviously pointing in a pretty tantalising direction, namely using NeRFs for real-time game rendering. Imagine the potential bandwidth and performance replacing all that model, texture and lighting data for a game scene with a few images. And then add the kicker that the results look more realistic than conventional game engines and art.<\/p>\n<p>Listen, we&#8217;re not saying this stuff is just around the corner. But NeRF technology is evolving incredibly fast. The concept only emerged around two years ago and already it&#8217;s generating incredible results. Whether it will revolutionise PC gaming, well, time will tell. While we wait, you can deep dive into Nvidia&#8217;s research paper on the subject <a href=\"https:\/\/nv-tlabs.github.io\/vqad\/\" target=\"_blank\" rel=\"noopener\">here<\/a>.<\/p>\n<p><!-- \/wp:html --><\/p>\n","protected":false},"author":0,"featured_media":11925,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[20],"tags":[],"_links":{"self":[{"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/posts\/11924"}],"collection":[{"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/comments?post=11924"}],"version-history":[{"count":0,"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/posts\/11924\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/media\/11925"}],"wp:attachment":[{"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/media?parent=11924"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/categories?post=11924"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.bwgamespot.com\/index.php\/wp-json\/wp\/v2\/tags?post=11924"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}