Look at the 6:45-55 part where she drops in the ball of light. Suddenly the demo turns into a current/last-gen game. It's like someone flipped off the switch labeled "magic". The colors are off, the lighting is totally wrong, it feels like any other game. Yeah, that's the maximum "realism" you can expect to see in real games. It's still pretty nice, but far from what this promises...
The much-touted dynamic GI feels faked, because it is. It is not real ray-traced GI. It is slow to react to light changes, if it reacts at all, and there is no real color bleed (something that CiG is working on!)
Shadows remain in place, way too strong and sharp, regardless of the light's position. That is not how actual GI would work... I'm guessing it's smart occlusion-map trickery (very cheap to calculate in real time compared to any kind of actual GI). Thanks to the Nanite system it should be easy to pull off, since you have all the geometry data from the displacement maps... Yeah, "multiple billions of polygons drawn"... lol. That's dynamic displacement mapping up close, swapped for much cheaper normal mapping with low polycounts far away. Great idea, it's been in use for ages in film VFX, and it's nice to see it implemented in real time in a dynamic system. Well done there. That is the Only next-gen leap.
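The displacement-near / normal-map-far swap can be sketched roughly like this. To be clear, this is not Nanite's actual logic; the function names, distances, and tessellation factors are all made-up illustration of the general LOD idea:

```python
# Minimal sketch of distance-based detail swapping (assumption, not Nanite):
# up close, the mesh gets tessellated and displaced (real silhouette detail);
# past a threshold, a flat low-poly mesh with a normal map fakes the same
# relief for a fraction of the cost.

def detail_mode(distance_m: float, swap_at_m: float = 25.0) -> str:
    """Pick how surface detail is rendered at a given camera distance."""
    return "displacement" if distance_m < swap_at_m else "normal_map"

def tessellation_factor(distance_m: float, near_m: float = 5.0,
                        far_m: float = 25.0, max_factor: int = 64) -> int:
    """Subdivision level: maxed out near the camera, fading to none at far_m."""
    if distance_m <= near_m:
        return max_factor
    if distance_m >= far_m:
        return 1
    t = (far_m - distance_m) / (far_m - near_m)  # 1 at near, 0 at far
    return max(1, int(max_factor * t))

print(detail_mode(3.0), tessellation_factor(3.0))      # displacement 64
print(detail_mode(100.0), tessellation_factor(100.0))  # normal_map 1
```

The nice part about a continuous factor rather than a hard on/off swap is that the transition doesn't pop as you move toward a surface.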
I wonder how large that demo is in gigabytes. The billions upon billions of scanned polygons don't necessarily equate to a lot of data unless textured... But everything is textured, so :D I'm guessing at least 30 gigs for that small bit of demo, maybe more.
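For what it's worth, a back-of-envelope check lands in the same ballpark. Every number here is an assumption (8K texture sets, BC7-style block compression at roughly 1 byte per pixel, five maps per asset, about 100 unique assets), not anything Epic has published:

```python
# Rough size estimate for a demo full of scanned, textured assets.
# Assumptions: 8K maps, BC7-ish compression (~8 bits/pixel), 5 maps per
# asset (albedo, normal, roughness, AO, displacement), ~100 unique assets.

RES = 8192                    # 8K texture resolution
BYTES_PER_PIXEL = 1           # BC7 block compression is 8 bits per pixel
MAPS_PER_ASSET = 5
UNIQUE_ASSETS = 100

bytes_per_map = RES * RES * BYTES_PER_PIXEL        # 64 MiB per map
bytes_per_asset = bytes_per_map * MAPS_PER_ASSET   # 320 MiB per asset
total_gib = bytes_per_asset * UNIQUE_ASSETS / 2**30

print(f"~{total_gib:.0f} GiB")  # ~31 GiB
```

So "at least 30 gigs" is plausible even before you count geometry, audio, and everything else.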
So, my critique of this hyper super duper dynamic automated fake GI is pretty simple: can you change the fakery to suit your needs? Remember how much time it takes CiG to get a space station or moonbase or whatever lit the way they want it. Using a system that does things in one "physically correct" way may not allow for the mood lighting you're trying to achieve! Grim Hex gets re-lit with practically every second patch; it's freaking ridiculous at this point, but it shows how much work it takes to get things the way you want them even when you're using your own engine! Now imagine having to use a 3rd-party "realistic" lighting system like in this demo. Usually when something "just works", it means you have less control over how it works. One more reason I wanna try this for myself and see how flexible this magic is.
See how, when they turn off the GI in the beginning, they say "look, there is no light, everything's black!". That's pretty telling of the limitations of the hardware-and-engine combo. You never see more than 2 real dynamic lights on screen! Probably because if you up the light count, it crawls to a halt with that many high-quality assets. This has been the issue with every single game engine since time immemorial. There is nothing new about dynamic lights; we've had those for ages. The question has always been how many lights you can get away with before your PC explodes!
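The reason light count has always been the bottleneck is easy to model. In classic forward shading every shaded pixel is evaluated against every light, so the work grows linearly with the light count. The numbers below are purely illustrative (deferred and clustered renderers change the constants, not the basic scaling):

```python
# Toy cost model for forward shading: work is O(pixels * lights).
# "Cost" units are arbitrary; only the scaling matters here.

def shading_cost(pixels: int, lights: int, cost_per_light_px: float = 1.0) -> float:
    """Relative per-frame shading work for a simple forward renderer."""
    return pixels * lights * cost_per_light_px

pixels_4k = 3840 * 2160          # ~8.3 million shaded pixels
two_lights = shading_cost(pixels_4k, 2)
ten_lights = shading_cost(pixels_4k, 10)

print(ten_lights / two_lights)   # 5.0 — 5x the lights, 5x the work
```

That's why demos keep the dynamic light count tiny and lean on baked or faked GI for everything else.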
Another problem is working with scanned assets. First of all, they are a huge drain on your PC. While the engine may handle them fast, the software you're using to alter and shape them to your liking will Not!
It is very neat when you can scan a piece and use it as-is, but how many scans of demon lords or alien fighter jets are out there?
Yeah... The problem is you don't make the design to fit the assets, you make the assets to fit the design. In this case it worked well enough, because there wasn't much shown apart from some real Earth environment and some statues that were handmade in ZBrush. Using full assets that give you the geometry as well is awesome, but having to texture something like a Revenant from Doom this way, using only scanned materials, is impossible. You will never get the results you want, because it doesn't exist in real life lol. So you might as well do it like you did for the last 5 years.
Substance Designer and Painter have been out for ages, used in VFX and in Star Citizen as well. Search for the episode where they make spaghetti and other foods; that's the Substance suite they're using. It provides all the flexibility you need, it's very easy to use, it has "smart" PBR materials and fully customizable/programmable shaders, and you don't need to be stuck with Quixel's scanned assets.
As I'm writing this I'm making an asset. It's a simple fountain found in the courtyard of a castle. It's made of concrete (or stone, looks the same), nothing really fancy in its design. I have already spent half a day modeling the thing to look like the real one, and I'll spend another half day painting it up with Substance, because I'm gonna need at least 3-4 materials mixed up to get it right where the water was discoloring it and moss grew and shit. I checked the Quixel Megascans library to see if I could use something from there. Well, they have lots of materials, but there are already sites for that which are cheaper to use. As for full 3D assets, nope, nothing even remotely close. And it's not something special or outlandish like an alien fish, it's just a simple fountain lol...
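That "3-4 materials mixed up" workflow boils down to mask-based layering, which can be sketched in a few lines. The colors, masks, and single-texel setup below are made up for illustration; in practice each value would be a full texture map and the blend would run per texel in a shader:

```python
# Minimal sketch of mask-based material layering (Substance-style stacking):
# blend a base stone material with a water-stain layer, then moss on top,
# each weighted by its own mask. All values here are invented examples.

def lerp(a, b, t):
    """Linear blend between two RGB tuples by factor t in [0, 1]."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

stone       = (0.55, 0.53, 0.50)  # base albedo (RGB, 0..1)
water_stain = (0.25, 0.24, 0.22)  # dark discoloration where water ran
moss        = (0.20, 0.35, 0.12)  # green growth

stain_mask = 0.6   # how stained this texel is
moss_mask  = 0.3   # how mossy this texel is

albedo = lerp(stone, water_stain, stain_mask)  # stain layer first
albedo = lerp(albedo, moss, moss_mask)         # moss layered on top

print(tuple(round(c, 3) for c in albedo))
```

The masks are where the half day goes: painting where the water ran and where the moss grew is the actual artistry, the blend itself is trivial.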
Btw, notice how there are no objects you can see through, apart from the terrible water. It looks like it's from Morrowind, ffs. If you want to see the best water ever, check out AC: Origins (the Egypt one).
I wonder how their faked GI reacts to glass or a bottle. Probably not well enough to showcase.
All in all, it's a very smart demo, built perfectly to showcase something that will take actual game devs ages to get right, just like with every console iteration, especially given the very limited hardware of the "next-gen" consoles, which are already half a gen behind.
I do hope to see games that look this good, but I think it'll take many years still to actually get there. Maybe with the next "next-gen".