Unreal engine 5

Urist

Well-Known Member
Joined
May 4, 2020
Messages
687
Location
NULL Island
Some insane detail. Ambient occlusion is also very expensive to render.
Looks like our systems will be obsolete soon. :cry:
 

SauRoN

Active Member
Joined
May 2, 2020
Messages
493
And it can only be better on the Xbox Series X instead of the VapourStation5.


Sent from my iPhone
 

Urist

Well-Known Member
Joined
May 4, 2020
Messages
687
Location
NULL Island
And it can only be better on the Xbox Series X instead of the VapourStation5.


Sent from my iPhone
Don't know... seems like this was developed specifically for the weird GPU of the PS5, hope that is not the case... just bought a 2070 a few months ago.
 

SauRoN

Active Member
Joined
May 2, 2020
Messages
493
Don't know... seems like this was developed specifically for the weird GPU of the PS5, hope that is not the case... just bought a 2070 a few months ago.

Doubtful.

It’s just marketing spin, and I wouldn’t be at all surprised if there’s another, or even the same, demo on the XSX soon enough.

Also, the cross-compatibility licensing model for Xbox and Windows games might just see some friendlier support added to the engine to make that happen, which would be absent from the PS5 optimisation... which could be for better or worse.


Sent from my iPhone
 

Bryn

Active Member
Joined
May 3, 2020
Messages
121
Location
PE
This is the link you want, without the YouTube compression.

Some interesting discussions on Reddit about Unreal Engine 5. They mention in the video that some scenes have around 25 billion triangles. The current polygon budget for games is around 20 million. And that demo was running on a PS5. No matter how great the PS5 and Xbox Series X are, they are potatoes compared to enterprise-grade rendering farms and $10+k gaming rigs. So the tech is really insane.

Dudes in the VFX industry are saying that the entire industry will likely shift over to real-time video game engines and ditch conventional rendering that requires prohibitively costly GPU farms. One guy even said that some aspects of the tech demo couldn't be done in-house at a major film-effects studio he works at, and would require collaboration with their tech partner to render, be expensive and take ages.

It's not just supposedly the biggest leap for gaming in 20 years, but a massive leap for films and TV shows as well. It could result in big-budget effects for medium-budget TV shows and movies. Apparently much of The Mandalorian was already done in a similar way.

And Epic has made people even happier on top of that: all licensing fees are waived until you reach $1m in revenue and all the fancy online tech they developed for Fortnite has been made available to other devs for free.
 

Urist

Well-Known Member
Joined
May 4, 2020
Messages
687
Location
NULL Island
I used to do V-Ray renderings for architects; the global illumination and photo-realism on that renderer is something else. Rendering a single scene in software could take hours, and one of the biggest culprits was global illumination. I wouldn't even get close to creating a scene that looks like a single frame of this video.
The idea that a game engine running on consumer hardware is now approaching that kind of quality at 60+ fps... Poor 3D artists have their work cut out for them.
Unreal indeedy
 
Last edited:

Bryn

Active Member
Joined
May 3, 2020
Messages
121
Location
PE
I used to do V-Ray renderings for architects; the global illumination and photo-realism on that renderer is something else. Rendering a single scene in software could take hours, and one of the biggest culprits was global illumination. I wouldn't even get close to creating a scene that looks like a single frame of this video.
The idea that a game engine running on consumer hardware is now approaching that kind of quality at 60+ fps... Poor 3D artists have their work cut out for them.

I think one aspect not being discussed is the positive impact this will have for platforms like Stadia and GeForce Now. The size of game installations is going to increase tremendously. I think both Sony and MS knew that around 1TB was the bare minimum they could provide. The hassle of multiple game disks or very costly storage expansion is going to be a pain in the ass for consoles, whereas streaming platforms don't need storage space, don't consume your bandwidth for downloads and can work on any platform and hardware. Suddenly the value proposition is looking a bit better.
 

Urist

Well-Known Member
Joined
May 4, 2020
Messages
687
Location
NULL Island
I think one aspect not being discussed is the positive impact this will have for platforms like Stadia and GeForce Now. The size of game installations is going to increase tremendously. I think both Sony and MS knew that around 1TB was the bare minimum they could provide. The hassle of multiple game disks or very costly storage expansion is going to be a pain in the ass for consoles, whereas streaming platforms don't need storage space, don't consume your bandwidth for downloads and can work on any platform and hardware. Suddenly the value proposition is looking a bit better.
Didn't Google's streaming thing fail miserably? I'd rather have lower gfx than latency; even Steam streaming on my own LAN is not really cutting it.
 

Bryn

Active Member
Joined
May 3, 2020
Messages
121
Location
PE
Didn't Google's streaming thing fail miserably? I'd rather have lower gfx than latency; even Steam streaming on my own LAN is not really cutting it.

It's still going, so not a failure. But no, it hasn't been especially well-received thus far. GeForce Now doesn't seem to have the same latency issues, so whatever problems Google is having are presumably due to Stadia being a relatively new product.

I'm talking more about game streaming in general and in the future. I think it's pretty obvious that at some point all gaming will be cloud gaming. Without the issue of latency, the benefits are just tremendous.
 

Urist

Well-Known Member
Joined
May 4, 2020
Messages
687
Location
NULL Island
Maybe some form of hybrid technology... with the actors locally in a streamed environment. I don't see how that could work in an FPS though.
Alternatively, something like Microsoft's humongous Flight Simulator that streams the data as it becomes required.
 

Bryn

Active Member
Joined
May 3, 2020
Messages
121
Location
PE
Maybe some form of hybrid technology... with the actors locally in a streamed environment.

Nah, but to a limited extent there are already predictive algorithms in use that attempt to minimise the perception of input lag. Google is the king of machine learning, so I would be surprised if Stadia doesn't end up being the king of cloud gaming.
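
For what it's worth, the classic trick online games use to hide input lag is client-side prediction: apply the player's input locally straight away, then reconcile once the server's authoritative state arrives. A pure streaming service can't do exactly this, since the game runs entirely server-side, so Stadia's approach leans more on speculation and machine learning, but here's a toy C++ sketch of the general idea (all names and structures are made up for illustration):

```cpp
#include <deque>

// Toy sketch of client-side prediction: apply the player's input locally right
// away, remember it, and when the server's authoritative state arrives, rewind
// to that state and replay whatever inputs the server hasn't processed yet.
struct Input { int sequence; float move; };
struct State { float position = 0.0f; };

// Stand-in for the real game simulation step.
State Simulate(State s, const Input& in) {
    s.position += in.move;
    return s;
}

struct PredictingClient {
    State predicted;            // what gets shown on screen immediately
    std::deque<Input> unacked;  // inputs the server hasn't confirmed yet

    void OnLocalInput(const Input& in) {
        unacked.push_back(in);
        predicted = Simulate(predicted, in);  // no waiting on the network
    }

    void OnServerState(const State& authoritative, int lastProcessedSeq) {
        // Drop inputs the server has already seen, then re-apply the rest on
        // top of the authoritative state, so the view stays responsive without
        // drifting away from what the server says actually happened.
        while (!unacked.empty() && unacked.front().sequence <= lastProcessedSeq)
            unacked.pop_front();
        predicted = authoritative;
        for (const Input& in : unacked)
            predicted = Simulate(predicted, in);
    }
};
```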
 

wizardofid

Active Member
Joined
May 2, 2020
Messages
372
But yeah... billions of polygons for a single scene... Crysis has finally met its match.
I'll try to explain a bit later; there are differences between vertices, triangles, and polygons, even if a scene contains a lot of polygons. I'll do a long-ass post later tonight. While the rendering is impressive, there is quite a bit of behind-the-scenes stuff people might not be aware of. Having been in the industry for close to 15 years come next year, especially in level design and polygon modelling, I can say there is a lot more to it than just throwing polygons at a screen.
 

wizardofid

Active Member
Joined
May 2, 2020
Messages
372
So here goes a long ass post.


While the scene may contain lots of polygons, quite a bit of the informational data has been pre-rendered and baked during the construction and design of the level. This information is kept in compressed data chunks, which can be converted into raw data for the GPU and CPU to use easily and render on screen. Polygon rendering does take quite a bite out of the engine's rendering budget, but since DX10 you have things like instancing, which was originally only used on characters and was later extended to actual level assets; it basically means an object and all its applicable passes are rendered once and then repeated at a fraction of the polygon cost. Then you have things like LOD (level of detail), not to be confused with reducing render distance: it reduces texture size and model quality based on distance from the player camera. Anything not in the camera view is also not being rendered, so while a scene might have lots of polygons, the entire scene is never rendered all at once; no matter how strong the gaming system is, it wouldn't be able to cope. Only what you can see in the camera's viewing cone is being rendered, and that amounts to a couple of million polygons at a time.
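
Very roughly, the "only draw what the camera can see, at the right LOD" idea boils down to something like the sketch below. This is just a toy C++ illustration of the principle, not how UE5 or any particular engine actually implements it, and all the structures and numbers are made up:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical mesh with a few pre-built LOD levels (most detailed first).
struct Mesh {
    std::vector<std::uint32_t> lodTriangleCounts; // e.g. {100000, 25000, 5000, 800}
    float boundingRadius;                         // bounding-sphere radius
    float x, y, z;                                // world-space position
};

struct Camera {
    float x, y, z;      // position
    float maxDrawDist;  // anything farther than this is skipped entirely
};

static float DistanceTo(const Mesh& m, const Camera& c) {
    const float dx = m.x - c.x, dy = m.y - c.y, dz = m.z - c.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Count how many triangles actually reach the GPU this frame: skip meshes
// beyond the draw distance (a stand-in for a real frustum/occlusion test)
// and pick a coarser LOD the farther a mesh is from the camera.
std::uint64_t CountRenderedTriangles(const std::vector<Mesh>& scene, const Camera& cam) {
    std::uint64_t total = 0;
    for (const Mesh& m : scene) {
        if (m.lodTriangleCounts.empty()) continue;
        const float d = std::max(0.0f, DistanceTo(m, cam) - m.boundingRadius);
        if (d > cam.maxDrawDist) continue;                 // culled: costs nothing
        // Crude LOD pick: one step coarser for every 50 units of distance.
        std::size_t lod = static_cast<std::size_t>(d / 50.0f);
        lod = std::min(lod, m.lodTriangleCounts.size() - 1);
        total += m.lodTriangleCounts[lod];
    }
    return total; // typically a few million, even if the level "contains" billions
}
```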

People have mentioned using the engine to render a movie, and while that's possible, there are still severe rendering limitations, especially with things like water, fire, fluids, cloth, etc. Quite a bit of this simply can't be done in realtime; it takes a lot of computing time to calculate and render fluids and things like fire. The cloth simulation you may see in games has either been animated or has limited calculations and functionality. While the technical aspects and visuals of game engines are awesome, their biggest weapon will always be trickery and optimising resources. You still have limited memory resources: each and every texture in use takes up GPU memory, and the cost grows steeply with the size of the texture, so quite a bit of it is reused within a scene to save resources. In a game scene, especially a busy one, it is physically impossible for every single object in the level to have its own unique texture.
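
To put a rough number on the texture memory point, here's a back-of-the-envelope calculation for plain uncompressed RGBA textures with their mip chains. Real engines use block compression, so the actual figures are several times smaller, but the scaling is the same: roughly 4x the memory every time you double the resolution.

```cpp
#include <cstdint>
#include <cstdio>

// Rough VRAM cost of one uncompressed RGBA8 texture including its full mip chain.
// Block compression (BC/DXT) cuts this by 4-8x in practice, but the scaling is
// what matters: doubling the side length roughly quadruples the memory.
std::uint64_t TextureBytesRGBA8(std::uint32_t width, std::uint32_t height) {
    std::uint64_t bytes = 0;
    for (;;) {
        bytes += static_cast<std::uint64_t>(width) * height * 4; // 4 bytes per texel
        if (width <= 1 && height <= 1) break;
        if (width  > 1) width  /= 2;
        if (height > 1) height /= 2;
    }
    return bytes;
}

int main() {
    // Each step up from 1K to 2K to 4K costs roughly 4x the VRAM.
    std::printf("1024x1024: %6.1f MiB\n", TextureBytesRGBA8(1024, 1024) / 1048576.0);
    std::printf("2048x2048: %6.1f MiB\n", TextureBytesRGBA8(2048, 2048) / 1048576.0);
    std::printf("4096x4096: %6.1f MiB\n", TextureBytesRGBA8(4096, 4096) / 1048576.0);
    // Give a few hundred unique objects their own 4K texture sets and the budget
    // is gone, which is why games atlas and reuse textures so aggressively.
    return 0;
}
```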

This content pack on Steam in many instances uses as few as 8 textures in total:


[screenshots of the content pack]


While not nearly as impressive as the Unreal Engine demo, this is essentially what game development is all about: working with severely limited resources. Realtime computing and calculations aren't anywhere close to being able to do all of this on the fly. Animation studios make use of render software like applo, and some animation and movie studios from time to time also use things like Softimage and 3ds Max to render specific CGI scenes.

To be honest, the quality and technical aspects of using a game engine for an animated film, or even a live-action hybrid, simply won't match what animation studios use currently. The movie Shrek took 5 million CPU rendering hours, and that was in 2001. At the time they used shader technology that wasn't even available in games, and quite a bit of it still isn't being used because it's simply too costly; in many instances they needed to create entirely new shaders to deal with various things, because the tech wasn't around.

So while you could render an entire movie in Unreal 5, it would be no match in terms of scope, detail and quality. Movie studios render a single frame at a time; they have no need for a constant, playable frame rate, which means they can throw as much detail as they want into a particular scene, with the only cost being the time to render that frame.
 

Bryn

Active Member
Joined
May 3, 2020
Messages
121
Location
PE
@wizardofid

At around 1:45 in the vid:

"Speaking of lighting, all of the lighting in this demo is completely dynamic. With the power of Lumen, that even includes multi-bounce global illumination. No light maps, no baking here."

Also, I don't think the VFX guys on Reddit were talking about making entire movies with UE5, but rather using it for very specific tasks, much like what has already been done with The Mandalorian. They seemed to think the cost and time savings would be tremendous.
 