WingedKagouti on 14/5/2020 at 14:20
Quote Posted by icemann
So if the average decent AAA production game present day takes 3 - 5+ years to make, games under this new engine will take 10?
They're aiming for the opposite. A lot of the time spent in development of those large titles goes into making and optimizing the various graphics assets. If UE5 makes a lot of the currently (semi-)manual graphics optimization redundant, there's the potential for a noticeable amount of time to be saved.
SubJeff on 14/5/2020 at 15:14
Does look good. Feels like an ad for the PS5, which I'm probably going to get anyway.
Judith on 14/5/2020 at 17:13
Quote Posted by WingedKagouti
They're aiming for the opposite. A lot of the time spent in development of those large titles goes into making and optimizing the various graphics assets. If UE5 makes a lot of the currently (semi-)manual graphics optimization redundant, there's the potential for a noticeable amount of time to be saved.
It seems like people sculpting stuff in Zbrush and making megascans would be able to skip the optimisation phase. I wonder how it will bump memory and disk space / download requirements for games, though. We already have >50 gig games, which are a nuisance to download on an average connection.
The realtime GI / "infinite bounces" with Lumen is exciting though. No more time wasted baking lightmaps, which can take hours for a fairly finished level. Lighting artists will have a blast if it's as good as it looks.
Renzatic on 14/5/2020 at 17:40
Quote Posted by Judith
It seems like people sculpting stuff in Zbrush and making megascans would be able to skip the optimisation phase. I wonder how it will bump memory and disk space / download requirements for games, though. We already have >50 gig games, which are a nuisance to download on an average connection.
Here's a question for you: if UE5 is able to use high res models straight out of Zbrush en masse, would you even need normal and heightmaps anymore?
Judith on 14/5/2020 at 18:37
If you didn't need normalmaps, that could lower the disk space cost, provided the high-res models didn't take up much more. I'd have to make a comparison, but something tells me that e.g. a million-polygon .fbx model needs much more space than a 4k normalmap.
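Quick back-of-envelope (the byte counts are my assumptions, nothing measured: an uncompressed 24-bit 4k map with ~4:1 block compression, versus raw vertex data with positions, normals, UVs, and 32-bit indices):

Code:
# rough size estimate: 4k normalmap vs. a 1M-triangle mesh
normalmap = 4096 * 4096 * 3 / 4               # 24bpp, ~4:1 DXT-style compression: ~12.6 MB
tris = 1_000_000
verts = tris // 2                             # closed mesh: roughly one vertex per two tris
mesh = verts * (12 + 12 + 8) + tris * 3 * 4   # pos + normal + UV per vertex, 3 indices per tri
print(f"normalmap ~{normalmap / 1e6:.1f} MB, mesh ~{mesh / 1e6:.1f} MB")  # ~12.6 vs ~28.0

So yeah, if those assumptions are anywhere near right, the mesh comes out a couple of times bigger even before any FBX overhead.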
trefoilknot on 14/5/2020 at 18:52
Karras, hm? I don't trust it...
Renzatic on 14/5/2020 at 18:57
Quote Posted by Judith
If you didn't need normalmaps, that could lower the disk space cost, provided the high-res models didn't take up much more. I'd have to make a comparison, but something tells me that e.g. a million-polygon .fbx model needs much more space than a 4k normalmap.
Right. You'd only need diffuse and spec maps, the former of which can be compressed somewhat. Without high-definition normals, heightmaps (does UE4 use 16BPP for those?), and multiple LOD models, you'd probably be looking at a slight surplus in relative asset size.
demagogue on 15/5/2020 at 02:05
The guy in the video said directly that there's no need for a normal map or LOD models.
I could probably look this up, but I'd imagine the file size of a model is proportional to the number of vertices, so a million-poly model shouldn't be smaller than 3MB or whatever.
If you watch the distant geometry in some parts of that video, you'll see it morphing, which looks like dynamic LOD, and I guess it would have to be if it's allowing an unbounded line of sight.
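If I had to sketch the selection rule (pure guesswork on my part, nothing from the video), it'd be something like "use the coarsest version whose geometric error projects to under a pixel on screen":

Code:
# hypothetical screen-space-error LOD pick, not Epic's actual method
import math

def pick_lod(lod_errors, distance, fov_deg=90.0, screen_height=1080):
    # lod_errors[i]: max deviation (world units) of LOD i from the full mesh,
    # with LOD 0 = full detail (error ~0) and higher indices coarser
    px_per_unit = screen_height / (2 * distance * math.tan(math.radians(fov_deg) / 2))
    for lod in range(len(lod_errors) - 1, -1, -1):  # try coarsest first
        if lod_errors[lod] * px_per_unit <= 1.0:    # error shrinks below a pixel
            return lod
    return 0  # nothing coarse enough works, fall back to full detail

print(pick_lod([0.0, 0.01, 0.05, 0.2], distance=50.0))  # -> 2 at 50 units out

The morphing would then just be blending between two of those picks as the camera moves, so you don't see it pop.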
He also said something about specular when the flashlight was reflecting off the ceramics, which makes it sound like it's more than just a vanilla specular map, but I didn't really catch the significance. I don't think it was procedurally computed specular and diffuse, but who knows what they're capable of anymore.
bob_doe_nz on 15/5/2020 at 03:15
Well looks like the System Shock remake is going to be pushed back to 2022.
Renzatic on 15/5/2020 at 03:28
I'm just wondering what the real bottleneck is for this. We're all making it sound like it's just WHATEVER DUDE! FUCK THE POLYGONS! JUST GO FOR IT! But come on, there has to be some limitation here.
Even with some ultra snazzy tessellation at work, nigh unlimited polygon budgets seem too good to be true.
Quote:
I could probably look this up, but I'd imagine the file size of a model is proportional to the number of vertices, so a million-poly model shouldn't be smaller than 3MB or whatever.
They can get surprisingly large. I'll try and run a quick experiment, see how big I can make a mesh on my piddly machine, and look at the .fbx file size.
edit: A 3.14 million tri sphere with a smiley face sculpted on it gave me an 84 meg .fbx file. I was also surprised at how well Blender handled a polycount that high.
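edit 2: for what it's worth, 84 megs is in the right ballpark for raw vertex data (rough math, assuming a binary .fbx with positions, normals, and 32-bit indices, and roughly one vertex per two tris):

Code:
# sanity check on the 84 MB figure, all guesses about what's actually in the file
tris = 3_140_000
verts = tris // 2                 # closed mesh: V ~= T/2
positions = verts * 3 * 4         # 3 floats per vertex: ~18.8 MB
normals = verts * 3 * 4           # same again: ~18.8 MB
indices = tris * 3 * 4            # 3 x 4-byte indices per tri: ~37.7 MB
print(f"~{(positions + normals + indices) / 1e6:.0f} MB before any FBX overhead")  # ~75 MB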