Redneckerz Posted May 16, 2020 (edited)

13 hours ago, icecoldduke said: Just take a look at my linkedin profile :)

It would be great if you could link to it, because I do not know your LinkedIn profile. :)

Quote
Tell me, why are you so vigorously defending UE5?

I explain the tech and note its current pitfalls. How is that defending, exactly? Feel free to disprove the statements made instead of offering passing commentary on how you shipped UE4 titles (without examples).

Quote
Truth is, if the new Apature GPU's from NVIDIA are 2x to 4x faster, then Nanite is obsolete anyway, because DXR can handle way more triangles then they are pushing with Nanite.

... You sure you know what you are saying? DXR (the API) and Nanite are inherently not the same thing. Hell, PS5 more than likely uses a low-level API rather than a Microsoft-made one, similar to GNM in PS4.

Quote
Nanite will be for low end PC's and consoles at best at this point.

Unless you have direct insider access, with intimate knowledge of how (for instance) the streaming tech works, you cannot possibly deduce that.

13 hours ago, lazygecko said: So the whole thing about just dropping in movie quality source assets... I have serious doubts over this way of doing things being adopted wholesale for entire games. Can you imagine just how fucking huge the file sizes would become if that were done for virtually every asset? We are already seeing tensions right now as the ballooning size of games seems to have outpaced the standards of consumer grade storage space, with games pushing past 100gb starting to become commonplace. I think this would definitely push it way past the breaking point.

Instead of second-guessing, please consider watching the videos to find out how they deal with that problem.

Quote
And that's not even getting into the other elephant in the room which is the draconian state of bandwidth cap enforcement by the American telecom industry.
That's why the demo used procedural textures, if I remember correctly.

Edited May 17, 2020 by Redneckerz
FractalBeast Posted May 16, 2020

Oh shit, throwing LOD in the trash! That's pretty fucking cool.
icecoldduke Posted May 16, 2020 (edited)

38 minutes ago, FractalBeast said: Oh shit, throwing LOD in the trash! That's pretty fucking cool.

LODs are still going to be a thing, so your statement is very wrong.

42 minutes ago, Redneckerz said: ... You sure you know what you are saying? DXR (the API) and Nanite are inherently not the same thing.

Raytracing is the future, not rasterization, so why use Nanite when raytracing is the path forward? The only reason games can't go full raytracing yet, and we still need rasterization, is that the RT cores on RTX GPUs aren't powerful enough yet. With Apature potentially being 5nm (7nm capacity is all taken up by AMD, and Turing was 12nm), 5nm is going to allow a lot of transistors on the GPU die, which means Apature is going to blow Turing out of the water. Which means we might be able to ditch rasterization altogether and go full raytracing for some games. All of that means Nanite will be for old PCs and consoles; it's obsolete before it's even released :), since new PCs will just use DXR exclusively, with no need for new ways to do rasterization.

Quote
Hell PS5 more than likely uses a low level API rather than a Microsoft made one, similar to GNM in PS4.

That doesn't matter unless you're CPU bound. If you're GPU bound, even getting closer access to the metal doesn't do anything if the GPU can't keep up with the work you're sending it.

Edited May 16, 2020 by icecoldduke
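The CPU-bound versus GPU-bound point above can be illustrated with a toy model. This is a deliberately simplified sketch (real frame pipelining is more involved), and the function name and timings are hypothetical, not from any engine:

```python
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """With CPU and GPU work overlapped across frames, frame time is set
    by whichever side is the bottleneck (simplified pipelined model)."""
    return max(cpu_ms, gpu_ms)

# A lower-level API mainly cuts CPU submission cost. If the GPU is the
# bottleneck, halving the CPU time changes nothing:
print(frame_time_ms(8.0, 20.0))  # → 20.0 (GPU-bound)
print(frame_time_ms(4.0, 20.0))  # → 20.0 (still GPU-bound)
```

The sketch only captures the argument being made: closer-to-the-metal submission helps when `cpu_ms` dominates, and does nothing when `gpu_ms` does.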
Redneckerz Posted May 17, 2020 (edited)

13 hours ago, icecoldduke said: LODs are still going to be a thing, so your statement is very wrong.

If you watched the videos you would know that it's not entirely wrong, at least not by the way LODs are currently defined.

EDIT: I should iterate on this some more, because your response is also half true. UE5 automatically manages LODs for the models in-game; the difference is that previously artists had to do this manually, by creating separate models derived from a normal-mapped low-poly model, which was in turn derived from a high-poly model. What UE5 does is simply scale the high-quality movie-grade asset based on distance. Because a micropolygon renderer is used, you can automatically add or remove polygons depending on the distance on the fly, which is not something you could do without micropolygons or the implementation of the REYES algorithm that they attempt to use in UE5.

Quote
Raytracing is the future, not rasterization, so why use Nanite when raytracing is the path forward?

See, it's stuff like this where I have to place doubt on what you are saying and what you were stating prior. I have to assume you are mixing things up here unintentionally. Nanite is not the component that deals with lighting; Lumen is. Lumen is (as Alex puts it) another way of doing global illumination, with multiple bounces and tracing through the micropolygon renderer. It's not a wholesale replacement for raytracing; it's rather an alternative to how global illumination is currently done (which is often baked and using cubemaps). Lumen works in and with screenspace. UE5 will support RTX raytracing as an additional improvement for reflections and shadows alike.

Quote
The only reason games can't go full raytracing yet, and we still need rasterization, is that the RT cores on RTX GPUs aren't powerful enough yet.
With Apature potentially being 5nm (7nm capacity is all taken up by AMD, and Turing was 12nm), 5nm is going to allow a lot of transistors on the GPU die, which means Apature is going to blow Turing out of the water.

You mean Ampere. And this has very little to do with what I said, since you say this on the basis that Nanite has something to do with raytracing.

Quote
Which means we might be able to ditch rasterization altogether and go full raytracing for some games.

Eventually, down the line, yes. But in the here and now, full-scene raytracing is very compute-heavy. Besides, it's not as if the XSX/PS5 were marketed as doing full-scene raytracing in the first place; they are simply marketed as consoles with hardware raytracing support.

Quote
All of that means Nanite will be for old PCs and consoles; it's obsolete before it's even released :), since new PCs will just use DXR exclusively, with no need for new ways to do rasterization.

What all of it means is that you do not know what you are talking about, since you think Nanite is the component that deals with lighting when it does not. Lumen is simply an in-between alternative, and UE5 will support RTX raytracing either way (and I reckon an AMD equivalent will appear on consoles as well).

Quote
That doesn't matter unless you're CPU bound. If you're GPU bound, even getting closer access to the metal doesn't do anything if the GPU can't keep up with the work you're sending it.

Unless you have direct insider information, you have no idea what the limitations of the PS5's/XSX's RDNA2 GPUs are. Heck, these things aren't even out yet, so I'd say it's close to 100% certain that you have no idea whether or not UE5 is GPU bound using these techniques, especially when this stuff is still 1.5 years away from us.

PS: Could you please link to your LinkedIn and highlight which UE4 titles you worked on?

Edited May 17, 2020 by Redneckerz (iterating on the LODs part)
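The distance-based detail scaling described above (roughly one triangle per pixel, added or removed on the fly) can be sketched as a back-of-envelope calculation. This is not Epic's algorithm — the function, its parameters, and the per-pixel target are all illustrative assumptions about the micropolygon idea:

```python
import math

def lod_halvings(cluster_tris: int, bounds_radius: float, distance: float,
                 fov_deg: float = 90.0, screen_px: int = 1080) -> int:
    """How many times to halve a cluster's triangle count so that each
    remaining triangle projects to roughly one pixel (toy model)."""
    # Projected diameter of the cluster's bounding sphere, in pixels.
    proj_px = (2 * bounds_radius
               / (2 * distance * math.tan(math.radians(fov_deg) / 2))) * screen_px
    # Aim for ~1 triangle per pixel of projected area.
    target_tris = max(1.0, proj_px * proj_px)
    return max(0, math.ceil(math.log2(cluster_tris / target_tris)))

print(lod_halvings(1_000_000, 1.0, 1.0))    # → 0  (up close: keep full detail)
print(lod_halvings(1_000_000, 1.0, 100.0))  # → 14 (far away: halve detail 14 times)
```

The point the post makes falls out of the model: the detail level is a continuous function of distance computed per frame, rather than a handful of artist-authored LOD meshes.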
icecoldduke Posted May 17, 2020 (edited)

5 hours ago, Redneckerz said: Nanite is not the component that deals with lighting; Lumen is. Lumen is (as Alex puts it) another way of doing global illumination, with multiple bounces and tracing through the micropolygon renderer. It's not a wholesale replacement for raytracing; it's rather an alternative to how global illumination is currently done (which is often baked and using cubemaps). Lumen works in and with screenspace. UE5 will support RTX raytracing as an additional improvement for reflections and shadows alike.

You know raytracing isn't just used for lighting, right? Before we continue, how much do you actually know about raytracing?

Edited May 17, 2020 by icecoldduke
Redneckerz Posted May 17, 2020

1 hour ago, icecoldduke said: You know raytracing isn't just used for lighting, right?

I know. Raytracing is not only applicable to graphics; it can also be used in audible space, for instance. I don't see how that question affects what you have quoted, though.

1 hour ago, icecoldduke said: Before we continue, how much do you actually know about raytracing?

Is there a point in answering this question when:
- You didn't reference my answers to the questions you asked earlier
- You aren't answering mine

PS: Could you please link to your LinkedIn and highlight which UE4 titles you worked on?
Rampy470 Posted May 17, 2020

Good luck carrying anything even slightly close to that level of fidelity throughout an entire game. If the time required doesn't get ya, the budget will.
icecoldduke Posted May 17, 2020 (edited)

47 minutes ago, Redneckerz said: I know. Raytracing is not only applicable to graphics; it can also be used in audible space, for instance. I don't see how that question affects what you have quoted, though. Is there a point in answering this question when: You didn't reference my answers to the questions you asked earlier; You aren't answering mine. PS: Could you please link to your LinkedIn and highlight which UE4 titles you worked on?

I'm not quoting some of the bits you've said, because I think you've taken a good portion of what I said and interpreted it incorrectly. I think that's because you don't have a firm grasp of what raytracing is (from a mathematical point of view), so I'm trying to understand what your level of knowledge in computer graphics is.

Nanite is only useful as long as we are using rasterization. Once you get into raytracing and DXR, you can vomit lots of triangles at the system as long as you have the memory for the BVH tree. You can do that today without Nanite on RTX hardware. You can do that today without Lumen on DXR hardware. Your assumption that Nanite will be for next-gen computer graphics on PCs is incorrect; at best it's going to be for next-gen graphics on consoles, because higher-end PCs might be able to ditch rasterization with next-gen NVIDIA GPUs. Do you understand that concept?

My concern with consoles is: how much of the frame budget is this system actually taking up? If they can barely hit 30fps with a single character and this system running, it's going to be useless even on consoles.

As far as my resume, I'm not going to post it here publicly :).

Edited May 17, 2020 by icecoldduke
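The "memory for the BVH tree" constraint mentioned above can be put into rough numbers. This is a back-of-envelope sketch, not DXR's actual acceleration-structure layout (which is vendor-specific and opaque); the per-node and per-triangle byte counts and the triangle count are assumed, illustrative values:

```python
def bvh_memory_bytes(num_tris: int, tris_per_leaf: int = 4,
                     bytes_per_node: int = 64, bytes_per_tri: int = 48) -> int:
    """Estimate a BVH's footprint: a binary BVH over L leaves has 2L - 1
    nodes; add the raw triangle data on top. All sizes are illustrative."""
    leaves = -(-num_tris // tris_per_leaf)  # ceiling division
    nodes = 2 * leaves - 1
    return nodes * bytes_per_node + num_tris * bytes_per_tri

# E.g. a single movie-grade asset with 33 million triangles (illustrative
# figure) already lands in the gigabyte range before any compression:
print(bvh_memory_bytes(33_000_000) / 2**30)  # ≈ 2.46 GiB
```

Whatever the exact constants, the estimate scales linearly with triangle count, which is why "vomit lots of triangles at the system" trades Nanite's streaming and clustering for raw memory.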
Redneckerz Posted May 17, 2020

12 minutes ago, icecoldduke said: I'm not quoting some of the bits you've said, because I think you've taken a good portion of what I said and interpreted it incorrectly.

Feel free to point out what that was. But I've addressed you in full, and all I see you doing is:
- Ignoring my questions
- Asking random questions in return in order to gauge my "knowledge", which is a bad-faith device
- Demonstrating that you mixed up Nanite and Lumen (I am assuming you didn't do so on purpose) and then failing to address that

Stop that.

12 minutes ago, icecoldduke said: I think that's because you don't have a firm grasp of what raytracing is (from a mathematical point of view).

Again, feel free to point that out. I'll grant you that no, I am not a programmer (and I can imagine admitting that will cause the appropriate reaction), so no, I do not know the exact formulas, because math is not my forte. What I do know is what I have written so far. Perhaps you meant something else, but nobody is able to foresee what you actually meant. I am simply addressing what you wrote, not what you might have meant. With that said, I already answered your question about my level of knowledge, but you apparently didn't deem it good enough, or you wouldn't keep mentioning this.

12 minutes ago, icecoldduke said: Nanite is only useful as long as we are using rasterization.

I haven't said anything otherwise. It's virtual geometry, for rasterization. The fact that UE5 supports RTX should tell you that I am obviously not talking about wholesale raytracing here. Hell, no console or RTX card does pure raytracing, as every game out there still relies on rasterization in the first place. You cannot possibly think I was talking about wholesale raytracing here. Yes, you can push a lot of triangles with RT and DXR - but that's not what Nanite does.
But I'll give you one bone - the Nanite demo has been found to run at better-than-PS5 settings on an RTX notebook, or so Epic China claims.

12 minutes ago, icecoldduke said: Your assumption that Nanite will be for next-gen computer graphics on PCs is incorrect; at best it's going to be for next-gen graphics on consoles, because higher-end PCs might be able to ditch rasterization with next-gen NVIDIA GPUs.

You are using "might", so you don't know whether the next generation of NVIDIA GPUs (Ampere) is going to ditch rasterization altogether. It would definitely be a very, very ballsy move on NVIDIA's end, but it's a theory I don't deem realistic, given how heavily every game released out there still relies on rasterization. And full raytracing, although seeing slow adoption, is still in its infancy. But the major reason I call that theory unrealistic is, in fact, next-gen consoles. If new NVIDIA GPUs completely ditch how games are rendered, how are games made for them gonna work on consoles? Heck, how are NVIDIA GPUs gonna run existing games if the rasterization model is completely removed? Emulate it? Rasterization will stick around for a long, long time, so what you are talking about is realistically not the next generation of hardware, but the generation after that (at the very least).

12 minutes ago, icecoldduke said: Do you understand that concept?

One thing I do not take kindly to is acting exceptionally smug when you have yet to provide any verification for your earlier claims.

12 minutes ago, icecoldduke said: My concern with consoles is: how much of the frame budget is this system actually taking up?

My concern with your question is: why aren't you asking Epic? They are on Twitter, after all! Or ask the Digital Foundry team. There is so much left unspoken by Epic that is not answered yet, which I reckon they will answer in the forthcoming months.
12 minutes ago, icecoldduke said: If they can barely hit 30fps with a single character and this system running, it's going to be useless even on consoles.

- It's a tech demo
- It's in development
- You should not push your opinion as if it's definitive fact, because you aren't an insider

12 minutes ago, icecoldduke said: As far as my resume, I'm not going to post it here publicly :).

I have now asked three times for a simple link to your LinkedIn profile, because I don't know what yours is. You said earlier that I should just look it up, and that you worked on UE4 titles (without giving examples). Not wanting to link to your LinkedIn (although that makes suggesting I look it up rather odd) is one thing, but this whole dog-and-pony act of making passing claims about your work and then, when asked, refusing to do the easy thing (either listing the UE4 titles you worked on, or linking to your actual LinkedIn) is just annoying. You aren't making yourself more important with those claims, and you certainly aren't getting yourself taken seriously by consistently ignoring the request.
icecoldduke Posted May 17, 2020 (edited)

NVIDIA is NOT ditching rasterization; you as the developer might be able to ditch rasterization on higher-end PCs on newer NVIDIA GPUs. In which case you don't need Nanite, because the next-gen NVIDIA GPUs might be powerful enough to ditch rasterization. That's my point. This would mean you would have two different PC SKUs, one for older-gen hardware and one for new, but if you want to push movie-quality assets in your game, DXR is the correct way forward on PC. I'm simply arguing that Nanite in its current form doesn't offer more than DXR on PC, and I'm not sure what it offers console titles until we see performance metrics. Do we disagree with anything there? You're simply reciting marketing bits, and I'm countering with solid technical arguments. I think we are at an impasse at this point.

Edited May 17, 2020 by icecoldduke
Redneckerz Posted May 17, 2020

1 hour ago, icecoldduke said: NVIDIA is NOT ditching rasterization; you as the developer might be able to ditch rasterization on higher-end PCs on newer NVIDIA GPUs. In which case you don't need Nanite, because the next-gen NVIDIA GPUs might be powerful enough to ditch rasterization. That's my point.

It has no technical basis. You are simply saying the above once more. If that counts as a "solid technical argument" in your book, then it is only in your book, with chapters you wrote yourself.

1 hour ago, icecoldduke said: DXR is the correct way forward on PC.

I thought this was about UE5 on PS5 (you know, the tech demo).

1 hour ago, icecoldduke said: I'm simply arguing that Nanite in its current form doesn't offer more than DXR on PC, and I'm not sure what it offers console titles until we see performance metrics. Do we disagree with anything there?

The only thing you are discussing is an argument (and subsequent claims) that you have made up for yourself.

1 hour ago, icecoldduke said: You're simply reciting marketing bits, and I'm countering with solid technical arguments.

It is always convenient, when you don't know what you actually mean, to blame the other party for answering the questions you asked and responding to what you have written. I can't predict what you mean, but I can respond to what you write.

1 hour ago, icecoldduke said: I think we are at an impasse at this point.

PS: Could you please link to your LinkedIn and highlight which UE4 titles you worked on?
Kronecker–Capelli Posted May 18, 2020

On 5/16/2020 at 11:38 PM, icecoldduke said: Nanite will be for low end PC's and consoles at best at this point.

But the tech demo video clearly indicates that it was recorded on a PS5 console, so... PC is still superior, at least in computational power.