Edward850 Posted August 18, 2015

VGA said: http://zdoom.org/wiki/CVARs:Display#Video_adapter I never had to use those cvars and commands myself, but since it's relevant ...

Not even close to relevant.

Maes said: I only owned a Win8 dual-GPU laptop for a short stint, and I recall that at least ZDaemon and GZDoom used the Intel HD, while getting them to use the "good" stuff required some fucking around with a special menu.

No fucking needed. You just needed to add an entry for GZDoom in the driver control panel. Since then, GZDoom flags both discrete Nvidia and ATI GPUs, so it's no longer necessary.
Maes Posted August 19, 2015

kb1 said: But OpenGL works differently.

And with its own set of problems, most definitely, made even worse on systems that don't have dedicated video RAM (as is probably the case with Intel-anything). Since video RAM (and DMA accesses in general) are uncached, and in shared-memory systems AGP/PCIe doesn't mean anything, you can imagine what this means when you have sprite overdraw and it's the card's hardware pulling the same sprite data again and again, and writing the result back to main RAM. BUS CONTENTION AHOY! ;-)

Edward850 said: Not even close to relevant. No fucking needed. You just needed to add an entry for GZDoom in the driver control panel. Since then, GZDoom flags both discrete Nvidia and ATI GPUs, so it's no longer necessary.

Well, that was enough fucking for me ;-)
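To put rough numbers on the bus-contention point, here is a back-of-envelope sketch. Every figure in it (resolution, overdraw factor, cost per layer) is an illustrative assumption, not a measurement:

```cpp
#include <cstdio>

// Back-of-envelope estimate of the memory traffic sprite overdraw can cause
// on a shared-memory GPU. All figures are illustrative assumptions; with
// these numbers it prints roughly 4 GB/s.
int main() {
    const double width = 1920, height = 1080; // assumed render resolution
    const double overdraw = 4.0;              // assumed average layers drawn per pixel
    const double bytes_per_pixel = 4.0;       // 32-bit color
    const double fps = 60.0;

    // Each overdrawn layer costs at least one texel read plus one framebuffer
    // write, and with no dedicated VRAM both of them hit main RAM, where the
    // CPU is also trying to work.
    const double traffic = width * height * overdraw * bytes_per_pixel * 2.0 * fps;
    std::printf("~%.1f GB/s of RAM traffic contending with the CPU\n", traffic / 1e9);
    return 0;
}
```

Several GB/s of sustained traffic is a big slice of what DDR3-era shared memory offers in total, which is why sprite-heavy scenes hurt integrated chips so badly.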
Edward850 Posted August 19, 2015

Maes said: Well, that was enough fucking for me ;-)

Should probably note that the amount of alleged menu fucking I base that on was NVIDIA and its easy-to-use control panel. I'd hate to think how it worked with ATI, and thus it's probably for the best that said flags exist now.
Panopticon Posted August 19, 2015

Weird... I've been running Sunder using a Celeron and HD Graphics, and so far I've had minimal problems.
Maes Posted August 20, 2015

Depends on the HD model, and on whether you were running OpenGL or not. In software mode, Intel-anything hasn't really got any reason to be crap: it's all plain framebuffering no matter what you're using (and shared RAM might even be at a slight advantage here). But for anything OpenGL/Direct3D (or pretending to be so)... just nope.

I wonder what Intel's performance would be if their chips were coupled with dedicated VRAM... but AFAIK that might not even be possible: they are all designed to work exclusively off shared RAM. The only one that didn't was the ancient Intel 740/Auburn, but even that one stored textures preferentially in main RAM, hoping that the "advances" of the then-new AGP connector would make video RAM irrelevant. Guess what... it didn't.
Guest Posted August 22, 2015

Iris Pro has EDRAM, similar to the Xbox One, but I think the driver controls it; the programmer doesn't have access to it.

Is it feasible to convert the map to a precompiled 3D mesh? Quake and newer engines have occlusion in all directions and there are rooms over rooms, but Doom maps don't have rooms over rooms (except via hacks and bridges).
Maes Posted August 22, 2015

Naruto9 said: Is it feasible to convert the map to a precompiled 3D mesh?

Yes, and most hardware-accelerated 3D ports actually do so upon loading a map. However, when dealing with low-end graphics cards or sprite-heavy maps, all bets are off.
Graf Zahl Posted August 22, 2015

Maes said: Yes, and most hardware-accelerated 3D ports actually do so upon loading a map.

Name one which does and performs well. The currently fastest GL ports, GLBoom+ and GZDoom, only convert the flats to precompiled data. For walls it's not really feasible, because they can change at any time during the game, and the maintenance overhead easily nullifies any advantage this might bring.
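A minimal sketch of the maintenance problem Graf describes. All types and names here are hypothetical, invented for illustration; they are not taken from GZDoom or any other port:

```cpp
// A wall quad's vertices depend on live sector heights, so a mesh built at
// load time would need patching every time a door, lift or crusher moves.
struct Sector { float floorz, ceilingz; };  // mutated by the game every tic
struct Vertex { float x, y, z; };

void EmitWallQuad(const Sector& front, float x1, float y1, float x2, float y2,
                  Vertex out[4]) {
    out[0] = { x1, y1, front.floorz };
    out[1] = { x2, y2, front.floorz };
    out[2] = { x2, y2, front.ceilingz };  // moves whenever the ceiling moves
    out[3] = { x1, y1, front.ceilingz };
}
```

Flats get away with precompilation because a sector's floor and ceiling polygons keep their 2D shape; a height change shifts every vertex by the same amount, which is comparatively cheap to patch.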
Maes Posted August 23, 2015

Uhhhhhh... I was under the impression that the various GLNODES lumps (when present) were exactly that, precalculated and complete mesh representations of a Doom map, and that ports with OpenGL/Direct3D support featured built-in node (re)builders for just that purpose. Now, whether existing generators are optimal for the job and/or existing source ports make good use of that data, that's another story. But please don't tell me that there are source ports which try to do per-frame, on-the-fly mesh generation? O_o
GooberMan Posted August 23, 2015

GL nodes only define completely closed subsectors, plus the additional vertices required to describe them. How a port translates that into an internal format is entirely up to the port author.
Graf Zahl Posted August 23, 2015

Maes said: Uhhhhhh... I was under the impression that the various GLNODES lumps (when present) were exactly that, precalculated and complete mesh representations of a Doom map, and that ports with OpenGL/Direct3D support featured built-in node (re)builders for just that purpose. But please don't tell me that there are source ports which try to do per-frame, on-the-fly mesh generation? O_o

There is no such thing as a 'mesh' here. Most GL ports render a map sector by sector, wall by wall. Some put vertex data into a vertex buffer; most don't. While it brings some speed-up, the effect of vertex buffers is relatively minor. The main problem with a 2.5D engine like Doom is that the number of vertices per draw call is extremely low, which means that modern graphics hardware cannot really be pushed to its limits. As things stand, most of the time is spent doing BSP traversal and visibility clipping, not actually rendering the stuff.
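For the curious, this is roughly what that per-frame BSP walk looks like: a simplified sketch loosely modeled on R_RenderBSPNode from the released Doom source, with the bounding-box and clipping tests omitted. It makes the bottleneck visible: the renderer recurses through the whole tree on the CPU, and every leaf becomes its own tiny draw batch.

```cpp
#include <cstdio>

struct Node { float x, y, dx, dy; int children[2]; };

constexpr int NF_SUBSECTOR = 0x8000;  // flag marking a child as a leaf

void DrawSubsector(int num) { std::printf("draw subsector %d\n", num); }  // stub

int PointOnSide(float px, float py, const Node& n) {
    // Cross product against the partition line picks the near child.
    return ((px - n.x) * n.dy - (py - n.y) * n.dx) > 0 ? 0 : 1;
}

void RenderBSPNode(const Node* nodes, int num, float vx, float vy) {
    if (num & NF_SUBSECTOR) {
        DrawSubsector(num & ~NF_SUBSECTOR);  // one small batch per leaf
        return;
    }
    const Node& n = nodes[num];
    const int side = PointOnSide(vx, vy, n);
    RenderBSPNode(nodes, n.children[side], vx, vy);      // near side first
    RenderBSPNode(nodes, n.children[side ^ 1], vx, vy);  // then the far side
}

int main() {
    // One partition line through the origin pointing along +y, a leaf per side.
    const Node nodes[] = {{0, 0, 0, 1, {1 | NF_SUBSECTOR, 0 | NF_SUBSECTOR}}};
    RenderBSPNode(nodes, 0, 5.0f, 5.0f);
    return 0;
}
```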
SavageCorona Posted August 25, 2015

Well, if you're expecting high framerates on an Intel integrated chip on Chillax of all fucking wads, then I have some news for you, my friend.
Guest Posted September 15, 2015

Made a room and placed a player start. In such an empty room there is no difference between software and OpenGL. Then I placed a hundred torches and other deco in there; OpenGL is always slower in this case. Large maps are a slideshow, 2-3 fps in OpenGL even if all monsters are removed.
Maes Posted September 15, 2015

You can disable the torch and lighting effects, I think. It's those that eat up a lot of GPU horsepower, real quick. If you can live without the pretty colors, go ahead. One cannot expect to run e.g. nuts.wad with full bling and get no slowdown when the sparks start flying (heh, literally).
Guest Posted September 15, 2015

There is no way to run OpenGL at all. Unless the map is as simple as E1M1, Entryway, or other vanilla maps, it is not playable. Any large map is totally unplayable: 5 fps without lighting, 3 fps with lighting. With mods that add Doom 3 monsters and 3D weapons, it drops to 1 fps or less.
VGA Posted September 16, 2015

Then use software mode until you get better hardware :-D
joe-ilya Posted September 21, 2015

Use PrBoom+, that port is so good it can run Planisphere 2 smoothly (with a little mapping patch).
blueinferno776 Posted September 22, 2015

PrBoom+ handles actors much faster than GZDoom in both of its renderers. ZDoom should run very well on your Intel part.
rampancy Posted September 28, 2015

Naruto9 said: There is no way to run OpenGL at all. Unless the map is as simple as E1M1, Entryway, or other vanilla maps, it is not playable. Any large map is totally unplayable: 5 fps without lighting, 3 fps with lighting. With mods that add Doom 3 monsters and 3D weapons, it drops to 1 fps or less.

Time to buy a video card. I'd say get a GTX 960. Then again, I've got an old 470 and it played Doom fine. But you're going to have to spend some money like everyone else. On the bright side, most of the stuff that people go to OpenGL for is possibly detrimental to the game: trilinear filtering just blurs the shit out of low-res textures, and high resolution just makes the game look dated by showing you stuff you could never see at 320x200. Having said that, I usually play GZDoom with all the filtering off at 1080 since that's my native res. Someday someone will add a feature to simulate 320x200 at 1080 :) GZDoom is awesome.

Edit: I just wanted to clarify that I'm not being a smartass; I really would like to see the 320x200 simulator. That would be the most awesome thing.
Linguica Posted September 28, 2015

Will GZDoom not play in 320x200? I guess that's kinda low for a modern engine.
Graf Zahl Posted September 28, 2015

GZDoom can play in 320x200, but modern graphics drivers no longer have such a video mode available. It also never tries to draw to an offscreen frame buffer and then render that to the screen. That wouldn't make much sense anyway: on hardware that has pixel fill rate problems with a Doom engine, it would more likely result in a slowdown than a speedup.
rampancy Posted September 28, 2015

I was just trying to think of a way to have the game look like 320x200 while actually running at my native 1080, because normal 320 is very blurry and shit-looking with all the scaling and stretching the monitor does. Maybe every one pixel would draw as multiple identical neighboring pixels or something? Do you see what I'm getting at here? I'm terribly unclear sometimes. I'd just like to be able to view the game as the artists originally intended it, while hanging on to all the cool tricks GZDoom brings to the table (32-bit color, great audio options, perspective-correct mlook, XInput, wad compatibility, etc).
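For what it's worth, the usual way to build such a "320x200 simulator" in a GL engine (not how GZDoom worked at the time, per Graf above) is to render into a 320x200 framebuffer object and blit it up with nearest-neighbor filtering, so every game pixel becomes a crisp block. A sketch using standard OpenGL 3.0 calls; window and context setup are omitted, and a loader such as GLEW is assumed to be included already:

```cpp
GLuint fbo = 0, tex = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 320, 200, 0, GL_RGBA,
             GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

// ... draw the frame at 320x200 into the FBO here ...

// Stretch to a centered 5x block (1600x1000) inside a 1920x1080 window.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, 320, 200,       // source: the whole low-res buffer
                  160, 40, 1760, 1040,  // dest: 5x scale, centered
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
```

One wrinkle: 5x integer scaling keeps the pixels square, but aspect-correct 320x200 (displayed as 4:3) would need a 6x vertical stretch to 1200 pixels, which doesn't fit a 1080p screen. Pixel-perfect scalers usually make you choose between square pixels and correct aspect.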
rampancy Posted September 28, 2015

Gez said: Chocolate Doom does that.

You're right, thanks for the heads up! That is what I'm talking about. My problem is that I've used GZDoom predominantly for years and change is hard, lol. And I feel like I'd be losing some things to gain others.
VGA Posted September 29, 2015

Try the lowest aspect-correct resolution for your monitor and disable all filtering.
rampancy Posted September 29, 2015

I'm pretty sure I have, but I will check it all out again. IIRC I have a choice between a tiny image in the middle of the screen and nasty-looking scaling.
david_a Posted September 29, 2015

Maes said: I wonder what Intel's performance would be if their chips were coupled with dedicated VRAM... but AFAIK that might not even be possible: they are all designed to work exclusively off shared RAM. The only one that didn't was the ancient Intel 740/Auburn, but even that one stored textures preferentially in main RAM, hoping that the "advances" of the then-new AGP connector would make video RAM irrelevant. Guess what... it didn't.

There is a Broadwell variant that has 128MB of EDRAM on the package, intended to help out the GPU. I think it provided a nice graphics boost, but I haven't seen a benchmark. When you weren't using the embedded GPU it would act as a 128MB L4 cache, which apparently provided an insane speedup; that CPU still beats the latest Skylake in CPU-bound games. People were annoyed with Intel for having no intention of making a top-of-the-line Skylake equivalent.

The main issue hurting the Intel GPUs is that they have a very limited power/transistor budget compared to discrete GPUs. If you scaled the Intel design up to the same Nvidia/AMD-level gigantic power-sucking beasts coupled with gigabytes of super-fast RAM, they would probably be extremely competitive. I kinda wonder how good their drivers are (especially the OpenGL ones), but I have a feeling they're better than the AMD ones. I had to yank an AMD card from my new work PC because their POS drivers wouldn't render desktop applications correctly.
Graf Zahl Posted September 29, 2015

From experience I can tell that Intel's GL drivers have their own share of issues. While working on GZDoom I encountered some really, really odd things with the GLSL compiler, which just refused to compile proper shaders.
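Not GZDoom's actual code, but the standard defensive pattern when shipping against shaky GLSL compilers is to check every compile and read back the driver's own error log, rather than assuming a valid shader went through. A quick sketch:

```cpp
#include <cstdio>
#include <GL/glew.h>  // or any loader providing the GL 2.0+ entry points

GLuint CompileShader(GLenum type, const char* src) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        // Dump the driver's own diagnostics; on a buggy compiler this is
        // often the only clue about what it objected to.
        char log[4096];
        glGetShaderInfoLog(shader, sizeof log, nullptr, log);
        std::fprintf(stderr, "shader compile failed:\n%s\n", log);
        glDeleteShader(shader);
        return 0;  // caller can fall back to a simpler shader
    }
    return shader;
}
```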
david_a Posted September 30, 2015

For better or worse, most of their effort probably goes into the Direct3D drivers. I'm fairly pleased with how well the HD 4000 handles things, but GZDoom is probably the most modern thing I've thrown at it.
rampancy Posted October 1, 2015

david_a said: For better or worse, most of their effort probably goes into the Direct3D drivers.

Well, with consoles all being powered by AMD, and games being designed around consoles, it would make sense that Direct3D is going to be the target for the foreseeable future, right? Although doesn't AMD have their own API now? Mantle? Or am I way off base?