Everything posted by Graf Zahl

  1. Back in the day I had contact with some people working in the industry, thanks to my very first commercial projects, and it wasn't uncommon for these people to outright dismiss C as a viable language for game development. This was before Doom, of course - by then they had realized that C was the future. For Windows 3.1 we were using WinG back then - yes, one of those half-assed attempts to get better performance - to get the images to the screen. Windows 95 was indeed a quantum leap, but it still took several years for games to be developed on Windows - and even in 1998 there were still several games made for DOS. Regarding C: yes, they may have gotten the message, but looking at some code bases of that time it is abundantly clear that some game programmers did not adjust their coding style. Lots of games may have been C, but they look like someone did assembly-style programming in it.

I don't know how these worked, but for Itanium the problem was that the compiler was responsible for explicitly scheduling the instructions for parallel execution. This turned out to be a nightmare for compiler developers, so in the end the potential power could never be used. Ultimately it turned out to be a lot more effective to let the hardware deal with the problem than the software. The concept for this type of CPU was developed when hardware was a lot less advanced, so it looked like a good idea. Too bad that some people stuck with it until the bitter end even when its weaknesses became evident.
  2. ... and that was primarily owed to the fact that back in the day the game developers had to relearn their entire trade. They went from gaming-centric home computers to full-fledged PCs at the same time they had to transition from assembly to C/C++ and from direct bare-metal hardware access to modern OSs that abstract away all the grittiness of the hardware behind far more polished APIs. Windows meant a total paradigm change on virtually all accounts for them, and as often happens with people deeply entrenched in a certain way of doing things, the initial reaction was to reject the change and dismiss the platform - so we still got DOS titles for years to come, and even with Windows 95 it took a long time to migrate. And by the time the developers were ready, Windows 3.1 was long gone. I, on the other hand, worked in a multimedia company back then that already focused on Windows development, so it was more natural to do our game on Windows too, and also to focus on simpler, cheaper and less risky titles.
  3. The quality issue was mostly due to "real game developers" not using Windows 3.1, not to Windows itself. It was possible to do something decent on that platform, but back in the day most game developers were still hacking away on the Amiga in assembly language, while Windows was meant for high-level language development - so most of those game programmers wouldn't have anything to do with such a "lowly" platform. For trivia's sake, here's a video of a Windows 3.1 game I was the lead developer on, although it was released in 1994 initially, so maybe a bit late in Win 3.1's life cycle. https://www.youtube.com/watch?v=cXVl_GBR36g Yes, that's 16-bit Windows, written in C++ (not C!) - but we had a great graphics artist working on that title.
  4. Itanium never succeeded anywhere. The only reason it got some moderate market share on workstations is that HP had a huge stake in it and forcefully pushed it. In that regard it is a lot like Windows Phone: it was clear from the outset that it would fail, yet some party with a vested interest in making it succeed tried hard but ultimately got nowhere. The last 10 years have mainly been the inevitable wind-down of a failed business that could not just be terminated without risking some very pissed-off customers. The writing had been on the wall many, many years ago that Itanium was a design failure. Everybody realized that, except HP. The entire idea behind it, i.e. moving significant parts of the code execution logic into the compilers, was just unviable and completely closed the door on all kinds of hardware-based optimizations. You'd have had to recompile everything to take advantage of better CPUs, and to serve everything well you'd have had to provide multiple binaries for different CPU generations. I am sure that without HP's involvement Intel would have shut it down a lot earlier (i.e. long before bringing it to market).

Tying game speed to the clock speed was more a thing of the 80's, when games were written in assembly and virtually everything was an 8086 at 4.77 MHz. By the 90's this had already become a totally unusable technique, because we were fully in the clock speed race by then and you couldn't take clock speed for granted anymore - the 80386 came out in 1985 and by the beginning of the 90's was already becoming the standard CPU for PCs. Far more common was tying the game to the video refresh rate, because that was a lot more reliable than CPU speed.

The main problem back then was that virtually all experienced programmers came from a world where they had to write their code in assembly and now had to learn C - but they often used it just like they used assembly, with poor data organization, inefficient function design and a tendency to pass data between functions through global variables. Modern code design guidelines did not exist yet, and code quality was overall a lot messier than today. You can see some of this even in Doom, and even more in the Build games.
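A hypothetical sketch of this assembly-in-C style next to the parameter-passing style modern guidelines favor (the names and code here are invented for illustration, not taken from any actual 90s code base):

```c
/* Assembly-style C: functions communicate through globals, treating
 * them like CPU registers, so data flow between calls is invisible. */
int g_x, g_y, g_result;

void add_points(void)
{
    g_result = g_x + g_y;   /* reads and writes hidden shared state */
}

/* The same operation with modern data flow: all inputs and outputs
 * pass through the function signature, no hidden state involved. */
int add_points_modern(int x, int y)
{
    return x + y;
}
```

The first version still "works", but every call site depends on invisible state, which makes the code hard to follow, hard to reuse and impossible to make reentrant - exactly the kind of mess described above.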
  5. What Windows 3.1 games? There aren't really that many - before Windows 95 the vast majority of games were made for DOS, and after that it took years for Windows to finally displace it. The Windows 3.1 titles I can remember are mainly cheap titles aimed at casual users.
  6. There's no such thing as ghosts. Repeat: There's no such thing as ghosts!
  7. 3D floors by themselves are not an issue - each single 3D floor costs roughly as much as the base sector it is in. The number of sectors is what matters, so 3D terrain will lead to the same effects, and portals have their own performance bottlenecks. In all cases, it's the number of elements that counts.
  8. The code was also in Quake, which you can see if you follow the link. But as it is just a small self-contained utility function for the AI pathfinding, it doesn't negate anything that was said. The actors in these games are still totally incompatible objects.
  9. BSP was never used for hitscan tracing. The only non-rendering tasks the BSP gets used for are a) finding out what sector something is in and b) sight checks. And the latter was only done after id was unable to fix their far more efficient blockmap-based algorithm.

None. Whatever you believe, Doom's and Quake's engines are so fundamentally different on every conceivable level that the only way to make a "combined" engine would be to put both full sets of code into the same executable. There's simply not a single piece of game-side code that can be shared. Furthermore, Quake (1+2) are strictly client/server based, so even seemingly trivial stuff gets sent across the network, even in single-player games. You cannot even combine Doom with Build because, despite both being based on a 2D map format, there's very little to share - although in this case you could at least have a unified UI that serves both, if it wasn't for the incompatible licenses.
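For illustration, a minimal sketch of how a Doom-style BSP answers task a), finding the sector a point is in: walk from the root node down to a leaf, doing one side-of-line test per tree level. The structure mirrors the idea behind Doom's R_PointInSubsector, but the types and field names here are invented for the example, not the engine's actual ones:

```c
#include <stddef.h>

/* One BSP node: a partition line plus two children. A NULL child
 * marks a leaf, in which case sector[side] holds the answer. */
typedef struct bsp_node {
    double px, py;                  /* a point on the partition line  */
    double dx, dy;                  /* the partition line's direction */
    const struct bsp_node *child[2];
    int sector[2];                  /* used when child[i] == NULL     */
} bsp_node;

/* Cross product decides which side of the partition the point is on. */
static int point_on_side(const bsp_node *n, double x, double y)
{
    return ((x - n->px) * n->dy - (y - n->py) * n->dx) <= 0.0 ? 0 : 1;
}

/* Descend from the root to a leaf: O(tree depth) side tests total. */
int sector_for_point(const bsp_node *root, double x, double y)
{
    const bsp_node *n = root;
    for (;;) {
        int side = point_on_side(n, x, y);
        if (n->child[side] == NULL)
            return n->sector[side];
        n = n->child[side];
    }
}
```

This is why "ripping out the BSP" buys you little: a handful of multiplies per tree level is about as cheap as a spatial query gets.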
  10. Why would you rip out the BSP? For what it does there's not really much out there that can do it faster. The real problems come with map sizes that cannot even be done on modern engines - there are reasons why they do with models much of what in Doom needs to be done with linedefs and sectors.
  11. Either bridge things or invisible 3D floors, whatever suits best, sometimes you may even use invisible 3D midtextures. Of course when doing a simple sector bridge, a 3D floor will still be the best way to go, but once you have to split stuff into a large number of small sectors it may make sense to use models with some other means to block their space.
  12. It's just that you cannot compile game scripts directly to native code because then you are platform locked. Quake 2 went that route with its game DLLs and now we face the problem that many old mods come with a DLL and no source so they are dead in the water on anything but Windows with strongly backwards compatible ports. Quake C compiles to some bytecode, just like ZScript does internally. Whether that bytecode is interpreted or further processed by a JIT compiler is a completely different matter and may even depend on the port or the hardware platform being used.
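To make the distinction concrete, here is a minimal sketch of the bytecode model (the opcodes and encoding are invented for the example; neither QuakeC's nor ZScript's VM actually looks like this). The "compiled" script is just an array of integers, so the same data runs on any platform the interpreter is built for:

```c
/* Invented opcode set for a tiny stack-based VM. */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_HALT };

/* Interpret a bytecode stream until OP_HALT, returning the top
 * of the stack. The stream is pure data, hence fully portable. */
int run_bytecode(const int *code)
{
    int stack[64];
    int sp = 0;                          /* stack pointer   */
    for (int pc = 0; ; ) {               /* program counter */
        switch (code[pc++]) {
        case OP_PUSH: stack[sp++] = code[pc++];        break;
        case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
        case OP_MUL:  sp--; stack[sp - 1] *= stack[sp]; break;
        case OP_HALT: return stack[sp - 1];
        }
    }
}
```

A Quake 2 style game DLL compiles the same logic straight to machine code, which is faster out of the box but ties the mod to one OS and CPU; with bytecode, a port is free to interpret it, or JIT-compile it where that pays off, without the mod author reshipping anything.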
  13. Not only that - they also restrict interaction with the game environment quite a bit. ZScript can treat a lot of data that is built on engine startup like static constants. Any precompiled language could not do that.
  14. Whatever suits you, but it'd be a lot of help knowing up front that this might extend the working set of data.
  15. If you want you can check out the test build I posted for benchmarking here: That already contains the new backend.
  16. I think that system will be better served by the upcoming GLES2 backend due to be released in GZDoom 4.7.0. So far the numbers we got show that on virtually everything that can run OpenGL 4.x it performs at least as well as LZDoom, even on hardware where LZDoom has a performance advantage over GZDoom's full GL backend.
  17. Seconding what Ketmar said. What you run into here are simply the limitations of the engine: there's only so far you can go with a Doom-style BSP and inherently non-multithreadable play code. The only real problem is performance with thousands of monsters, i.e. stuff like NUTS or ultra-slaughtermaps, where the added complexity of the engine just adds up. The biggest performance issue is actually CPU cache performance, so trying to improve code to make things go faster will immediately be nullified by the next cache stall.

That's very much it. Virtually all graphics hardware from the last 15 years has been designed for modern rendering techniques, which mostly means rendering large meshes with very few draw calls. Doom is quite the opposite: here you render very coarse meshes with a high number of draw calls. This gives an utterly shitty performance ratio, and if the driver adds its own bottlenecks it won't take long until performance tanks. All that said, a modern low-spec computer with a decent CPU (it doesn't need to be high end, but it also shouldn't be bottom-of-the-barrel entry-level stuff!) will still be able to display most levels at 100+ fps easily. On my brother's recent €500 laptop even a map like Frozen Time, which is the worst performance killer I know, still runs at 46 fps on its integrated chipset.
  18. Now we are genuinely taking things too far. :P
  19. No. It's just one of the sectors around the exit switch. What makes you think it's joined?
  20. They didn't use it because their editor wasn't capable of creating non-continuous sectors. Remember: the editor was purely linedef based and used a textual map format that had to be compiled into the final output. Sectors were only generated in this step without any chance for the mapper to link two parts together.
  21. By far the biggest issue with XP today is that modern toolchains are losing support for targeting it - and even if they still can, the result may be compromised on more modern versions of Windows, meaning that many developers won't bother anymore. So it not only gets harder to get stuff compiled for XP, but also to get the needed dependencies in a form that's compatible with XP. And it is mainly the latter that makes this job hard even for those who still want to support it.
  22. Unless you still have a true GL 2 card, you'd be precisely the kind of person whose numbers I'd like to see. The GLES backend is supposed to replace LZDoom on the hardware it can run on.
  23. Thanks. That's precisely the kind of hardware I'd like to see results for - where the GLES backend shows a clear performance advantage and we can do a reasonable check how it stacks up to LZDoom. BTW, what's up with Darkest Hour and GZDoom? Those checks are missing. Did it crash or what?
  24. We already got results for such systems so it's probably not worth going through all the hassle of running the tests manually on Linux. It's really just those old cards I listed in my first post where we haven't seen any results for yet.
  25. No. You should post the generated 'benchmarks.zip' file.