Everything posted by Graf Zahl

  1. Seriously, if you want to use recently compiled software, XP or lower is just a ticking time bomb. Some developers may still care, but most don't, as there is virtually no market share left. And even if you find a port that still cares, don't forget that the libraries it depends on may eventually stop caring if support becomes too much of a hassle. If this were solely for playing vintage '90s Windows games, I'd probably go for Windows 98 SE: that's the best version of the platform these games were initially made for, and the one least likely to cause problems.
  2. Can that nonsense at least be switched off? The first thing I did on Windows 7 was to disable all those fugly translucency effects that didn't improve anything and only got in the way.
  3. You mean the original software renderer? Of course it starts lagging at higher resolutions: there are a lot more pixels to calculate, and all of it has to be done on the CPU, which quickly results in slowdowns. The GPU won't make much of a difference here. The only remedy is a faster CPU - or a hardware accelerated source port. Sorry to disappoint you.
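     To put rough numbers on that scaling (illustrative arithmetic, not profiler output - the resolutions are just examples):

         #include <cstdio>

         int main() {
             // A pure software renderer's per-frame work grows linearly
             // with the number of pixels the CPU has to fill.
             const long classic = 320L * 200;    //    64,000 pixels (vanilla Doom)
             const long fullHD  = 1920L * 1080;  // 2,073,600 pixels
             std::printf("1080p is %.1fx the per-frame pixel work of 320x200\n",
                         (double)fullHD / classic);  // prints ~32.4x
             return 0;
         }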
  4. Wow, that's a strong load of bullshit. You are running into one very specific performance case with your port that hints at an architectural issue, and from that you generalize that 64 bit is just useless hype? The truth is, you cannot fight innovation and technical evolution. The advantages 32 bit has in some very isolated use cases are very minor compared to being able to use more memory.

     Yes, everybody is using it. Why should "everybody" stick to outdated technologies that fell out of favor more than 10 years ago? By the same reasoning, should we still use Windows 95? XP was a "forced upgrade", and so were Windows 7, 10 and now 11. That's your problem. What you need to realize is that most people do not think like that, so whatever seems right to you makes working with your port a genuine hassle.

     Let's make this clear: You are doing great work on k8Vavoom, but this constant "my way or the highway" attitude you show here is actually hurting your project and will do long term damage to it if it does not change. Most of your users do not use the hardware or software you prefer - they use more modern things - so you will inevitably be developing into a dead end if you continue on that path. Which would be a shame. The inevitable outcome is that eventually you won't be able to compile more recent source code at all, if it embraces more modern C++ standards.

     That's all very much irrelevant. All current operating systems are 64 bit: macOS dropped all 32 bit support some time ago, on Linux it increasingly becomes a hassle as the needed libraries are no longer included, and with Windows 11 the final part of the 32 bit phase-out has started. Mobile platforms have already ditched it entirely. So even if you still have platform support now, it is only a question of time until that gets reduced to token support for running older software through an emulation layer, with some performance impact.

     Here's where your error in thinking lies: People do not "jump blindly onto a bandwagon". It's just that most software tends to standardize around up-to-date hardware. Current PCs are equipped with 8, 16 or 32 GB of RAM, not with 4, and OSs want *single applications* to be able to access that RAM if needed. They also need address space for other things, like memory mapped files, CPU visible GPU memory and so on. A 32 bit system is poorly equipped to do any of this. Especially with memory mapped file access, the 2/3 GB barrier can very quickly become a crippling factor. Been there, done that, no fun working around the limitations. With 64 bit you can just map the file and be done - see the sketch after this post.

     Of course the CPU performance issues cannot be discounted either. It is not just the increased number of registers, but also being able to do 64 bit arithmetic with far, far less overhead.

     But in the end it comes down to the OS developers. What should they do? Perform endless double checking that everything they do works on both 32 and 64 bit, or initiate an ordered transition to the more modern and more future proof standard? They clearly prefer a single, universal standard, not two. And when installing an OS on a PC to be sold, there is no realistic way to offer two options - vendors are virtually forced to install the modern, more forward-looking one.
     So, ultimately the same happened as in the early '90s: When 32 bit CPUs gained market dominance back then, they quickly pushed all 16 bit stuff out of the market, with 16 bit merely becoming a fallback option in the OS that over time also disappeared. The same happened with 32 to 64 bit, though quite unsurprisingly it took a while longer, because 32 bit was still "good enough" for many tasks. But that couldn't stop the gradual erosion of its market and will eventually lead to its demise - and it doesn't care one bit whether you like it or not.
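     A minimal sketch of the "just map the file and be done" point above (POSIX calls; the file name and size are made up for illustration, error handling trimmed):

         #include <fcntl.h>
         #include <sys/mman.h>
         #include <sys/stat.h>
         #include <unistd.h>
         #include <cstdio>

         int main() {
             // Hypothetical multi-GB asset file - far more than the 2/3 GB of
             // user address space a 32 bit process could map in one piece.
             int fd = open("huge_assets.dat", O_RDONLY);
             if (fd < 0) { perror("open"); return 1; }

             struct stat st;
             fstat(fd, &st);

             // On 64 bit this single call is the whole job; a 32 bit build
             // would need a sliding window of small mappings instead.
             void *base = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
             if (base == MAP_FAILED) { perror("mmap"); return 1; }

             // ... read through `base` as if the file were one big array ...

             munmap(base, st.st_size);
             close(fd);
             return 0;
         }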
  5. No, not really. With all these things you must never forget that a lot of the analysis needed to do this stuff required the source to be available - and it still took over 20 years to get there, plus the game not falling into relative obscurity.

     So? It wasn't the first mod to use Dehacked to alter the monsters' and weapons' behavior.
  6. I missed this part in my last post, but well... The currently running GZDoom survey reports a mere 1% of users on a genuine 32 bit system, down from 1.5% two years ago. So yes, it is a tiny, shrinking minority. The number of 32 bit Linux users is 1 - that's one user, not one percent! Not as tiny as XP when it had to be dropped (which was at 0.15%) but still well below the threshold where I still care. This is why my focus is solely on 64 bit. The 32 bit build for 4.7.0 was merely done to give the low end users a one-time chance to use the new GLES backend.

     You know how it is: Many users of outdated technology do not realize how far behind the curve they often are! We get the same now with Windows 11. Most assumptions about incompatible systems totally overestimate their numbers.

     With 32 bit the big question will be when support in forward moving OSs starts to suffer. I don't expect it to go away any time soon, but I wouldn't discount the possibility of CPUs optimizing 64 bit performance at the cost of lower 32 bit performance. Just imagine something like Intel's upcoming CPUs with performance cores and efficiency cores, but with only the efficiency cores still supporting 32 bit, so that the performance cores can focus on modern software written for them and be slimmed down a bit. In a way such a thing would make sense, considering that most 32 bit software is old and can make do with lower performance.
  7. Let's not delude ourselves: What keeps Doom going as strongly as it does is that over the years there have always been ports that made it possible to play the game on modern systems at native screen resolutions. That is crucial for attracting new users. Without it, the user base would have steadily shrunk to the hard core of fans, who might have kept the game alive at a far lower level for several more years. But even among those, the time to move on eventually comes, gradually thinning out the community. So we'd have a game playable at 320x200 (ignoring Doom95, which was bad even when it was new) in DOS emulators. There'd be no Brutal Doom to lure in younger people - all you'd have is a comparatively simple game, both technically and visually, existing in some tiny niche somewhere on the internet where the general public does not even look. Of course, one big factor here is whether it would still have had the momentum to spawn Doom 3. If yes, things might have been different; if not, it'd probably be dead by now.
  8. The issue of floats vs. doubles is not really value range but precision. A float has less fractional precision than a 16.16 fixed point number, and this can indeed cause occasional problems. The main reason why GZDoom used doubles is something else, though. At the time the conversion was done we still thought that x87 code mattered, and x87 has no sane implementation for multi-precision value types, so to ensure consistent behavior the compilers have to write out/read back the numbers constantly. Single precision floats on x87 with VS's precise mode are a heavy performance drag that far eclipsed any effect of the CPU cache. With SSE2 math this entire consideration is mostly irrelevant - but when I changed all doubles in AActor and sector_t to floats shortly afterward, the performance difference was not measurable, so I left it at doubles for the better precision.

     Well, in GZDoom the 32 bit VM interpreter is considerably slower than the 64 bit version, because it has to do a lot more register spilling to the stack. But it does not use any such pointer tricks at all; the compiler tries very hard to resolve as many lookups as possible at compile time (especially class descriptors and sound indices) and outputs byte code that's very CPU-like and low level. But even so, have you actually profiled this case or just made an educated guess from your use case? I'm asking because I never experienced such a thing - in all cases I checked, the lack of registers in 32 bit mode easily made a bigger difference.
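     To put a number on that float vs. 16.16 comparison (a quick standalone illustration, not GZDoom code):

         #include <cmath>
         #include <cstdio>

         int main() {
             // 16.16 fixed point always resolves 1/65536 of a map unit,
             // no matter how large the coordinate gets.
             const double fixedStep = 1.0 / 65536.0;             // ~0.0000153

             // A float's 24 bit significand means its step size grows with
             // the value. At a typical Doom coordinate of 4096 units:
             float x = 4096.0f;
             double floatStep = std::nextafter(x, 8192.0f) - x;  // ~0.000488

             std::printf("float step at 4096 is %.0fx coarser than 16.16\n",
                         floatStep / fixedStep);                 // prints 32x
             return 0;
         }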
  9. The impact of doubled pointer sizes is mostly irrelevant. All code in GZDoom I profiled is faster in 64 bit than in 32 bit, with the most significant difference in the software renderer's drawers. The added registers allow significantly better optimization of code, and that does improve performance. You are simply getting hung up on a non-issue here. For me, in GZDoom it didn't even matter whether I used floats or doubles for all values in the main structs - the performance difference was zero. AActor alone has 64 floating point member values; making those floats would shrink the struct by roughly a fifth.
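     The back-of-the-envelope behind that last claim (the member count is from the post above; the structs here are stand-ins, not the real AActor):

         #include <cstdio>

         // Stand-ins with 64 floating point members, as described above.
         struct WithDoubles { double v[64]; };  // 64 * 8 = 512 bytes
         struct WithFloats  { float  v[64]; };  // 64 * 4 = 256 bytes

         int main() {
             std::printf("saved: %zu bytes\n",
                         sizeof(WithDoubles) - sizeof(WithFloats));  // 256
             // If a 256 byte saving is ~1/5th of the struct, the full
             // AActor works out to roughly 1.2 KB.
             return 0;
         }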
  10. Yes, you will, but you also get the benefit of more thorough code checks. That's where automated CI comes in to save you from the trouble. Of course it is. 32 bit is on the way out as a development platform; 64 bit is the future. And especially on x86 the performance hit of having fewer registers is very noticeable in most performance sensitive code.
  11. So is "32 bit x86 Linux" as the only officially supported platform. Which means that "support" is restricted to obsolete platforms. :P This is precisely the platform all computing is moving *away* from! In all seriousness, I can understand why doing this properly can be cumbersome and annoying, but in the end it is clearly worth it: the 3 "big" compilers vary sufficiently in their emitted diagnostics that the combination of all 3 is far more likely to catch a programming mistake than relying on a single one, and on an outdated architecture at that. Also, these days there are more than enough options to get CI virtually for free, with either GitHub's integrated workflows or Travis/AppVeyor, so in the end it's a one-time investment that clearly pays off.
  12. That sentiment may go both ways. A potential contributor picks the project they work with by how easy the maintainer is to work with and whether they can be trusted not to screw things up for the contribution. That by definition means that cross-platform awareness is part of the package in most cases.

     Well, tests are mostly snake oil anyway. Especially with complex programs like a game engine the space of possible inputs is endless, so how do you set up these tests? I don't think much more than "run some demos, see if they desync" can be done - and that's only an option for conservative ports. At my day job there's a heavy reliance on both automated unit and regression testing, but the outcome is always the same: All tests pass, because all they test is existing code and features - anything new first requires new tests to be set up, and once they pass they rarely break again. A lot of work is invested here - the result is rather meager, though.
  13. Of course not, because you make them run away right before they might consider it. It's all fine and well that you prefer to test on your own system, but the ultimate question is what you want to achieve. I can tell you outright that most Windows or Mac developers will take a hike if they have to put up with project management that shows no interest in wide compiler/platform support. Your hosting choice also doesn't attract potential contributors - if you want help you have to host where the people are, and that is GitHub and GitLab, but I think we had this discussion before.
  14. The best thing to avoid this is to set up CI builds for the platforms you won't test yourself. Then you get immediate feedback when something goes wrong and can rest assured it will compile on all supported platforms. We do not really need another port like Doom Legacy, which went GCC-only a long time ago and dug itself ever deeper into that hole, so that by now it is virtually impossible to compile with anything else.

     This is something that has had me scratching my head repeatedly. For any normal person "sane compiler mode" should be the default choice - and signed int overflow in particular is so frequently exploited by optimizing compilers on x86/ARM that going a different route seems very foolhardy. But seemingly those compiler developers often live in their own world, where minuscule performance gains are more important than robust code.

     For OpenAL you should use OpenAL Soft on macOS - that one works as one would expect. Their system OpenAL is an utterly worthless piece of trash.
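     The classic illustration of optimizers exploiting signed overflow (a standalone snippet, not from any port):

         #include <cstdio>

         // Because signed overflow is undefined, an optimizer may assume
         // x + 1 never wraps and fold this function to `return true`.
         bool alwaysGreater(int x) {
             return x + 1 > x;
         }

         int main() {
             // With wrapping arithmetic this would be false at INT_MAX;
             // an optimized build will typically still print 1.
             std::printf("%d\n", alwaysGreater(2147483647));
             return 0;
         }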
  15. Too bad for you, but to be blunt, your entire argument fails to consider what a vaccine actually is. No, it's not some chemical substance where prolonged exposure can have poisonous side effects. Yes, there are a few cases where there's a negative reaction to it, but in the grander scheme of things that is minor compared to the number of deaths COVID causes. But since we're talking about freedom of opinion, let me tell you that in my opinion it's people like you who drag out the entire pandemic for far longer than necessary, so forgive me for not feeling sympathetic to your cause. The harm you indirectly cause is far, far greater than the vague fears you voiced. Last but not least, I am convinced that the medication I get for my tennis elbow will most likely do more harm than those vaccination shots I got - I'll take it anyway because without it I'd be in constant pain.
  16. The big problem here, of course, is, that some elements on the right wing side persistently ignore all scientific advice and instead babble a lot of dangerous bullshit about COVID. While in most countries these are fringe parties that do not have a good standing with the vast majority of the population, in the USA they are part of one of the main parties which makes the entire matter highly alarming.
  17. Don't bet on it. A video site requires a lot of bandwidth and therefore high end server capacity, which may be a bit unfeasible. For those overly aggressive steps, thank the RIAA and their goons in governments all over the world. YouTube is surely not doing it on their own.
  18. The proper place is here: https://forum.zdoom.org/viewforum.php?f=363 How does that place look in the full GL backend for you? And what are your hardware specs?
  19. There's not only the distro divide but also those Linux users whose unbroken belief in "the system" makes them refuse to use such repos. So while it is definitely an improvement, it still doesn't fully shield us from the problems.
  20. The mere fact that the problem existed in the first place is enough to highlight the issue. Yes, GZDoom was written against v1.0. Yes, someone added support for 2.x because some Linux distros dropped 1.0. Then some detail changed in 2.x that broke the music player. That version apparently ended up in a package repo and chaos ensued. Whether it got fixed in GZDoom or updated in the package repo is not relevant - the only reason this caused problems is that instead of shipping a complete set of binaries that were proven to work, we depended on the package management and ended up with a non-functional combination.
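     About the only defense an application has when the distro swaps the library underneath it is a runtime version check along these lines (a sketch - fluid_version() is the real FluidSynth call, but the handling policy around it is made up):

         #include <fluidsynth.h>
         #include <cstdio>

         int main() {
             // Ask the runtime-linked FluidSynth what it actually is; it may
             // not be the version the binary was built and tested against.
             int major, minor, micro;
             fluid_version(&major, &minor, &micro);

             if (major != 2) {  // illustrative policy: we only tested 2.x
                 std::fprintf(stderr,
                     "untested FluidSynth %d.%d.%d - MIDI playback may misbehave\n",
                     major, minor, micro);
             }
             return 0;
         }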
  21. The fallacy here is to think that every single user of a library can deal with the update, and that each update is rigorously tested for side effects. And even then there's no guarantee, because a consumer of the library may inadvertently depend on what just got 'fixed'. Not even Windows and macOS get every update right, and those are developed by professional teams as homogeneous products. Now consider the ramifications in an ecosystem consisting of hundreds of libraries developed by hundreds of independent teams. I could never properly test my app on something this volatile!

     What this does in reality is create an untested package. Most software is not written in a purely logical fashion; it contains bugs and unintended side effects, and every swapped library can bring the house down. And here's the real issue for a developer: If stuff breaks, it won't be the library developers who get the brunt of the complaints but the developer of the application - most likely the one person who cannot do anything about the problem!

     GZDoom ran into this recently with Ubuntu shipping a broken version of Fluidsynth. We are in no position to fix the problem, because the only way around it would be to include our own copy - which is, guess what, not a well supported scenario, made worse by a nightmare chain of dependencies. The Flatpak version some guy created works fine, by the way. The reason should be obvious...

     So your lofty ideal of always-patched software ultimately backfires, resulting in unstable apps, frustrated developers, and as a worst case scenario no app at all, because its developers call it quits.
  22. Here's the irony: I know lots of people who consider smartphones useful - I even know people who consider smart TVs useful - but I've yet to find a single person who considers the smart home useful. The entire concept seems to be pushed by an industry trying to create a market where no demand exists.
  23. That'd be the kind of people for whom everything is black and white. And since the package management is great, anything that undermines it is bad; therefore self-contained apps are an Evil to avoid. There's also this strange belief that whenever some library receives an update, it's a great feature that all apps automatically benefit from it. In reality it's just one of those places where lofty ideals clash with the darker side of software development - a gray area that gets persistently ignored.
  24. No, but those in need of Linux access may do so for convenience. The reason behind WSL was, after all, to give Microsoft's own employees easy access to Linux without leaving their comfortable UI behind.
  25. Android only succeeded because the competition was stupid. Nokia could have done something, but they stuck to an outdated form factor and a virtually unprogrammable operating system for too long - and once they got back on track, they got themselves a CEO who had nothing better to do than sell out the company. Microsoft also could have done something - but they first screwed up the software side by making their OS (WP7) 100% incompatible with everything else (i.e. C# only, no way to get C libraries working), then didn't listen to any feedback and instead doubled down on a catastrophically bad UI. BlackBerry took far too long to get something competitive; their original devices were far too slow and too hard to program, so they lost their entire market share before the successor was ready. And while the original Android was shit in technical terms, it at least had a decent UI and (limited) native programming capabilities. So even with its insane Java API it was still the best choice.