Graf Zahl

Members
  • Posts: 12973
  • Joined
  • Last visited
Everything posted by Graf Zahl

  1. Sorry, Lila Feuer, but that's classic armchair advice you were giving here. Where I work there's one guy whose sole responsibility is to think up new test cases and run our software through all sorts of ludicrous scenarios. But he does nothing else the entire day, 5 days a week, 52 weeks a year minus vacation. For something that's ultimately a hobby project on a limited time budget with too few people working on it, that's totally impossible, even on a smaller scale. We inevitably have to depend on users to report when they see something wrong.
  2. What can you do if nobody reports it for all that time? I never noticed because I don't like playing the Cleric.
  3. RNG tables were also changed in PrBoom. Funny that it is constantly overlooked that this change dates back to Boom.
  4. What I'd like to see more than a Q1 port would be a Q2 port that takes an active effort to convert the game DLLs into a scripted language so that the game can live on without being inconvenienced by a binary extension interface.
  5. Binary patches of executables always come with a risk, though, and that's not limited to exploits. Just have a look at Quake 2. Thanks to its DLL interface, any port switching to an architecture other than 32-bit Windows is left in the cold with many of those old user mods that shipped their own game DLL but didn't bother including the source.
  6. If it didn't "change any of Doom's internals" it would not work. The entire purpose is to do that! And in this regard it is completely irrelevant whether you apply it statically to the .EXE file or through an exploit at run time. Seriously, the only new thing here is the delivery method. Patching an executable to do things it was never made for is an ancient concept - remember Entryway's limit-expanding modified EXE?
  7. It's more a question of feasibility. The way this thing works, you need to create machine code that's hardwired to the DOS EXE's intricacies. Trying to execute that code in a different context would be a gargantuan effort that'd only be worth it if there was a genuine gain - and a handful of projects isn't really it. My guess is that the scope of these changes will be on a scale where it is easier to reverse engineer them to source code (unless provided directly, of course) and integrate that code directly into the port, or script it in a port capable of that, instead of trying to run the actual machine code.
  8. Obviously any port trying to support things like this would have to check for the overflow condition and deal with it on its own terms instead of letting it run amok on the system. But the real problem lies elsewhere. It would only work on 32-bit x86, and that's not really a future-proof platform. So to support it on other architectures you need an emulator - and not just an emulator, but also a code checker that blocks malicious attempts to access functions and system calls that can cause real damage. It also needs total knowledge of where functions start, especially in cases where jumps into the middle of a function are performed. The amount of work needed can quickly exceed any reasonable effort. So sorry for anyone dreaming of getting this to work with more modern ports: it won't happen.
  9. No, the Firestorm was also fixed recently; its parameters for radius damage were not correct.
  10. I'm sorry to tell you, but you must really be imagining things here. I've run some tests on both GZDoom's and Boom's RNG (the same as in its child ports) in the past, and while the actual sequences are not the same, the resulting distributions are very similar. Both are a lot more random than Doom's original one. I think what you really feel here is that GZDoom does not use a single RNG but hundreds - a lot more than even Boom - and as a result you get poor sequences for some events. If you really want, I can do a test build for you switching back to Boom's RNG, which ZDoom used in the beginning, and mapping all calls to a single one for SP games. It might be interesting to hear your impressions.
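The kind of distribution test described above can be sketched as follows: tally how often each byte value 0..255 turns up in a long run of each generator and compare the spread. A table-driven RNG like Doom's original P_Random just replays its 256 entries, so any imbalance in the table repeats forever, while a long-period generator evens out. Both sequences below are illustrative stand-ins, not the actual engine tables or Boom's generator.

```python
# Sketch of a frequency-spread comparison between a table-driven RNG
# and a long-period one. The "table" and "stream" here are stand-ins.
from collections import Counter
import random

def spread(samples):
    """Ratio of most-common to least-common byte value (1.0 = perfectly flat)."""
    counts = Counter(samples)
    freqs = [counts.get(v, 0) for v in range(256)]
    return max(freqs) / max(1, min(freqs))

random.seed(1)  # deterministic for the example

# Stand-in for a table-driven RNG: 256 fixed entries, replayed over and over.
table = [random.randrange(256) for _ in range(256)]
table_seq = table * 256                    # 65536 samples, same 256 values

# Stand-in for a proper long-period generator.
stream_seq = [random.randrange(256) for _ in range(65536)]

# The table sequence can never produce values missing from its 256 entries,
# so its frequency spread is far worse than the long-period stream's.
print(spread(table_seq), spread(stream_seq))
```

The same tally, run over the ports' actual generators, is essentially what the "resulting distribution is very similar" claim above refers to.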
  11. Here's the right place for making requests on those "lacking" aspects: https://forum.zdoom.org/viewforum.php?f=15 ;)
  12. Although, truth be told, this had nothing to do with Build, but with a developer team that obviously - judging from a famous message - had problems getting along with the provider of their engine.
  13. If you look at the entire Blood source you will see this attitude everywhere - the map format is different (hopelessly convoluted with bit fields for properties, and encrypted on top of that), the palette management does not use the standard features, they overrode several engine functions with their own versions, courtesy of the linker prioritizing their own content, etc. Although with the file system the actual reason was that GRP was added to the engine relatively late, and since the Blood team had already set themselves up to use their stupid resource indices (another thing that makes it very hard to replace stuff) they just circumvented the GRP code, which admittedly was a piece of shit all of its own.
  14. E1. The rest strongly suffers from the limited set of textures the game has.
  15. The Microsoft model depends on local play; it's not just an interactive video stream. As such, it's a business model I'd find more interesting than renting DRM-crippled games at full price. At least it is honest about its intentions. Too bad then that the speeds that need increasing can't be. The problem is not a question of bandwidth - it's a question of how long a single bit from your system takes to reach the server, and as I pointed out, that's subject to physical limitations: the further away the server farm is from its customers, the longer the bits take to travel back and forth and the more lag the customer will experience.
  16. Those people who, according to the industry, "have no need for a real computer". The story probably goes like: "Hey, in order to play new games you constantly have to buy upgrades for your home computer. Why not come to us? All you need is some cheap hardware that's capable of decoding a video stream." "If there is no market, we will create one." Sometimes there's a genuine Cloud-Cuckoo-Land mentality around there; it's very clear that these IT guys often have no idea how real people tick. I think a big part of the master plan is to take control of as much as possible before our dear politicians finally wake up and try to combat the monster they had allowed to grow. But seriously, there's definitely a market for gaming subscriptions. The problem is that streaming is not the solution, because it simply cannot work due to the technical limitations.
  17. Heh, yes, and it was inevitable that it totally sidestepped the one problem they'll never be able to overcome. Can't beat the hard limits of universal physics. What they totally forget is that unlike movies, gaming is a two-way affair - data goes in and out. With movies it doesn't matter if the data travels for a second or two, what is important is that you get an uninterrupted stream. But with games it is of the utmost importance that no time at all is lost for data transmission which is a physical impossibility. Yup, most definitely.
  18. Typical hogwash. Like nearly everybody invested in this, they ignore the fundamental technical issues. The only really interesting thing in that article was the statement about Apple, who again abuse their power to block competitors; the rest was all the same type of pipe dream that seems to run rampant among these internet types who have no clue about the inherent limitations. The input lag problem is not easy to overcome: even with dedicated multiplayer clients of today you need movement prediction and other kludges to reduce its impact, i.e. you need a local client that knows the game it plays. But with these services the client is merely a dumb terminal receiving a video stream. As a result the lag will hit with full force, and with long-distance connections being a necessity here it should be obvious how this will play out. These people may repeat endlessly that this can be overcome - it can't! It is bound by the speed at which information travels through the internet, which is finite (from what I read, roughly 200,000 km/s, 2/3 of the speed of light). So imagine your server is 2000 km away. That results in 1/50 s of lag (you have to factor in both ways of travel!) that's impossible to remove - for gaming even that seemingly measly time can become a problem because the game won't feel snappy anymore. And that is merely the travel time of your information: the data gets routed through multiple intermediate nodes that all add their own delays, so you'll never even match that 1/50th of a second. Not even the fastest internet connection in the world can overcome a hard physical limit! You essentially have to keep your gaming server very close to your customers to make this work. Does it now become clear why this is a fool's proposal?
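The back-of-the-envelope calculation above can be written out explicitly. It assumes the roughly 200,000 km/s propagation speed quoted in the post (about 2/3 the speed of light); real routes only add delay on top of this lower bound.

```python
# Minimal sketch of the latency argument: the absolute best case for a
# round trip is distance there and back at the signal's propagation speed.
PROPAGATION_KM_PER_S = 200_000  # assumed effective speed through the internet

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip latency in ms: data must travel both ways."""
    return 2 * distance_km / PROPAGATION_KM_PER_S * 1000

# A server 2000 km away: 2 * 2000 / 200000 s = 0.02 s = 20 ms (1/50 s),
# before any routing, encoding, or rendering delay is added on top.
print(min_round_trip_ms(2000))  # → 20.0
```

Routing hops, video encoding, and decoding each add their own milliseconds, which is why the real lag always exceeds this bound.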
  19. The current info is that this corruption only happens on NVidia with recent drivers, and only on Windows 10. But so far no pattern has been recognized that might trigger it.
  20. Blood's dynamite gives an incredibly satisfying feeling. Nothing comes really close.
  21. The simple answer for "why this lower cap of 64" is "to prevent users from accidentally crippling their config and sending false bug reports."
  22. It always depends on what the "service" is. For example, most viewers only watch movies or TV series once, so it makes perfect sense for them to subscribe to a streaming service; it would be cheaper than buying these things on DVD or BluRay. The same is also true for many gamers; they'd be more than happy if they could just pay a monthly fee and play some games. The issue is that unlike those services, where only the data gets streamed, playing games online requires quite intricate interaction with the server, and that's where the whole thing will fail. Of course, trying to sell games hosted on such platforms is nothing short of fraud. Either I own something, in which case I have control of it, or I don't, and then it's not a sale but a rental - and I won't pay full price for that, which is why I no longer buy new PC games. This online gaming thing is the end goal of what started with online DRM; it was inevitable that we'd get here. It's just good that they'll never be able to remove the lag altogether - they may reduce it but that's really it.
  23. You should not copy two games into the same folder, unless it's all complete GRP files. Blood is not like that - it is all loose files, which Duke would load if it got started from the same directory. Welcome to Build, indeed!
  24. A short explanation of what's up with the menus. I have changed the texture system to be more like GZDoom's, so that when I get to defining a new map format I can use named textures instead of these unwieldy tile numbers. The 2D code already works on that texture system, but the hires replacements are still attached to tile indices, so the 2D code cannot find them. It's a somewhat larger change and I deemed it more important to first get feedback on the new version. My biggest problem right now is the corruption: it only appears to happen on Windows 10 with the latest drivers, and I have no access to a system where I could test it. My main working machine runs Windows 8, which even with the latest driver does not show this corruption, and the only Windows 10 machines I could use run on either an Intel or AMD integrated chipset, where the bug does not happen.
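The named-texture idea described above can be sketched roughly like this: the texture manager keeps a name-to-texture map for the new map format, plus a compatibility table that translates legacy tile indices into names. All class and texture names here are hypothetical illustrations, not the actual port's code.

```python
# Hedged sketch: named texture lookup with a legacy tile-index shim.
class Texture:
    def __init__(self, name: str, width: int, height: int):
        self.name, self.width, self.height = name, width, height

class TextureManager:
    def __init__(self):
        self.by_name = {}          # named lookup for the new map format
        self.tile_to_name = {}     # legacy tile index -> texture name

    def register(self, tex: Texture, tile_index=None):
        self.by_name[tex.name] = tex
        if tile_index is not None:
            self.tile_to_name[tile_index] = tex.name

    def get(self, key):
        """Accept either a name (new code) or a tile index (legacy code)."""
        if isinstance(key, int):
            key = self.tile_to_name.get(key)
        return self.by_name.get(key)

tm = TextureManager()
tm.register(Texture("CITYWALL01", 64, 128), tile_index=1234)   # hypothetical names
assert tm.get("CITYWALL01") is tm.get(1234)
```

The hires-replacement problem mentioned above corresponds to art still being registered only under tile indices, so lookups by name come back empty until the shim table is filled in.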