ketmar Posted March 19, 2020

offtopic:

1 hour ago, Master O said:
You must be on really, really old hardware to still be asking for a 32-bit version...

i have 64-bit capable hardware, but i don't want to use a 64-bit OS, because i see zero reasons for it. it is not always about hardware; sometimes it is about not following the trend/fashion.
ReaperAA Posted March 19, 2020

2 minutes ago, ketmar said:
i have 64-bit capable hardware, but i don't want to use a 64-bit OS, because i see zero reasons for it. it is not always about hardware; sometimes it is about not following the trend/fashion.

But why, though (especially in this case)? I can understand someone sticking to an old OS (or any other old thing) because it has something of value (like better compatibility with old programs), or because they like how it looks or works (the way some people prefer Windows XP's looks over modern Windows). But the 32-bit and 64-bit versions of the same OS look and feel identical, and the 64-bit OS has other advantages on top.
seed Posted March 19, 2020

Besides, going x64 doesn't come at any cost, since the requirements are the same as long as the CPU is 64-bit compatible (which it likely is, unless it's coming from the stone age...). Honestly, the only advantage 32-bit might still have is compatibility with 16-bit programs, but that's about it... and if you're using reasonably powerful hardware, staying on 32-bit is suicide; the damn thing can't even use more than what, 3-4 GB of RAM?

I can understand sticking to old OSes sometimes, especially if you depend on some piece of software, but staying on 32-bit in 2020 is just ridiculous to me. And I'm really the last person to follow trends or conform in general... To each their own, I guess...
Davidow Posted March 19, 2020

7 minutes ago, seed said:
Besides, going x64 doesn't come at any cost, since the requirements are the same as long as the CPU is 64-bit compatible (which it likely is, unless it's coming from the stone age...). Honestly, the only advantage 32-bit might still have is compatibility with 16-bit programs, but that's about it... and if you're using reasonably powerful hardware, staying on 32-bit is suicide; the damn thing can't even use more than what, 3-4 GB of RAM? I can understand sticking to old OSes sometimes, especially if you depend on some piece of software, but staying on 32-bit in 2020 is just ridiculous to me. And I'm really the last person to follow trends or conform in general... To each their own, I guess...

I'm using 2010 hardware. I know it's old, but I wanted to play Blood on my computer. BloodGDX doesn't work well, DOSBox looks and plays badly, and NBlood is 64-bit only (if you have a 32-bit NBlood, please let me know), so Raze is the best solution. Going 64-bit is very hard for my hardware.
Graf Zahl Posted March 19, 2020

12 minutes ago, Davidow said:
and NBlood is 64-bit only (if you have a 32-bit NBlood, please let me know), so Raze is the best solution.

Keep in mind that Raze is based on the same code as NBlood, so whatever problems keep NBlood from being compiled for 32 bit will also apply to Raze. While the code appears to compile to a working binary, there are simply no guarantees that it will work as expected.

41 minutes ago, ketmar said:
i have 64-bit capable hardware, but i don't want to use a 64-bit OS, because i see zero reasons for it. it is not always about hardware; sometimes it is about not following the trend/fashion.

Doesn't "increasingly more software is going 64-bit only" count? Most developers who still maintain 32-bit-compatible code will stop caring once that last remaining percent of holdouts also disappears.
ketmar Posted March 19, 2020

44 minutes ago, ReaperAA said:
But the 32-bit and 64-bit versions of the same OS look and feel identical, and the 64-bit OS has other advantages on top.

what other advantages (besides taking 8 bytes per pointer instead of 4)? ;-)

39 minutes ago, seed said:
Besides, going x64 doesn't come at any cost

it is not zero-cost. first, pointers are bigger, but the CPU cache isn't. second, you now basically have software for two incompatible instruction sets, and it interoperates very badly.

39 minutes ago, seed said:
and if you're using reasonably powerful hardware, staying on 32-bit is suicide; the damn thing can't even use more than what, 3-4 GB of RAM?

i have 8GB of RAM, and my GNU/Linux has no problems using it all.

7 minutes ago, Graf Zahl said:
Doesn't "increasingly more software is going 64-bit only" count?

not for me. GNU/Linux here. if something cannot be built for 32 bits, then i would rather not use such software; it is very, very badly written.
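The "8 bytes per pointer" point is easy to make concrete. A minimal C sketch, using a hypothetical pointer-heavy node type (not taken from any real port), shows how an LP64 build doubles the pointer payload and halves how many nodes fit in a 64-byte cache line:

```c
#include <stdio.h>

/* Hypothetical linked-list node, purely for illustration. */
struct node {
    struct node *next;
    struct node *prev;
    void        *payload;
    int          key;
};

int main(void) {
    /* ILP32 (32-bit Linux): 3 * 4-byte pointers + 4-byte int = 16 bytes,
       so 4 nodes per 64-byte cache line.
       LP64 (64-bit Linux):  3 * 8-byte pointers + 4-byte int + 4 bytes
       of padding = 32 bytes, so only 2 nodes per cache line. */
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}
```

Whether this actually hurts in practice depends on how pointer-heavy the workload is, which is why both sides of this exchange have a point.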
Master O Posted March 19, 2020

1 hour ago, ketmar said:
offtopic: i have 64-bit capable hardware, but i don't want to use a 64-bit OS, because i see zero reasons for it. it is not always about hardware; sometimes it is about not following the trend/fashion.

32-bit capable hardware is slowly being sunsetted, so you're not gonna get a choice in the coming years...
Gez Posted March 19, 2020

26 minutes ago, ketmar said:
it is not zero-cost. first, pointers are bigger, but the CPU cache isn't.

You get more registers, though.
seed Posted March 19, 2020

54 minutes ago, Davidow said:
I'm using 2010 hardware. I know it's old, but I wanted to play Blood on my computer. BloodGDX doesn't work well, DOSBox looks and plays badly, and NBlood is 64-bit only (if you have a 32-bit NBlood, please let me know), so Raze is the best solution. Going 64-bit is very hard for my hardware.

What is your hardware, then? Because if it's that underpowered, it wouldn't be able to run Raze anyway. Is it even GL 3.3 compatible?
Altazimuth Posted March 19, 2020

1 minute ago, Gez said:
You get more registers, though.

SSE2 support is guaranteed, too. Even automatic compiler vectorisation can provide solid perf boosts.
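To illustrate the auto-vectorisation point: on x86-64, SSE2 is part of the baseline ABI, so a compiler such as gcc or clang at -O3 can turn a plain scalar loop into packed SSE2 instructions with no extra flags, whereas a 32-bit x86 build has to opt in with something like -msse2. A minimal sketch (the function and names are illustrative, not from any real codebase):

```c
#include <stddef.h>

/* dst[i] = a[i] * s + b[i] -- a loop shape compilers vectorise readily.
   With SSE2 available, the compiler can process 4 floats per instruction
   (mulps/addps) instead of one at a time. */
void scale_add(float *dst, const float *a, const float *b, float s, size_t n) {
    for (size_t i = 0; i < n; ++i)
        dst[i] = a[i] * s + b[i];
}
```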
Graf Zahl Posted March 19, 2020

Just now, Altazimuth said:
SSE2 support is guaranteed, too. Even automatic compiler vectorisation can provide solid perf boosts.

These days you'd even get that for 32 bit, unless you wanted to support some really, really, really old systems that nobody cares about anymore. The biggest advantage is still the larger address space. With larger graphics assets, 2 or 3 GB can fill up quickly.
Altazimuth Posted March 19, 2020

14 minutes ago, Graf Zahl said:
These days you'd even get that for 32 bit, unless you wanted to support some really, really, really old systems that nobody cares about anymore. The biggest advantage is still the larger address space. With larger graphics assets, 2 or 3 GB can fill up quickly.

32-bit CPUs and CPUs that can't do SSE2 are the same to my mind: on life support and longing for the embrace of death, or a curiosity for a hobbyist build made to run old games on Win 9x/XP.
Graf Zahl Posted March 19, 2020

You won't get any disagreement from me on this. Far worse are the people who insist on running 32-bit OSs on their 64-bit CPUs, creating a lot of needless additional work for software developers.
Davidow Posted March 19, 2020

1 hour ago, seed said:

I'm very grateful for your responses; a little sad about the situation, but the issue is on me. Also, great work, Graf, keep it up; I'm sure Raze has a good future. I'm convinced that the 32-bit port is impossible to make, but I will check out the updates on it. The fact that all of this is free is incredible.
seed Posted March 19, 2020

6 minutes ago, Davidow said:
I'm very grateful for your responses; a little sad about the situation, but the issue is on me. Also, great work, Graf, keep it up; I'm sure Raze has a good future. I'm convinced that the 32-bit port is impossible to make, but I will check out the updates on it. The fact that all of this is free is incredible.

Thanks, but you still didn't answer my question - is your system OpenGL 3.3 compatible? Because if it isn't, a 32-bit version of Raze would change absolutely nothing.
ketmar Posted March 19, 2020

1 hour ago, Gez said:
You get more registers, though.

yep. but it doesn't matter a lot. sure, you can win some benchmarks with that, but in real software it is mostly unnoticeable. not that i am strongly against 64 bits, i just see no real reason to switch. it is just a rat race: "oh, we need something new to sell people! yay, 64 bits, let's do that! it doesn't matter that most people don't need it, they will have no choice soon anyway."
Graf Zahl Posted March 19, 2020

No, it's not a rat race. The main selling point of 64 bit is to address more memory. If that hadn't been the case, it probably would have crashed and burned, because there wouldn't have been any need for it. By that line of reasoning we might still be stuck with 16-bit segmented addressing, because when 32 bit came up in the 80s it was still "good enough", too. These days, for many applications, 3 GB - or even 4 - doesn't cut it anymore, but that's the hard limit of 32-bit systems. That aside, 64-bit CPUs went mainstream what, 12 years ago? I'm sorry, but your entire reasoning here sounds a bit flawed.
ketmar Posted March 19, 2020

21 minutes ago, Graf Zahl said:
The main selling point of 64 bit is to address more memory.

which is absolutely unnecessary for the majority of tasks.

21 minutes ago, Graf Zahl said:
These days, for many applications, 3 GB - or even 4 - doesn't cut it anymore

because nobody cares anymore. that's why we have Chrome instead of a browser, for example. and that's why i'm trying to use software that i wrote myself. because i do care. because my text editor can load and highlight a 300MB text file in less than 1.5 seconds, and you will never see any slowdowns working with it. i won't even try that with vi or emacs (and electron-based crapware is completely out of the competition here). because my mail software runs for months and months without crashes, sits in 120-150 MB regardless of mailbox sizes (and i keep *a lot* of archives), and reacts to my actions so fast that it sometimes seems to act before i press a key. and so on, and so on. but of course, if people are used to bloatware, they need more than 4GB of RAM even for the simplest tasks.

right now, my system is like this: 2.49G/7.82G. this is with a browser, mail, two IM clients, an IRC client, a lot of open terminals, a mail server, an http server, a jabber server, and some smaller services. from my PoV, bloatware is the wrong reason to upgrade.

p.s.: and when i write "with a browser", i mean that it has been open for months too, and it has 150+ tabs (at least). because i don't know how to use bookmarks. ;-)
Graf Zahl Posted March 19, 2020

37 minutes ago, ketmar said:
which is absolutely unnecessary for the majority of tasks.

Like someone said a long time ago that nobody needs more than 640 KB?

38 minutes ago, ketmar said:
right now, my system is like this: 2.49G/7.82G. this is with a browser, mail, two IM clients, an IRC client, a lot of open terminals, a mail server, an http server, a jabber server, and some smaller services. from my PoV, bloatware is the wrong reason to upgrade.

I can imagine the kind of software you are using. Most people probably wouldn't want to bother with it and would rather pay a little extra for more RAM.
Master O Posted March 20, 2020

4 hours ago, Graf Zahl said:
No, it's not a rat race. The main selling point of 64 bit is to address more memory. If that hadn't been the case, it probably would have crashed and burned, because there wouldn't have been any need for it. By that line of reasoning we might still be stuck with 16-bit segmented addressing, because when 32 bit came up in the 80s it was still "good enough", too. These days, for many applications, 3 GB - or even 4 - doesn't cut it anymore, but that's the hard limit of 32-bit systems. That aside, 64-bit CPUs went mainstream what, 12 years ago? I'm sorry, but your entire reasoning here sounds a bit flawed.

Does this mean at some point we're gonna have 128-bit CPUs?
ketmar Posted March 20, 2020

7 hours ago, Graf Zahl said:
Like someone said a long time ago that nobody needs more than 640 KB?

you know that it is an urban myth, right? also, what i am saying is not even close.

7 hours ago, Graf Zahl said:
I can imagine the kind of software you are using. Most people probably wouldn't want to bother with it and would rather pay a little extra for more RAM.

it won't magically turn their bloatware into something fast and good. but it doesn't matter. what matters is the tasks i described, and the amount of RAM they're using. do you have any real arguments against my point about the majority of tasks and RAM usage? i have a working system. you have a speculation about "most people".
Graf Zahl Posted March 20, 2020

5 hours ago, Master O said:
Does this mean at some point we're gonna have 128-bit CPUs?

Who knows? I don't expect CPUs that handle 128-bit addresses, but I think at some point we will get wider registers - especially for floating-point data, where 64 bit can already become a bit tight precision-wise. We already observed precision issues with UDMF node building.

1 hour ago, ketmar said:
i have a working system. you have a speculation about "most people".

I can't stop you from doing it your way, but then be prepared for increasing problems with getting software updated. It's just like the migration from 16 to 32 bit - at some point the modern standard will be taken for granted and you'll be forced to upgrade anyway.
axdoomer Posted March 20, 2020

12 hours ago, ketmar said:
i have 64-bit capable hardware, but i don't want to use a 64-bit OS, because i see zero reasons for it. it is not always about hardware; sometimes it is about not following the trend/fashion.

Well, you should follow the trend. Windows 7 came out in 2009 and most of its users were already on the 64-bit version; 64-bit systems are where people have been moving for more than 15 years. This means that, today, developers spend less time improving the 32-bit versions of their software. It also means you get fewer security features, because everyone focuses on 64-bit systems. For example, ASLR on 32-bit systems has less entropy and suffers from more memory fragmentation. The 32-bit version of software like Chrome will have more bugs, worse performance and fewer security mitigations than its 64-bit counterpart, and you can expect the same for Firefox and other browsers. Linux Mint and Ubuntu will even be dropping 32-bit support soon, because building these versions for what is today "low-end hardware" is becoming increasingly not worth the trouble. If you look at benchmarks, you'll notice that 64-bit Ubuntu has superior performance compared to its 32-bit version in real-world tasks. Yes, that benchmark dates from 2013, but you can expect that most of the performance improvements made since then were made for people running 64-bit. 64-bit compilers have become more mature since then too, and programs have gotten better at using all the new instructions and features that 64-bit CPUs provide and that 32-bit programs can't take advantage of.

9 hours ago, ketmar said:
right now, my system is like this: 2.49G/7.82G.

If your computer knows there is 8GB of RAM, it means you are running a PAE-enabled kernel. You get a performance penalty of 0-10% just from the overhead of running PAE. You're probably losing all the performance gains you think you have simply because you run a PAE-enabled kernel, so at this point it would be better to install a 64-bit OS rather than a 32-bit OS with PAE.

6 hours ago, Master O said:
Does this mean at some point we're gonna have 128-bit CPUs?

1971: Intel 4004 (first 4-bit CPU)
1972: Intel 8008 (first 8-bit CPU)
1976: TI TMS 9900 (first 16-bit CPU)
1985: Intel 80386 (first 32-bit CPU)
1991: MIPS R4000 (first 64-bit CPU)

64-bit CPUs can address up to 16,000,000 TB of memory. The quantity of RAM the CPU can address grows exponentially every time more bits are added, but the average quantity of RAM people install in their computers grows far more slowly; by Moore's law it roughly doubles every 18 months. One TB of RAM is still very costly (it would probably cost over $5000 USD, and since that much RAM would most likely go in a server, it would be ECC memory, which costs even more), so you can imagine how far we are from having demand for 128-bit CPUs.
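For the curious, the arithmetic behind those limits is quick to verify. A small C sketch (figures only, nothing measured):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* 32-bit pointers name 2^32 bytes = 4 GiB -- the 3-4 GB ceiling
       discussed above (usable space is lower after the kernel split). */
    uint64_t space32 = 1ULL << 32;
    printf("32-bit: %llu bytes = %llu GiB\n",
           (unsigned long long)space32,
           (unsigned long long)(space32 >> 30));

    /* A flat 64-bit pointer names 2^64 bytes = 16 EiB = 16,777,216 TiB,
       i.e. roughly the "16,000,000 TB" figure above. (Real x86-64 chips
       implement fewer physical/virtual address bits than 64.) */
    long double space64 = 18446744073709551616.0L; /* 2^64 */
    printf("64-bit: %.0Lf TiB\n", space64 / (1024.0L * 1024 * 1024 * 1024));
    return 0;
}
```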
ketmar Posted March 20, 2020

33 minutes ago, Graf Zahl said:
at some point the modern standard will be taken for granted and you'll be forced to upgrade anyway.

why? will some magical being magically destroy all the tools and sources i am using? or will i be unable to get a 32-bit-capable CPU for a dime (the way i can get old pentiums now, for example)? my software, on my i3 and 32 bits, works much faster than "modern" software on the newest CPUs with 16+GB of RAM, and the next generations of software will be even worse. of course, my software can be compiled for 64 bits (or 128, or 1024, or any other bitness not lower than 32), but i see no reason to do that. at least half of my system is built from customised sources, and it works faster than "modern systems" and needs less RAM. it has been like that for 15 years, and i am pretty sure i have at least 15 more.

42 minutes ago, Graf Zahl said:
64 bit can already become a bit tight precision-wise. We already observed precision issues with UDMF node building.

WUT?! if doubles are not enough for UDMF nodes... you have BIG problems in your code.
ketmar Posted March 20, 2020

7 minutes ago, axdoomer said:
Windows

i couldn't care less. ASLR is snake oil. and i don't care about ubuntu.

7 minutes ago, axdoomer said:
You get a performance penalty of 0-10% just from the overhead of running PAE.

easy to check: let's look at the CPU load stats: ~8% on both cores. i am so far from the performance limits that i'd need special tools to see them. like useless "benchmarks".
Graf Zahl Posted March 20, 2020

1 hour ago, ketmar said:
WUT?! if doubles are not enough for UDMF nodes... you have BIG problems in your code.

Not really. A double only has 52 explicit bits of mantissa, but Doom's vertices are 16.16 fixed point, and multiplying two such values requires 64 bits to store the result losslessly. Those missing 12 bits can be enough to falsify the result.

1 hour ago, ketmar said:
ASLR is snake oil.

I cannot say it surprises me that you are saying such a thing. Of course, with your system you get security by obscurity by default, because nobody runs such weird setups.

1 hour ago, ketmar said:
easy to check: let's look at the CPU load stats: ~8% on both cores. i am so far from the performance limits that i'd need special tools to see them. like useless "benchmarks".

Strange to see something like this coming from a source port developer. There can never be enough CPU power when running games... ;)

1 hour ago, ketmar said:
it has been like that for 15 years, and i am pretty sure i have at least 15 more.

I somehow doubt that in 15 years you'll be able to run 32-bit software at all. It took 7 years from the first 32-bit CPUs appearing to them entering the mainstream, and 4 or 5 more years for them to completely eliminate 16-bit systems and OSs. It took 5 years from the first commercially viable consumer 64-bit CPUs to them becoming mainstream, and 10 years to nearly eliminate 32-bit systems and OSs. And just like 16-bit software started dying off once the 16-bit platforms left the market, the same will happen with 32 bit. In 15 years nobody will talk about it anymore, save for the backwards compatibility mode in Windows. So you'll most likely have to run an ancient OS on ancient hardware if you want to stick with 32 bit.
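The precision point is easy to demonstrate. A minimal C sketch with arbitrary made-up coordinates: two 16.16 fixed-point values are multiplied exactly in 64-bit integer arithmetic and again via double. The exact product here needs about 62 significant bits, while a double keeps only 53 (52 stored plus the implicit leading bit), so the low bits are rounded away:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Two 16.16 fixed-point coordinates (values are arbitrary). */
    int32_t a = (30000 << 16) | 0xABCD;   /* ~30000.67 */
    int32_t b = (29999 << 16) | 0x1235;   /* ~29999.07 */

    int64_t exact  = (int64_t)a * (int64_t)b;   /* lossless 64-bit product */
    double  approx = (double)a * (double)b;     /* rounded to 53 significant bits */

    printf("exact : %lld\n", (long long)exact);
    printf("double: %.0f\n", approx);
    printf("error : %lld\n", (long long)(exact - (int64_t)approx));
    return 0;
}
```

In a node builder, products like this feed line-side tests, so even a small rounding error can put a vertex on the wrong side of a partition line.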
ketmar Posted March 20, 2020

1 hour ago, Graf Zahl said:
Not really. A double only has 52 explicit bits of mantissa, but Doom's vertices are 16.16 fixed point, and multiplying two such values requires 64 bits to store the result losslessly. Those missing 12 bits can be enough to falsify the result.

there are actually at least two problems here. first, Doom maps don't need such precision for map vertices (and even for split vertices it is... questionable). second, even if you do need it, you can still do it with doubles and some trickery (floating-point math is HARD!). and there is a third: in 32-bit mode we still have the FPU with internal 80-bit floats. now let's talk about 64-bit superiority! ;-)

1 hour ago, Graf Zahl said:
Strange to see something like this coming from a source port developer. There can never be enough CPU power when running games... ;)

...when running Doom. ;-) sadly, it is absolutely impossible to run thinkers in parallel. and even then, PAE/registers aren't the main problem: even if you compile the VM to machine code, you'll still get a lot of cache misses due to very random mobj/entity allocations and access patterns (and SIMD-hostile data structures). bitness and register pressure are negligible here. (disclaimer: i don't have profiling data at hand to support my claims.)

1 hour ago, Graf Zahl said:
In 15 years nobody will talk about it anymore, save for the backwards compatibility mode in Windows. So you'll most likely have to run an ancient OS on ancient hardware if you want to stick with 32 bit.

i see zero problems with that. software is not something that becomes worse with time; even after 15 years it is as good as it was at release. meh, it is even safer, because nobody writes malicious exploits for such ancient OSes! ;-) i'm still using some software compiled 10+ years ago, and it Just Works.

p.s.: one day i'll write my own nodes builder. it is just not a high-priority task yet. ;-)
seed Posted March 20, 2020

6 minutes ago, ketmar said:
i see zero problems with that. software is not something that becomes worse with time; even after 15 years it is as good as it was at release. meh, it is even safer, because nobody writes malicious exploits for such ancient OSes! ;-) i'm still using some software compiled 10+ years ago, and it Just Works.

Uh... no, that's very incorrect. Once an OS reaches EOL, for instance, if it still has enough market share, you can bet people out there will exploit its vulnerabilities to death. Sticking to ancient software is never a good thing - well, okay, unless the new versions are horribly broken or no longer offer something you need.
Graf Zahl Posted March 20, 2020

12 minutes ago, ketmar said:
there are actually at least two problems here. first, Doom maps don't need such precision for map vertices (and even for split vertices it is... questionable). second, even if you do need it, you can still do it with doubles and some trickery (floating-point math is HARD!). and there is a third: in 32-bit mode we still have the FPU with internal 80-bit floats. now let's talk about 64-bit superiority! ;-)

UDMF has sub-integer precision for vertices, and that's where the problems come from. And if someone is in dire need of 80 bit, the old FPU instructions still exist - but even for 32 bit, most C compilers cannot emit code for them.

12 minutes ago, ketmar said:
...when running Doom. ;-) sadly, it is absolutely impossible to run thinkers in parallel. and even then, PAE/registers aren't the main problem: even if you compile the VM to machine code, you'll still get a lot of cache misses due to very random mobj/entity allocations and access patterns (and SIMD-hostile data structures). bitness and register pressure are negligible here. (disclaimer: i don't have profiling data at hand to support my claims.)

Cache misses are definitely the biggest problem outside of a software renderer. That's why I cannot squeeze any more performance out of GZDoom: each time I investigate a bottleneck, it's the cache. It's also the main reason why the scripting VM has far less impact than one would expect.

12 minutes ago, ketmar said:
i see zero problems with that. software is not something that becomes worse with time; even after 15 years it is as good as it was at release.

Good luck trying to view web sites with a 15-year-old browser. Sooner rather than later you'll be cut off from the outside world with such obsolete software.

12 minutes ago, ketmar said:
meh, it is even safer, because nobody writes malicious exploits for such ancient OSes! ;-) i'm still using some software compiled 10+ years ago, and it Just Works.

Yes, that's the security-by-obscurity paradigm. In your case it may work because you are too far off the beaten path. But in general, if these systems still had critical mass, they would be a very tempting target for malware.
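A minimal sketch of the access-pattern problem being described: walking heap-scattered objects through pointers (potentially one cache miss per hop) versus streaming the same data from a contiguous array. The names are hypothetical; this shows the pattern, not any port's actual code.

```c
#include <stddef.h>

/* Hypothetical thinker-like object; the allocator scatters these across
   the heap, so 'next' pointers jump to unpredictable addresses. */
typedef struct thinker {
    struct thinker *next;
    float x, y, z;
} thinker_t;

/* Pointer chasing: every hop can stall on a cache miss, and the
   hardware prefetcher cannot predict where 'next' points. */
float sum_list(const thinker_t *t) {
    float s = 0.0f;
    for (; t; t = t->next)
        s += t->x;
    return s;
}

/* Contiguous traversal: the prefetcher streams whole cache lines ahead
   of the loop, so the same work runs much faster on large data sets. */
float sum_array(const float *x, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; ++i)
        s += x[i];
    return s;
}
```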
seed Posted March 20, 2020

8 minutes ago, Graf Zahl said:
That's why I cannot squeeze any more performance out of GZDoom.

But there are still other ways, aren't there? This PR by MC seemed interesting enough, apparently aimed at improving performance.