Graf Zahl Posted September 20, 2021 12 hours ago, Gustavo6046 said: I think I will side with Graf here, unprecedented move I know. Windows is dead already, guys, and for the most part legacy compatibility was just dragging their development and barely helping anyone but the niche of people who wanted to run old stuff but didn't want to port it to Linux. Yes, yes. We've been hearing this for what? 15 years by now? And it won't change for the next 15 years either, because Linux will never ever manage to get a problem-free desktop experience together with its hodgepodge of independent libraries that do not form a coherent whole. No, Windows 11 won't fail - at least not because of the upped hardware requirements. All they will mean is that initial uptake takes a bit longer than if the OS were universally compatible. Windows has been declared dead multiple times already. The driver model change for Vista was deemed a breaking moment, but once the hardware vendors adapted, all was fine again and 7 became a major success. The UI changes in 8 were deemed a breaking moment, and while that part never had any success, the UI tweaks in 10 were enough to get things back on track, and workarounds for the defective design were quickly available after launch. The higher hardware requirements won't be a breaking moment either: like before, the market will adapt and Windows will be fine again. However, once most of the older systems have been driven out, these upped hardware requirements will become one of 11's strongest assets, because they essentially mean that the OS can be cleaned of a lot of cruft needed for old hardware support. 6 Share this post Link to post
Murdoch Posted September 20, 2021 (edited) 9 hours ago, Graf Zahl said: Yes, yes. We've been hearing this for what? 15 years by now? 15 years at least. @Gustavo6046 sorry, but to declare Windows dead is frankly absurd. Linux has made great strides, but it is still nowhere close to offering the consistent and reliable interface that Joe Average User (i.e. not you) needs. Inexperienced users don't care about what this or that Linux distro does. They want things to be as consistent and familiar as possible. They dislike change. Yes, I know Windows 11 is going to bring some pretty big changes, but it's still less of a change than a shift to a completely different OS. As Graf said, for better or for worse, Windows is here to stay. It's going to be the OS of choice for the vast majority for many, many years to come. Edited September 20, 2021 by Murdoch 5 Share this post Link to post
Wadmodder Shalton Posted September 20, 2021 (edited) 5 hours ago, Graf Zahl said: The driver model change for Vista was deemed a breaking moment, but once the hardware vendors adapted, all was fine again and 7 became a major success. Actually, two different driver models coexisted for consumers in the 1990s and 2000s: the old VxD system (using the .386 file extension) for 16-bit Windows and Windows 9x, and the Windows NT Driver Model, which later evolved into the Windows Driver Model (WDM) and the Windows Driver Frameworks (WDF, formerly known as Windows Driver Foundation) for Windows NT systems. All drivers for graphics and sound cards, network devices, removable storage and other hardware are now developed exclusively against WDM and WDF; VxD is considered obsolete, no longer valued by hardware manufacturers, and no longer gets new drivers. Several display driver models also existed for consumer Windows NT systems during the 2000s: the Windows 2000 Display Driver Model (XDDM), the Windows XP Display Driver Model (XPDM) and the current Windows Display Driver Model (WDDM). Windows 8 dropped support for XDDM (maybe XPDM as well?), leaving WDDM as the only option. Edited September 20, 2021 by Wadmodder Shalton 0 Share this post Link to post
Graf Zahl Posted September 20, 2021 So, among all the hubbub about the system requirements, what about the UI changes in Windows 11? From what one can read, they thoroughly managed to cripple the taskbar into a useless accessory, similar to macOS's Dock, which is nothing more than a glorified shortcut for Launchpad. Makes me wonder if there will also be third-party replacements like for the start menu. I'm surely going to use OpenShell again to get a sane start menu back once I upgrade. 1 Share this post Link to post
Cacodemon345 Posted September 20, 2021 18 hours ago, Gustavo6046 said: I think I will side with Graf here, unprecedented move I know. Windows is dead already, guys, and for the most part legacy compatibility was just dragging their development and barely helping anyone but the niche of people who wanted to run old stuff but didn't want to port it to Linux. News flash: Plenty of the stuff and Windows-isms the average Linux user hates is right there in the Linux ecosystem, widespread. GSettings being Linux's equivalent of the Windows Registry? Check. Wayland getting forcibly shoved down the average Linux user's throat? Check. systemd becoming widespread across distros, with most of them not offering any alternatives despite it violating UNIX principles? Check. There are also other factors to take into account, like thousands of solutions getting engineered on top of Linux app containers, and the fact that there are two mainstream UI toolkits for Linux, GTK+ and Qt, with the former described as an API evolution disaster and the latter having the infamous copyright assignment requirement and commercial versions. Add to that the fact that getting an old Linux game working requires far more effort than the average user is capable of, and that unlike on macOS you need to type in your password to get shit installed, and you should understand why Linux still hasn't taken off. Those also aren't issues solvable with marketing; people will just install Windows instead. The only thing that can take off is Linux gaming, and that will be because of the Steam Deck. And that's because Steam Linux games tend to link exclusively against locally bundled libraries, so that they don't have issues on differing distro configurations. Windows is here to stay for a long time because people can make mod loader frameworks that are easy to use for the end user on the platform, and because of its backwards compatibility promise. 0 Share this post Link to post
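The "locally bundled libraries" approach mentioned above can be sketched as a plain linker invocation. This is a generic illustration, not Steam's actual build setup; the library name and paths are made up, but $ORIGIN-relative rpaths are the standard mechanism the GNU dynamic loader provides for shipping dependencies next to a binary:

```sh
# Link against a library shipped in a lib/ directory next to the binary,
# and record an $ORIGIN-relative rpath so the dynamic loader searches
# there first at runtime (names and paths are hypothetical).
gcc -o mygame main.c -Lvendored/lib -lmyengine \
    -Wl,-rpath,'$ORIGIN/lib'

# Verify which libraries the loader will actually pick up.
ldd ./mygame
```

Because the search path is relative to the binary's own location, the same build keeps working no matter which distro, or which library versions, the host system has installed.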
Cacodemon345 Posted September 20, 2021 31 minutes ago, Graf Zahl said: So, among all the hubbub about the system requirements, what about the UI changes in Windows 11? From what one can read, they thoroughly managed to cripple the taskbar into a useless accessory, similar to macOS's Dock, which is nothing more than a glorified shortcut for Launchpad. Makes me wonder if there will also be third-party replacements like for the start menu. I'm surely going to use OpenShell again to get a sane start menu back once I upgrade. There will be for the start menu, but the taskbar is considered non-modifiable by Microsoft. However, I expect the taskbar to have all of its functionality gradually restored, since it's running on a new animation engine and I'd expect Microsoft to complete the switch to it over time. One thing I noticed is that windows under Windows 11 now show proper scaling animations, 14 years and 10 months after Windows Vista's release. The rounded aspects of the Windows Vista/7 UI are also back. Also nice to see transparent blurred windows making a comeback. 0 Share this post Link to post
Blzut3 Posted September 21, 2021 13 hours ago, Graf Zahl said: I can't say that my experience is "technically minimal" impact. At work I use a Mac because my main target is iOS, but occasionally I need to use Visual Studio as well. And despite this system being significantly faster than my home PC, Windows feels very sluggish when run in the VM, so either the impact is not minimal or the VM infrastructure on macOS sucks. I would say that macOS's hypervisor infrastructure is objectively worse than Windows and Linux. I'm legit surprised at how many companies I've seen using Macs for Linux development with Docker Desktop since it's crazy inefficient for that. That said there are a lot of variables at play here. I don't know what hypervisor you're using, but I hope it's VMware Fusion or Parallels. VirtualBox truly is terrible performance-wise on macOS especially with multicore VMs (at least under the test conditions I used which involved a decent amount of network traffic). I can't say that under Fusion the Windows VMs on the 6-core Macs that I'm using build any slower than I would expect them to, although there's probably some improvement to be had on my setup given that I'm not using fixed-size disks. I've also found shared folders to have limited throughput so keep that in mind if you're using them. Although my expectations might be lower than yours since I've always found Windows to be relatively slow at compiling code even on bare-metal installs. I have heard anecdotally from a coworker that Parallels is much better than Fusion these days, but I haven't personally compared them. Now it does seem like the Mac hypervisors have less paravirtualization support compared to KVM and Hyper-V so there's likely always going to be some overhead in storage and/or networking. Raw CPU speed shouldn't be hurt much though, which is why PCIe passthrough is able to basically eliminate the downsides of VMs. 
Of course if you pass through GPU, network, and storage, you've basically eliminated all the upsides too, but it can be done on sufficient hardware. Also worth noting that starting with Big Sur Apple has been putting in some restrictions on what hypervisor software can do. 0 Share this post Link to post
Graf Zahl Posted September 21, 2021 4 hours ago, Blzut3 said: I would say that macOS's hypervisor infrastructure is objectively worse than Windows and Linux. I'm legit surprised at how many companies I've seen using Macs for Linux development with Docker Desktop since it's crazy inefficient for that. Why am I not surprised to hear that? 4 hours ago, Blzut3 said: That said there are a lot of variables at play here. I don't know what hypervisor you're using, but I hope it's VMware Fusion or Parallels. VirtualBox truly is terrible performance-wise on macOS especially with multicore VMs (at least under the test conditions I used which involved a decent amount of network traffic). I can't say that under Fusion the Windows VMs on the 6-core Macs that I'm using build any slower than I would expect them to, although there's probably some improvement to be had on my setup given that I'm not using fixed-size disks. I've also found shared folders to have limited throughput so keep that in mind if you're using them. I am using VMware Fusion with Windows 10 in the VM. The only software I frequently use in that VM is Visual Studio, and compared with my private computer it feels significantly less performant. It's not just compilation; overall the entire experience does not feel optimal. 4 hours ago, Blzut3 said: Although my expectations might be lower than yours since I've always found Windows to be relatively slow at compiling code even on bare-metal installs. I can't say that. For me Visual Studio has always been very fast at compiling. The only compile-time-related issue I ever had is that the precompiled header setup in GZDoom and Raze tends to break when the number of source files becomes large. 4 hours ago, Blzut3 said: I have heard anecdotally from a coworker that Parallels is much better than Fusion these days, but I haven't personally compared them. That may be, but I only got a license for VMware Fusion from my employer, so... 
4 hours ago, Blzut3 said: Now it does seem like the Mac hypervisors have less paravirtualization support compared to KVM and Hyper-V so there's likely always going to be some overhead in storage and/or networking. Raw CPU speed shouldn't be hurt much though, which is why PCIe passthrough is able to basically eliminate the downsides of VMs. Of course if you pass through GPU, network, and storage, you've basically eliminated all the upsides too, but it can be done on sufficient hardware. 4 hours ago, Blzut3 said: Also worth noting that starting with Big Sur Apple has been putting in some restrictions on what hypervisor software can do. Yay, Apple... :? 0 Share this post Link to post
Cacodemon345 Posted September 21, 2021 8 hours ago, Blzut3 said: Also worth noting that starting with Big Sur Apple has been putting in some restrictions on what hypervisor software can do. Which ones? Right now I only saw mentions of requiring an entitlement to use the hypervisor. 0 Share this post Link to post
Maes Posted September 21, 2021 (edited) On 9/20/2021 at 1:14 AM, Gustavo6046 said: Is that really the case? There has been a lot of development in virtualization technology lately, I wouldn't be surprised if hypervisors are now running close to bare metal speed, and even supporting native(ly bridged) 3D acceleration on top of that. VMs for gaming is becoming more and more of a reality. AFAIK the main problem with VMs is not CPU speed itself (after all, that's exactly the one advantage they have over a traditional emulator), but that they suck at maintaining accurate timing of the hardware subsystems, which is critical if you're trying to run real-time systems or games in them, especially the old single-tasking kind (like DOS). Simply put, that kind of software runs all over the place inside a VM: either too slow or too fast, but usually too slow and at an inconsistent rate, especially video sync. When your VM becomes so specialized that it can handle that sort of software, then we're talking more about a specialized emulator like DOSBOX, and you start drifting away from what a VM is meant to do (provide virtualization/isolation for "cleanly" programmed stuff, like productivity software, with minimal system overhead compared to a full-fledged emulator). Audio seems to be the only aspect that VMs get kind of right, and that's probably just because they use buffers and can negotiate a constant playback sample rate with the "outside world" - something you cannot really do with video - and not because they have down-to-register accurate timing emulation of a GUS or a SoundBlaster (again, that's what you'd expect from DOSBOX). 10 hours ago, Graf Zahl said: I can't say that. For me Visual Studio has always been very fast at compiling. The only compile-time-related issue I ever had is that the precompiled header setup in GZDoom and Raze tends to break when the number of source files becomes large. 
Incidentally, the same kind of problem occurs with gfortran and the Intel Fortran compiler under VS: Fortran uses "modules", which are essentially a cross between a traditional object file and a precompiled header in the newfangled (for C++) sense. TBQH I never liked the concept, as it introduces dependencies of one source file on others and forces a specific compilation order on the entire project, reducing build parallelization and eventually breaking if the complexity, sheer number or repetition of "USE" statements (the rough equivalent of an #include) becomes too large. Edited September 21, 2021 by Maes 0 Share this post Link to post
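The timing problem described above is easy to quantify, by the way. The sketch below (plain Python, with an arbitrary 1 ms target rate and sample count of my choosing) measures how far a nominally fixed-rate loop drifts from its ideal schedule; run on bare metal it shows the baseline jitter of the host scheduler, and inside a VM the same measurement typically comes out far worse and far less consistent, which is exactly what old single-tasking games can't tolerate:

```python
import time

def measure_tick_jitter(target_ms=1.0, samples=200):
    """Run a fixed-rate loop and record how late (in ms) each tick
    actually fires relative to its ideal deadline."""
    interval = target_ms / 1000.0
    deadline = time.perf_counter()
    jitter = []
    for _ in range(samples):
        deadline += interval
        # Emulate a game's frame pacing: sleep until the next tick.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        # How far past the deadline did we actually wake up?
        jitter.append((time.perf_counter() - deadline) * 1000.0)
    return jitter

j = measure_tick_jitter()
print(f"worst jitter: {max(j):.3f} ms, mean: {sum(j)/len(j):.3f} ms")
```

A guest OS sees the same effect amplified: its virtual timer interrupts only fire when the host deigns to schedule the VM, so software that busy-waits on raw timing (rather than buffering, as audio does) falls apart.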
Blzut3 Posted September 21, 2021 (edited) 12 hours ago, Cacodemon345 said: Which ones? Right now I only saw mentions of requiring an entitlement to use the hypervisor. Well, requiring the use of Apple's hypervisor instead of allowing third-party kexts is in itself one such restriction. Along with that comes the requirement to use vmnet and macOS's firewall to implement virtual networking. Just search for people having issues with mixing VPNs and VMs on Big Sur for the implications of this. I have heard that Parallels has worked around the VPN issue, but VMware has so far just acknowledged that it's an issue and hasn't done anything about it, since it's unclear who is supposed to solve it (Apple, the hypervisor developer, or the VPN client). Another issue my coworkers have run into is that it also hands control of the DHCP server to macOS, which makes static assignment of VM addresses difficult. I don't know enough to comment on whether this move will be great in the long run, but what I can say right now is that they are creating restrictions that make Catalina somewhat better for VMs than Big Sur with the current software. Edited September 21, 2021 by Blzut3 0 Share this post Link to post
Graf Zahl Posted September 22, 2021 8 hours ago, Blzut3 said: I don't know enough to comment on whether this move will be great in the long run, but what I can say right now is that they are creating restrictions that make Catalina somewhat better for VMs than Big Sur with the current software. Unfortunately that doesn't really help, because Xcode already requires Big Sur, so all of us developers are screwed if we need such a combination. Funny that none of this generates any big news, but the moment Microsoft farts it's all over the place in no time. 0 Share this post Link to post
wallabra Posted September 22, 2021 On 9/20/2021 at 2:58 PM, Cacodemon345 said: News flash: Plenty of the stuff and Windows-isms the average Linux user hates is right there in the Linux ecosystem, widespread. GSettings being Linux's equivalent of the Windows Registry? Check. Wayland getting forcibly shoved down the average Linux user's throat? Check. systemd becoming widespread across distros, with most of them not offering any alternatives despite it violating UNIX principles? Check. There are also other factors to take into account, like thousands of solutions getting engineered on top of Linux app containers, and the fact that there are two mainstream UI toolkits for Linux, GTK+ and Qt, with the former described as an API evolution disaster and the latter having the infamous copyright assignment requirement and commercial versions. Add to that the fact that getting an old Linux game working requires far more effort than the average user is capable of, and that unlike on macOS you need to type in your password to get shit installed, and you should understand why Linux still hasn't taken off. Those also aren't issues solvable with marketing; people will just install Windows instead. The only thing that can take off is Linux gaming, and that will be because of the Steam Deck. And that's because Steam Linux games tend to link exclusively against locally bundled libraries, so that they don't have issues on differing distro configurations. Windows is here to stay for a long time because people can make mod loader frameworks that are easy to use for the end user on the platform, and because of its backwards compatibility promise. Apologies for my preposterousness, but I have to say this, because it is essentially misinformation - and it's not like you're doing any better anyway. You clearly don't know what you're talking about: GSettings and GTK are part of the GNOME ecosystem, not GNU/Linux itself. 
GTK is not an "API evolution disaster" – that'd be wxWidgets and its horrendous slew of breaking incompatibilities introduced every minor version. Of the three GUI frameworks, I think I'm fondest of GTK: it has a GUI design application (Glade) and an OOP-ish programming language made to compile to C that uses GObject (Vala). And even then, there's way more than just those traditional libraries floating around – go take a look! For instance, have you ever heard of dear-imgui? Wayland is not "forcibly shoved down" anyone's throat. Quite the opposite – X.org is sadly still the standard on the vast majority of Linux distros. Wayland has just been developed too slowly, especially given its age. It needs to hasten. systemd is indeed pretty shitty, but while I will grant you that it's very, very non-trivial to switch from it to a different init system on distributions that already use systemd, there is a better alternative – it's called OpenRC, and it's already used by distributions like Alpine Linux and Gentoo Linux. OpenRC is more lightweight and doesn't try to dominate every aspect of your system, from networking to date and time. Heck, almost everything in a GNU/Linux system is swappable and has alternatives. Can't say that for Windows. You see the problem now? Even with all the drawbacks of the current(ly mediocre) Linux userspace ecosystem, it is a rapidly evolving one – with a massive community, bigger than the BSDs (which I kind of prefer) could ever dream of achieving, there are inevitably a lot of man-hours spent checking, writing and contributing code. Maybe that is part of the issue – with so much development at such a fast pace, everything becomes a race, and stability is jeopardized. Getting a Windows game working under Linux is usually a breeze thanks to Wine, and even more so nowadays thanks to Steam's Proton. 
Even with Wine, the performance is near-native, with the exception of things like graphics API translation (Direct3D via WineD3D is old and buggy, so try to use OpenGL or, better yet, Vulkan whenever possible). Old Linux programs often don't work because libraries keep being updated and security keeps being patched. If you want to run them, REBUILD THEM FROM SOURCE. Most Linux programs are open-source - in fact, free and open-source software (FOSS) - so that's what you're meant to do. That's the main advantage Linux has: its source is out there, which means a lot more than you think. Can't say that of Windows, even though Microsoft is at least starting to go that way (personally it smells like Embrace-Extend-Extinguish, although that may as well just be fearmongering on my part – we'll see!). Linux development is much better than Windows development, and best of all, it doesn't cost a dime. Sure Code::Blocks is free, unlike Visual Studio. Well, newsflash: Code::Blocks is also available for Linux, and there, to get all your favourite libraries and compilers, you can simply use your package manager! No risk of DLL hell or of scattering files around and stuff! But nah, command lines are scary and they make me soil my diapers! Windows will stay for a long time, but for every shady, shitty, scammy update that Microsoft puts out (I mean, Windows 11 really sucks for a lot of reasons - and no, not supporting 32-bit processors is not one of them, kids!), another wave of people will discover and try Linux. Hopefully they'll steer clear of Ubuntu and try something like OpenSUSE. (Canonical is shady.) Then they'll have their minds blown. Unless they're American, of course. 0 Share this post Link to post
Edward850 Posted September 23, 2021 1 hour ago, Gustavo6046 said: Sure Code::Blocks is free, unlike Visual Studio. I have never once paid for Visual Studio. And not due to any grey market keys or anything. Like legit I don't think you've ever actually used Visual Studio. 4 Share this post Link to post
Blzut3 Posted September 23, 2021 3 hours ago, Gustavo6046 said: systemd is indeed pretty shitty, but while I will grant you that it's very, very non-trivial to switch from it to a different init system on distributions that already use systemd, there is a better alternative – it's called OpenRC, and it's already used by distributions like Alpine Linux and Gentoo Linux. OpenRC is more lightweight and doesn't try to dominate every aspect of your system, from networking to date and time. This is like saying GNOME is bad because it tries to be your media player and your web browser. There's a difference between systemd the umbrella project and systemd the init system, and most of the people I see complaining about systemd seem not to understand that. You can use systemd the init system without timesyncd or networkd (in fact networkd hasn't seen much adoption), and neither has ever been part of PID 1. There's a reason that systemd the init system has won out over the alternatives, and that is it makes life easier for those who maintain the vast majority of the init system (i.e. your distro developers). Yes, it can do a lot for you, but these are just things that you'd otherwise do manually in your RC scripts. Rewriting the same thing over and over again produces bugs and inconsistent behavior. That's what systemd solves. I'm not going to say there definitely aren't ways to do it better, but I don't think going back to arbitrary imperative scripts is it. 3 hours ago, Gustavo6046 said: Old Linux programs often don't work because libraries keep being updated and security keeps being patched. If you want to run them, REBUILD THEM FROM SOURCE. Most Linux programs are open-source - in fact, free and open-source software (FOSS) - so that's what you're meant to do. Yes, that's a wonderful answer to give someone complaining about not being able to run proprietary software on newer versions of Linux. 
The funniest thing to me is that all the infrastructure work is in place for good backwards compatibility; it's just this mentality of "just rebuild the world" that holds distros back from offering old libraries to those who want them. Old libraries could be offered today on an as-is basis, but they're not, and so people get the impression that they can't run old software on newer distros. Personally I can get the Loki games to run on modern Linux fairly easily, but my ability to do that comes from a ton of domain-specific experience which most people don't have. For better or for worse, this attitude is why everyone wants to containerize everything regardless of whether it makes sense. If the distros are going to stick to their security-over-everything-else philosophy, then people will just effectively static link everything. But I guess all the security vulnerabilities are in a cgroup so yay? 0 Share this post Link to post
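The "things you'd do manually in your RC scripts" point is easiest to see in a unit file. This is a generic sketch, not taken from any real package; the service name, user, and paths are invented for illustration:

```ini
# /etc/systemd/system/mygame-server.service  (hypothetical example)
[Unit]
Description=Example game server
After=network-online.target   ; dependency ordering an RC script hand-rolls

[Service]
ExecStart=/opt/mygame/bin/server --port 5029
User=mygame                   ; privilege dropping, no su wrapper needed
Restart=on-failure            ; supervision without pidfile juggling
ProtectSystem=strict          ; sandboxing a shell script can't easily add

[Install]
WantedBy=multi-user.target
```

Each of those directives replaces logic that every RC script would otherwise reimplement by hand, each in its own subtly different and subtly buggy way, which is exactly the "rewriting the same thing over and over" problem described above.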
Maes Posted September 23, 2021 (edited) Well, from the stuff that gets flung against Linux in this thread, it seems not much has changed in the 10+ years since I posted an old thread on the subject, which I can't be arsed to find. But the gist of it was that for everything that ails you in Linux, "just compile from source". If you cannot, then it's not worth using. Precompiled binary distributions work only some of the time, have to be distro-specific unless we're talking about very clean, LSB-compliant stuff (exactly the opposite of what you need a typical game to be...), and even then hardcore Linux gurus would not be happy because, after all, The Only Way is to Compile From Source. Did I mention that you'd better compile from source? Edited September 23, 2021 by Maes 1 Share this post Link to post
Graf Zahl Posted September 23, 2021 4 hours ago, Blzut3 said: Yes, that's a wonderful answer to give someone complaining about not being able to run proprietary software on newer versions of Linux. The funniest thing to me is that all the infrastructure work is in place for good backwards compatibility; it's just this mentality of "just rebuild the world" that holds distros back from offering old libraries to those who want them. Old libraries could be offered today on an as-is basis, but they're not, and so people get the impression that they can't run old software on newer distros. Personally I can get the Loki games to run on modern Linux fairly easily, but my ability to do that comes from a ton of domain-specific experience which most people don't have. For better or for worse, this attitude is why everyone wants to containerize everything regardless of whether it makes sense. If the distros are going to stick to their security-over-everything-else philosophy, then people will just effectively static link everything. But I guess all the security vulnerabilities are in a cgroup so yay? Well, that's where both Windows and macOS ended up with their app installations. I wonder why... Maybe it isn't such a bad thing after all that the app just... works? I think it has been conclusively proven by now that dynamically linking against libraries that have to be provided by external parties is not a viable mode of operation. And yet, when it comes to Linux, the first thing some people want to change about others' projects is how they link their dependencies, never mind that these settings were chosen for a reason. 23 minutes ago, Maes said: Well, from the stuff that gets flung against Linux in this thread, it seems not much has changed in the 10+ years since I posted an old thread on the subject, which I can't be arsed to find. But the gist of it was that for everything that ails you in Linux, "just compile from source". If you cannot, then it's not worth using. Very true. 
But it isn't a big step from here to the opposite view: "If Linux cannot provide stable app installations, it isn't worth using." And I'm sorry to say, that's where it stands for many, many potential users. 25 minutes ago, Maes said: Precompiled binary distributions work only some of the time, have to be distro-specific unless we're talking about very clean, LSB-compliant stuff (exactly the opposite of what you need a typical game to be...), and even then hardcore Linux gurus would not be happy because, after all, The Only Way is to Compile From Source. Did I mention that you'd better compile from source? Proving my often-stated view that Linux's biggest problem is its core user base (i.e. what you call 'gurus'). My suspicion is that a large part of these people come from academia and have lost track of how the real world ticks. They hold their deadlocked views from within their ivory tower, but nothing they see from there matters where 'real work' is done with computers - and this does not just affect Linux. IMO it's so bad that most of what these people say has to be taken with a grain of salt; I've lost count of the nonsensical predictions that came out of this corner over the last 20+ years. But since they are the core opinion makers, it is inevitable that the sheep will follow their ways, never mind that this isn't doing anyone a favor. The sad thing is that all this could be solved and make Linux a viable system to fight the Microsoft/Apple duopoly on the desktop but they obviously prefer to tear themselves apart in these ridiculous turf wars that lead nowhere. 
And no this will not happen unless these people get their act together - not that I have any hope they'll ever do: 8 hours ago, Gustavo6046 said: Windows will stay for a long time, but for every shady, shitty, scammy update that Microsoft puts out (I mean, Windows 11 really sucks for a lot of reasons - and no, not supporting 32-bit processors is not one of them, kids!), another wave of people will discover and try Linux. Hopefully they'll steer clear of Ubuntu and try something like OpenSUSE. (Canonical is shady.) 2 Share this post Link to post
Maes Posted September 23, 2021 44 minutes ago, Graf Zahl said: The sad thing is that all this could be solved and make Linux a viable system to fight the Microsoft/Apple duopoly on the desktop but they obviously prefer to tear themselves apart in these ridiculous turf wars that lead nowhere. That's exactly what Apple did with MacOS X, providing a better "Linux/Unix for the masses". Too bad that it works only on THEIR computers (Hackintoshes notwithstanding). And that the end result was further cementing the duopoly because, guess what, it existed even before MacOS X came to be, albeit with Apple's fixed 10% desktop market share that they kept for most of the 1980s and 1990s. Tertium non datur, unfortunately. 1 Share this post Link to post
Rakuen Posted September 23, 2021 4 hours ago, Graf Zahl said: The sad thing is that all this could be solved and make Linux a viable system to fight the Microsoft/Apple duopoly on the desktop but they obviously prefer to tear themselves apart in these ridiculous turf wars that lead nowhere. And no this will not happen unless these people get their act together - not that I have any hope they'll ever do: Obligatory xkcd comic. 1 Share this post Link to post
Graf Zahl Posted September 23, 2021 It's not about standards but about options. Both Microsoft and Apple are nasty, vile behemoths that have long forgotten that customer satisfaction is a thing. (The Windows 11 announcement is a textbook example for that!) And yet they have the entire market to themselves, with some of the Linux distros even playing catch-up with their vile-ness. 4 Share this post Link to post
ketmar Posted September 23, 2021 9 hours ago, Graf Zahl said: The sad thing is that all this could be solved and make Linux a viable system to fight the Microsoft/Apple duopoly on the desktop trying to achieve this already damaged the system a lot. shit like wayland, systemcrash, pshhhaudio, etc. sadly, GPU lock-in blocks me from running away. but having sane system without that "let's make it better for desktops" crap is harder and harder every day. 0 Share this post Link to post
Graf Zahl Posted September 23, 2021 That's maybe because none of the things you listed is in any way relevant for "Let's make the desktop better". It's just the same hodgepodge all over again that's dragging Linux down. What a functional desktop system needs is a coherent UI layer that provides a complete API for developing desktop solutions. But yeah, that probably won't be GNU/Linux, but Desktop/Linux. We still need such a thing to get out of the Microsoft/Apple stranglehold. As long as Linux treats the desktop as an afterthought it won't be 'discovered' by the masses. 1 Share this post Link to post
Blzut3 Posted September 24, 2021 7 hours ago, Graf Zahl said: That's maybe because none of the things you listed is in any way relevant for "Let's make the desktop better" I don't follow; all the components listed are definitely part of making the desktop behave the way that users expect it to. Of the three listed, wayland is perhaps the most debatable, although it seems like since Ubuntu started switching we're seeing a lot more movement on fixing the remaining glaring issues, so we'll see. The jury's still out, but I had some pleasant surprises, like the Loki port of Descent 3 working better under XWayland than on native X11 while I was checking it out. (Since that game is particularly quirky, given that it was designed for CRTs and running in full screen.)

Admittedly the GUI tools are lacking for PulseAudio. Which isn't to say that there aren't GUI tools (every DE has one these days), but as someone who uses the more advanced features of PA, there are some things I wish could be done without breaking out the command line. For example, routing one device's input (say, a capture card) through another device's output (your speakers) is, as far as I know, still only possible with pactl. The API is all there; I'm guessing the GUIs just don't expose it because, although I think creating loopbacks would be useful to a lot of people, most people only have one sound device in their system, so I can't blame developers too much for being indifferent. I've been in a few debates about PA where people had no idea what it can actually do, though.

It is true that there are only so many people willing to make the changes necessary to move the Linux desktop experience forward. You have to be pretty thick-skinned to do it, since every attempt will be met with a ton of resistance from people who don't want anything to change from what they learned two or three decades ago.
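The loopback routing described above can be sketched with PulseAudio's module-loopback (a sketch, not from the thread; the device names are made-up placeholders, list the real ones with pactl list short sources and pactl list short sinks):

```shell
# One-off routing of a capture device's input into a playback device's output
# (lasts until the PulseAudio daemon restarts). Both device names below are
# hypothetical examples, not real hardware names.
pactl load-module module-loopback \
    source=alsa_input.usb-CaptureCard.analog-stereo \
    sink=alsa_output.pci-0000_00_1f.3.analog-stereo

# The same line, minus "pactl", can go in ~/.config/pulse/default.pa to
# recreate the loopback on every start:
#   load-module module-loopback source=... sink=...
```

Undoing it works with `pactl unload-module module-loopback`, or with the module index that load-module prints.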
I mean, you know how worked up the vocal minority can get just from unpopular moves in GZDoom; scale that up to the Linux user base. 17 hours ago, Graf Zahl said: I think it has been conclusively proven by now that dynamic linking to libraries that need to be provided by external parties is not a viable mode of operation. As with all the other times we've had this discussion, I disagree. There's definitely a middle ground between the two extremes that works perfectly fine. I agree with you when someone comes in insistent that libraries with no ABI stability policy (like gme) be system linked, but there are plenty of libraries where distros would only need to serve up a handful of versions to maintain decades of app compatibility. I've seen multiple instances where changing out the libraries with newer ones fixes issues or adds new features for free without breaking apps. Things just aren't the same as they were in the 90s on both sides of the fence, but yes, some people want to pick one ideology and apply it to everything. 0 Share this post Link to post
Graf Zahl Posted September 24, 2021 4 hours ago, Blzut3 said: I don't follow; all the components listed are definitely part of making the desktop behave the way that users expect it to. Of the three listed, wayland is perhaps the most debatable, although it seems like since Ubuntu started switching we're seeing a lot more movement on fixing the remaining glaring issues, so we'll see. The jury's still out, but I had some pleasant surprises, like the Loki port of Descent 3 working better under XWayland than on native X11 while I was checking it out. (Since that game is particularly quirky, given that it was designed for CRTs and running in full screen.) I don't think we are talking about the same things here. While low-level components are definitely needed for a better desktop experience, they do not address the core problem, namely that the desktop is still an incoherent mix of independently developed stuff and not a serious, coherent desktop component. It's merely the driver on which a desktop can run. But in typical Linux fashion we got another external module that may or may not be present, so apps may or may not depend on it, and we're no better off. And on the GUI layer we got GTK+, Qt and FLTK, each of them visually different, and as a result we get nonsense requests for GUI apps to "please use FLTK instead of shitty GTK+" or "I object to Qt's terms of use so I refuse to use your app." If that is how things are done, we're still a long way from a working result. It doesn't work with the "my way or the highway" mentality that often runs rampant among the more hard-core Linux people. 4 hours ago, Blzut3 said: It is true that there are only so many people willing to make the changes necessary to move the Linux desktop experience forward. You have to be pretty thick-skinned to do it, since every attempt will be met with a ton of resistance from people who don't want anything to change from what they learned two or three decades ago.
I mean, you know how worked up the vocal minority can get just from unpopular moves in GZDoom; scale that up to the Linux user base. I can imagine, and yes, this is one of the biggest problems. 4 hours ago, Blzut3 said: As with all the other times we've had this discussion, I disagree. There's definitely a middle ground between the two extremes that works perfectly fine. I agree with you when someone comes in insistent that libraries with no ABI stability policy (like gme) be system linked, but there are plenty of libraries where distros would only need to serve up a handful of versions to maintain decades of app compatibility. I've seen multiple instances where changing out the libraries with newer ones fixes issues or adds new features for free without breaking apps. Things just aren't the same as they were in the 90s on both sides of the fence, but yes, some people want to pick one ideology and apply it to everything. Yes, it can work, but the problem still remains: you'd be at the mercy of a third party not to break your app. If you do it the Windows/Mac way, providing the .so's in your application directory right next to the executable, you not only make yourself independent of that third party but also keep the option to change out the libraries if the need arises. Of course that'd mean you can't copy everything to /usr/bin or whatever else you have to name your main directory for executables; it'd require leaving behind some of those arcane Unix conventions made up at a time when things were decidedly simpler, and that's probably yet another task that's orders of magnitude harder to accomplish. If we want Linux to become more popular it will have to do a few things the Windows way, or it'll go nowhere at all. 1 Share this post Link to post
dpJudas Posted September 24, 2021 (edited) 5 hours ago, Blzut3 said: It is true that there are only so many people willing to make the changes necessary to move the Linux desktop experience forward. You have to be pretty thick-skinned to do it, since every attempt will be met with a ton of resistance from people who don't want anything to change from what they learned two or three decades ago. In my opinion the big problem for the Linux desktop has been that the Linux kernel developers considered it "user space" and ignored the problem. Meanwhile, in "user space", the X11 developers did nothing for 25 years, seemingly expecting it to be the job of KDE and GNOME. The end result is that if you want to write a simple Hello World UI application on Linux you have three options:

1) Use X11's original libraries. Almost untouched since the early 90s. It has one setting for the monitor DPI.

2) Use Qt and KDE. Codingwise the nicest option, but since Debian-based distros generally gravitated towards GNOME, that means apt-get installs 25 packages or however many it suggests. Super crappy design. It also has its own setting for the monitor DPI.

3) Use GTK+ and GNOME. It started out as a desperate attempt at stopping KDE over licensing, and it shows; I have never seen a worse UI toolkit than this thing. Also, if you use a distro that gravitated towards KDE, apt-get installs 25 different packages if you go this way. Super crappy design. Also, a third setting for the monitor DPI.

See where I'm going here? There's just no way to target Desktop Linux as a whole with what they built on top of X11. The story about sound was just as horrible, though nowadays libalsa2 seems to "just work" for opening a sound device, so there's at least that progress. Meanwhile on macOS you just use AppKit and are done with it. And what's your reward if you go through all this nonsense? 1% more market share. That's why only open source projects support this platform.
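The three separate DPI settings above correspond to three independent knobs; a minimal sketch with arbitrary example values (the resource and variable names are the real ones, the numbers are illustrative only):

```shell
# Per-toolkit scaling on a Linux desktop; the values are examples, not advice.
# X11/Xft applications read the Xft.dpi X resource, e.g. a line in ~/.Xresources:
#   Xft.dpi: 144
export QT_SCALE_FACTOR=1.5   # Qt applications scale everything by this factor
export GDK_SCALE=2           # GTK 3+ applications use integer window scaling
echo "Qt=$QT_SCALE_FACTOR GTK=$GDK_SCALE"
```

Each toolkit stack reads only its own setting, which is exactly the fragmentation being complained about; a desktop environment typically tries to keep the three in sync for you.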
Edited September 24, 2021 by dpJudas 0 Share this post Link to post
Blzut3 Posted September 24, 2021 1 hour ago, Graf Zahl said: But in typical Linux fashion we got another external module that may or may not be present, so apps may or may not depend on it, and we're no better off. Windows IoT doesn't necessarily have all the components that your software expects, and yet you don't consider that a problem. So why is it a problem that Linux also has some arcane setups? (In case you want to dismiss this as "no one runs Windows IoT for desktop", consider stuff like the "Windows 9" or whatever projects which are based off these editions of Windows.) Or put another way, you constantly make arguments on what to support based on it being "single digit percentages of the user base" or something like that. Why is it that with Linux you suddenly care about the 1% of users? 1 hour ago, Graf Zahl said: And on the GUI layer we got GTK+, Qt and FLTK, each of them visually different, and as a result we get nonsense requests for GUI apps to "please use FLTK instead of shitty GTK+" or "I object to Qt's terms of use so I refuse to use your app." Yes, that would be nonsense that can be ignored. While as a KDE user I have a slight preference towards Qt applications, at the end of the day it doesn't matter to me, and again I can't imagine most users care. For whatever it's worth, I don't recall ever getting any weird request like this with Doomseeker. 1 hour ago, Graf Zahl said: Of course that'd mean you can't copy everything to /usr/bin or whatever else you have to name your main directory for executables; it'd require leaving behind some of those arcane Unix conventions made up at a time when things were decidedly simpler, and that's probably yet another task that's orders of magnitude harder to accomplish. This isn't mutually exclusive and is already solved. Set the RPATH/RUNPATH in your binary to /usr/lib/<package-name>/ and install your private shared object files there.
Now I do agree that the unix file hierarchy could probably use a rethink, but the status quo includes support for this scenario. 1 hour ago, dpJudas said: See where I'm going here? There's just no way to target Desktop Linux as a whole with what they built on top of X11. Again, I don't know why people suddenly get concerned about the minority that cares when they talk about Linux. Use Qt or GTK+ (or another commonly available toolkit if you're so inclined, I suppose) and don't worry about it. I would guess most people have both installed anyway. KDE users are probably more likely to have GTK+ installed, since both Chrome and Firefox depend on it, while it's a lot easier to incidentally avoid Qt. (Although there are many popular apps that require Qt, like OBS and VLC.) I can't comment on the DPI settings thing since I've only used Linux on standard-PPI displays; KDE is usually pretty decent at bridging GTK+ apps, so I'm surprised they wouldn't make the corresponding adjustment. In any case, support for high-PPI displays is definitely something that Linux still needs to catch up on, from what I hear. Now it's indeed a bit annoying to have this fragmentation for apps like GZDoom, where it seems silly to pull in a whole toolkit as a dependency just to draw one or two simple dialogs. It's not like it would be the end of the world to require one or the other, though. 1 hour ago, dpJudas said: And what's your reward if you go through all this nonsense? 1% more market share. No, it's a tiny fraction of that 1%; you get most of that 1% by just picking one toolkit and ignoring the minority. From my point of view, if there's anything that makes supporting a wide array of distros and users difficult, it's the fragmentation in packaging formats. From a commercial software perspective there's not really a good solution for that besides stuff like flatpak/appimage.
Although depending on what your software's audience is, distributing a deb and rpm is still often a viable solution to cover basically the whole market. 0 Share this post Link to post
Graf Zahl Posted September 24, 2021 6 minutes ago, Blzut3 said: Windows IoT doesn't necessarily have all the components that your software expects, and yet you don't consider that a problem. So why is it a problem that Linux also has some arcane setups? (In case you want to dismiss this as "no one runs Windows IoT for desktop", consider stuff like the "Windows 9" or whatever projects which are based off these editions of Windows.) Or put another way, you constantly make arguments on what to support based on it being "single digit percentages of the user base" or something like that. Why is it that with Linux you suddenly care about the 1% of users? Do you want Microsoft to keep a quasi-monopoly on the desktop? The entire Windows 11 announcement tells me that we need a better option, one where a single manufacturer cannot simply deprecate half of the existing computers without caring about the consequences - but with how Linux works right now it's never going to cut it, because it's an experts' system, not one for the masses. So Apple is the only alternative, which will only make things worse. 6 minutes ago, Blzut3 said: Yes, that would be nonsense that can be ignored. While as a KDE user I have a slight preference towards Qt applications, at the end of the day it doesn't matter to me, and again I can't imagine most users care. Considering that Windows users already get worked up about the differences between UWP, Win32 and .NET, I don't think it can be ignored. This is constantly cited as macOS's strongest point (i.e. that it has a consistent UI with little deviation from the given standard). 6 minutes ago, Blzut3 said: For whatever it's worth, I don't recall ever getting any weird request like this with Doomseeker. This isn't mutually exclusive and is already solved. Set the RPATH/RUNPATH in your binary to /usr/lib/<package-name>/ and install your private shared object files there. Sounds like a mess to me. Sorry to disappoint you.
What I'd really prefer is how macOS handles applications: you get a self-contained folder below "Applications" and everything the application needs is in there, with no references to non-system libraries. But even proposed solutions like Flatpak suffer from being an external add-on, not part of the system. Obviously none of that can ever work as long as virtually everything aside from the kernel is considered "user space" (i.e. "we do not care"). 1 Share this post Link to post
Cacodemon345 Posted September 24, 2021 (edited) GNUstep actually does feature a containerized app system that works like macOS's, but almost all of the other developers involved in the Linux ecosystem ignore it. I think the reason the Linux desktop still remains a joke is that they actively ignore the reasons that made macOS great. Edited September 24, 2021 by Cacodemon345 0 Share this post Link to post
ducon Posted September 24, 2021 11 minutes ago, Graf Zahl said: but with how Linux works right now it's never going to cut it, because it's an experts' system, not one for the masses. So Apple is the only alternative, which will only make things worse. When I have to use Windows, I don’t know how it works or what to do if I want to do this or that. It’s a mess, and actually, if you want to do things in Windows (for example to fix it, or even to kill a crashing application), you really need to be an expert. Not so in Linux. 4 minutes ago, Cacodemon345 said: I think the reason the Linux desktop still remains a joke is that they actively ignore the reasons that made macOS great. So the Linux desktop is a joke because macOS is closed source? 1 Share this post Link to post
Cacodemon345 Posted September 24, 2021 8 minutes ago, ducon said: So the Linux desktop is a joke because macOS is closed source? Did macOS being closed source make it great? What? 0 Share this post Link to post