Blzut3 Posted May 6, 2023 (edited) 45 minutes ago, LexiMax said: Windows doesn't ship SDL. If you want to use SDL, you vendor it. Windows is set up for this to be the easy and expected way you handle your business on the platform. Of course it doesn't; you missed the point. If we step back and ignore that one of the features of SDL is to assist in writing cross-platform applications (pretend for a moment that SDL only worked on Linux), then you could swap SDL out for Win32 and DirectX on Windows (i.e. like GZDoom does). Thus they are analogous even if SDL is a tiny subset of what the Windows APIs offer. If you're dynamically linking the system copy of SDL then you're treating it as a core part of the system (and thanks to SDL's ABI stability you can typically do so). Hence you're applying a double standard: you're OK with Microsoft not back-porting new APIs to old versions of Windows, but an analogous part of the Linux system must always have the latest APIs available to you. Of course you do have the option to static-link/vendor it on Linux too if you must have the newer feature, and that's an option you don't have for the analogous part of Windows. Somehow having this flexibility on the developer's part is a negative for the platform? To flip it around, it would be like me writing software that only targets Windows 11 and then complaining about stuff missing in Windows 10, when on Linux I can just vendor Wine with my application and get the latest APIs. (I know there have been efforts to get Wine working on Windows, so it's actually less absurd a comparison than you might think.) To me the primary difference between Windows and Linux in this context is that on Windows it's far easier to answer the question "what is a system library?" Edit: Since you edited your post while I was replying. 
Quote Perhaps Linux distros shouldn't ship the entire open source ecosystem in its package manager if they can't fight their urge to tie dependencies together in a neat little bow. This I mostly agree with. I don't think distros shipping a whole ecosystem is inherently bad, but one of the small policy changes that I think could go a long way would be for distros to define a small subset of libraries whose ABIs are guaranteed to be supported for, say, 10 years past when a new ABI gets defined (even if this means multiple versions of the library need to be maintained side by side). This subset would essentially be the SDK for the platform, and everything outside of it would live in a separate repo providing desktop apps and convenience libraries for local development. Fairly analogous to macOS and how people use brew, except that as a developer I'd know I can rely on the stuff this repo gives me. Edited May 6, 2023 by Blzut3
LexiMax Posted May 6, 2023 (edited) 1 hour ago, Blzut3 said: I don't think distros shipping a whole ecosystem is inherently bad I'm not really sure what you mean by "inherently" bad, but I feel very strongly about this, because to me this is Linux's "original sin." It was probably done with good intentions initially, but it has led to the status quo of today, where the #1 software distribution method on Linux is a distro repository, and if your needs fall outside that, you are fighting an uphill battle against both the ecosystem and the culture of Linux. I suppose what's most frustrating to me is the realization that it's not obvious there's a problem unless you've been in the sorts of situations where you run into its footguns. I ran Linux for quite some time as a daily driver in my webdev job - my tech stack was almost completely open source and it performed swimmingly. But when I switched industries and started shipping commercial video games on Windows and game consoles, I suddenly became very aware of how ill-suited Linux was for the requirements of that industry - closed source, distributing binaries, vendoring the universe, and at some point having to call a game "done" and move on to the next project. There's no technical reason why Linux couldn't be a good fit for that kind of development, but it's very much at odds with the repo model of distributing software that most of Linux userland seems to blissfully assume, and what ticks me off is that getting Linux users to even acknowledge that there's a problem worth solving is like pulling teeth. Edited May 6, 2023 by LexiMax
Blzut3 Posted May 6, 2023 51 minutes ago, LexiMax said: I'm not really sure what you mean by "inherently" bad What I mean is that while I can certainly see your argument that it helps perpetuate the culture (it's an easy thing to point at for new users to get excited over), it is, at the very least, not the only cause. We see similar setups being replicated to varying degrees for macOS and Windows (brew, chocolatey, msys2, winget, even vcpkg), but they have different policies. The difference between the status quo and something that would be genuinely useful and supportive of third-party software distribution isn't as large as it may seem. Many of the problems are even solved (or at least partially solved), but the features go unused because packaging policy says so. That, and many of the people in charge are deliberately adversarial to proprietary software.
Graf Zahl Posted May 6, 2023 5 hours ago, Edward850 said: So few people use Windows 10 32-bit right now that Steam doesn't even have a record for it in their hardware survey. 32-bit makes up less than 0.1% of hardware right now. People have already moved on from it. 5 hours ago, Murdoch said: I won't be surprised if they are forced to extend support past 2025. Plenty of still viable gear can't run 11, and MS is living in la-la land if they think people will just go out and buy new computers en masse. Agreed with both statements. The funny thing here is that, especially in the open source field, lots of software still ships 32-bit binaries in addition to 64-bit ones, despite all available metrics being unambiguously clear that there is no relevant market for them. A particularly interesting example is that Eternity Engine 4.02 not only had a regular 32-bit build but also a 'legacy' 32-bit build. I know it's over two years old by now, but considering that the hardware requiring this is probably 20 years old by now, I'm wondering what prompted the decision to do such a thing. By all accounts there should be no relevant hardware around anymore that cannot run the regular one. 5 hours ago, LexiMax said: Windows doesn't ship SDL. If you want to use SDL, you vendor it. Windows is set up for this to be the easy and expected way you handle your business on the platform. This approach does have consequences - for example, I have 87 distinct copies of SDL2.dll on my system. But do I care? Not really - my programs still work. In the old days Windows also encouraged installing DLLs in the 'system' folder. The result was similar to what we get under Linux now: dependency hell, where applications overwrote each other's libraries. It was no surprise that vendors stopped doing that and instead put the DLLs side by side with their executables, because Windows has been set up from the start to search there first. 
5 hours ago, LexiMax said: Perhaps Linux distros shouldn't ship the entire open source ecosystem in its package manager if they can't fight their urge to tie dependencies together in a neat little bow. If you ask me, the mess Linux has now is a direct result of an unwillingness to part with old Unix conventions from times when applications were simple, mostly resource-less terminal programs. In such an environment, putting all executable binaries into one folder actually made sense - but with increasing application complexity it became unsustainable. That problem never really got solved, and it is made worse by the default search path for shared objects not including the binary's own directory. All the infrastructure here seems to have only one goal, keeping the arcane Unix conventions working, but by now it is stretching the system to its limits. The big problem with change here is simple: as long as the system works, it is very convenient. Loading a new library is dead simple, so people rarely think about it being an issue - at least on Linux. Not so much on Windows and macOS, where you need to ship all your dependencies. The binary distribution of a Windows version of software developed on Linux is immediately recognizable by the large number of accompanying DLLs. Since pulling in a library on Linux is so simple during development, there's hardly any thought about bloat. Some stuff really has insane dependencies, because nobody ever thinks about it. Yes, the code is portable, but the price is high - even on Linux one must not forget that the more dependencies you have, the more breaking points there are if something changes. 4 hours ago, Blzut3 said: Of course you do have the option to static link/vendor it on Linux too if you must have the newer feature, and that's an option you don't have for the analogous part of Windows. Somehow having this flexibility on the developer's part is a negative for the platform? 
SDL is not a system library, even if it can be treated like one. It's just a system abstraction wrapper, which can play by entirely different rules. 4 hours ago, Blzut3 said: To flip it around, it would be like me writing software that only targets Windows 11 and then complaining about stuff missing in Windows 10, when on Linux I can just vendor Wine with my application and get the latest APIs. (I know there have been efforts to get Wine working on Windows, so it's actually less absurd a comparison than you might think.) While this issue does exist, let's not forget that it is very comparable to having to deal with Linux LTS releases, which are normally locked to specific library versions and rarely, if ever, update. So either way you have to contend with scenarios where you cannot access the latest and greatest features - and from my personal experience the downsides on Linux tend to be far more severe in the long run. 4 hours ago, Blzut3 said: Edit: Since you edited your post while I was replying. This I mostly agree with. I don't think distros shipping a whole ecosystem is inherently bad, but one of the small policy changes that I think could go a long way is if distros would define a small subset of libraries for which ABIs are guaranteed to be supported for, say, 10 years past when a new ABI gets defined (even if this means multiple versions of the library need to be maintained side by side). This subset would essentially be the SDK for the platform, and everything outside of this subset would be a separate repo providing desktop apps and convenience libraries for local development. Fairly analogous to macOS and how people use brew. That whole thing would only work if ALL distros could agree on a fixed set of libraries that are guaranteed to be present with a well-defined set of supported features. 
But some libraries go out of their way to allow configurability that helps people shoot themselves in the foot. FreeType is the most blatant example I know of. It has so many compile-time options that I could never rely on a system-provided variant. But knowing Linux, such an agreement will never happen.
yum13241 Posted May 6, 2023 Quote While this issue does exist, let's not forget that it is very comparable to having to deal with Linux LTS releases, which are normally locked to specific library versions and rarely, if ever, update. That's an LTS problem. LTS was never a good idea. Even Windows is less dumb. Quote Dependency hell where applications overwrote each other's libraries. Funnily enough, I don't have library issues yet. I wrote this post on Linux, BTW. If you rely on a modified version of a library, make it an official fork and change the library name. That way, dependency hell doesn't happen as much. If you rely on a specific version, add the version number to the library name. That way, people/package managers won't blindly update the library under your nose. SDL2.dll (SDL2.so) means "I use the latest version of SDL2; I can deal with newer versions". SDL2-2.0.6.dll (SDL2-2.0.6.so) means "I use SDL2 2.0.6; I need this to function". SDL2-MyFork-1.0.dll (SDL2-MyFork-1.0.so) means "I use a fork of SDL; I need this to function".
Graf Zahl Posted May 6, 2023 30 minutes ago, yum13241 said: If you rely on a modified version of a library, make it an official fork and change the library name. That way, dependency hell doesn't happen as much. If you rely on a specific version, add the version number to the library name. That way, people/package managers won't blindly update the library under your nose. SDL2.dll (SDL2.so) means "I use the latest version of SDL2; I can deal with newer versions". SDL2-2.0.6.dll (SDL2-2.0.6.so) means "I use SDL2 2.0.6; I need this to function". SDL2-MyFork-1.0.dll (SDL2-MyFork-1.0.so) means "I use a fork of SDL; I need this to function". Asking to fork the library to make the application work is a nonsensical approach. Nobody will ever do that. What vendors want is a universally accepted way to bundle their software without having to enter the realm of potential dependency conflicts. Windows has that; macOS has that. On Linux there are options, but they are met with far too much resistance by those deeply entrenched in the current system. Well, to be honest, while the attempted solutions point in the right direction, they still suffer from having to work within the confines of "how Unix does things", which makes them slightly suboptimal. We are fully in the realm of unsustainability here. This idealistic approach can never work for most commercial software. As a commercial vendor you need to give guarantees. That guarantee cannot be given if you do not provide your own copies of the libraries, since you want to link against the specific version you tested, not some clone of it built by other people who may make mistakes or compile with unusual options. Or take a security-critical system. On that you NEED the guarantee that nothing changes the system configuration. Hence LTS. Too bad that it bleeds into user space in an utterly disruptive way and that the mere existence of these systems is a major roadblock for developers. 
Just look at Steam: aside from SteamOS, the most popular distro is Ubuntu LTS! So people can say a hundred times that LTS is a bad idea - it's still what we developers have to consider with every single decision we make. And you see the result: when it comes to mainstream commercial software, Linux is a barren wasteland. The effort needed to make things work is out of all proportion to the potential benefit.
ChopBlock223 Posted May 6, 2023 I resent having to fix up Windows 10 so damn much out of the box. All the new bloatware, the new interface inexplicably modeled after phones and iPads, and all the shit you used to be able to tweak or disable with the Control Panel but which can now only be altered by futzing with the registry. I dread to imagine what the experience is like for people who aren't even as tech-savvy as me (as in, below my meager skills and knowledge). Windows 7 may be very dated now, but it never gave me even a fraction of this friction; it was slick and easygoing. 3 hours ago, Graf Zahl said: When it comes to mainstream commercial software, Linux is a barren wasteland. At least it's open source though!
Blzut3 Posted May 6, 2023 12 hours ago, Graf Zahl said: That whole thing would only work if ALL distros could agree on a fixed set of libraries that are guaranteed to be present with a well-defined set of supported features There's enough there for which there's only one answer (or the alternatives are drop-in compatible) that the common subset would naturally fall out. The point of having a long-term maintenance guarantee is that the kitchen sink can't just be thrown in, so it would be a very restricted set of libraries. Again, not unlike macOS. There is also the fact that you'd only need Ubuntu or maybe RHEL to do this, as all the other distros would provide a compatible runtime. Despite all the barking, users would want to run software developed against it. It's kind of similar to what Valve is doing with the Steam Runtime. 12 hours ago, Graf Zahl said: FreeType is the most blatant example I know. It has so many compile time options that I couldn't ever rely on a system provided variant. That's a perfectly valid reason not to rely on the system copy. I'm not familiar enough with FreeType to comment on why it has a ton of optional features, but it definitely sounds like the type of library that's of no consequence to vendor. SDL also has a ton of optional components, but in that case it's because it needs low-level libraries to be available to build its various alternative abstractions. When new subsystems come out, old applications can gain native support for them by simply swapping the SDL library (which, before you point it out, won't happen on an LTS, but it does happen when you upgrade the distro, which is also when new subsystems tend to be introduced anyway). Granted, compatibility layers are available, so it's not strictly necessary, but it's usually a better experience not going through them. 11 hours ago, yum13241 said: That's an LTS problem. LTS was never a good idea. Even Windows is less dumb. The LTS model in Linux is essentially the same as how Windows works. 
(Although now that Windows 10 and 11 are the only versions supported for normal users it's a little harder to see, but LTSC is a thing. Still, the normal release channel is not unlike Ubuntu's 6-month releases.) Want a concrete example? Look at SChannel, Windows's SSL/TLS library. Want TLS 1.3 support? Upgrade to the latest Windows. This is particularly bad since the alternative is to ship an alternative SSL implementation with the app, which 1) requires paperwork to get an export license if you're doing everything by the book, and 2) creates a lot of maintenance burden, since these libraries are frequent targets of attacks (particularly in the enterprise space, where your customers run vulnerability scanners and freak out even if a CVE isn't relevant to your particular app).
LexiMax Posted May 6, 2023 18 hours ago, Blzut3 said: We see similar setups being replicated to varying degrees for macOS and Windows (brew, chocolatey, msys2, winget, even vcpkg), I actually use chocolatey and vcpkg. Chocolatey wraps traditional installers and zip files - it's purely a convenience for keeping software up to date, and it doesn't try to tie its entire universe together with shared library dependencies to nearly the degree that Linux distros attempt. vcpkg, on the other hand, is actually amazing, because it takes a lot of the headache out of vendoring library dependencies. I control the state of my vcpkg submodule as well as my vcpkg.json, and because of that I as the project author have ultimate control over the libraries.
Graf Zahl Posted May 7, 2023 6 hours ago, LexiMax said: vcpkg on the other hand is actually amazing, because it takes a lot of the headache out of vendoring library dependencies. I control the state of my vcpkg submodule, as well as my vcpkg.json, and because of that I as the project author have ultimate control over the libraries. I just wish they'd finally provide better integration between CMake and vcpkg. If those two could naturally work together on their own, it would make using third-party libraries orders of magnitude more convenient for cross-platform projects.
LexiMax Posted May 7, 2023 11 hours ago, Graf Zahl said: I just wish they'd finally provide better integration between CMake and vcpkg. If those two could naturally work together on their own, it would make using third-party libraries orders of magnitude more convenient for cross-platform projects. What do you mean, better? I tried vcpkg in manifest mode and found it very convenient to use via the vcpkg toolchain file in my CMakeLists.
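For readers who haven't tried it, the wiring being discussed here is small. A sketch of manifest mode with vcpkg checked out as a submodule - the project name and the sdl2 dependency are placeholders, and this assumes a bootstrapped vcpkg checkout in ./vcpkg:

```shell
# vcpkg.json at the project root declares dependencies declaratively.
cat > vcpkg.json <<'EOF'
{
  "name": "myproject",
  "version": "0.1.0",
  "dependencies": [ "sdl2" ]
}
EOF
# Pointing CMake at the vcpkg toolchain file is the only integration step;
# the configure run then builds/installs everything listed in the manifest,
# and find_package() in CMakeLists.txt resolves against the vcpkg tree.
cmake -B build -S . \
  -DCMAKE_TOOLCHAIN_FILE=vcpkg/scripts/buildsystems/vcpkg.cmake
cmake --build build
```

Graf's wish for "native integration" would amount to CMake doing that toolchain-file step automatically instead of the project having to pass it on every fresh configure.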
Graf Zahl Posted May 7, 2023 I mean native integration. The toolchain file is fine, but it would be better if it were fully automated.
Azuris Posted May 7, 2023 On 5/6/2023 at 12:30 PM, ChopBlock223 said: I resent having to fix up Windows 10 so damn much out of the box. All the new bloatware, the new interface inexplicably modeled after phones and iPads, and all the shit you used to be able to tweak or disable with the control panel, but which now can only be altered by futzing with the registry. That's what brought me to test out Linux Mint on a "media PC" I bought for the living room. If I have to start searching all kinds of webpages to tame the OS, be it Win 10 or 11, then why shouldn't I go for Linux and tame that instead? The only thing I had to search around for was the Bluetooth dongle I bought; the rest mostly worked out of the box, and nothing was harder than installing something like DirectX separately back in the day. I did really love and prefer Windows and its logic, but they have managed to annoy me too much, and Lutris, Proton and PlayOnLinux make it easy to change. Microsoft is marching more and more towards the cloud and their store, which will hinder you from accessing the app folder. They're seemingly working on their own ARM CPU, and I bet they'd secretly love to kill off Win32.
Graf Zahl Posted May 7, 2023 52 minutes ago, Azuris said: They're seemingly working on their own ARM CPU, and I bet they'd secretly love to kill off Win32. I don't think you have any idea who Microsoft's main customers are. No, not you, not me and not anybody else on this forum. Their biggest customers are large companies who pay premium prices for their services. And since most of these run Windows, their in-house software is based on Win32. Now take one guess what will happen if they take that away. They'd eliminate their entire customer base all at once, because why pay Microsoft for services if they take away the foundation for your entire infrastructure?
Professor Hastig Posted May 8, 2023 (edited) 11 hours ago, Azuris said: Microsoft is marching more and more towards the cloud and their store, which will hinder you from accessing the app folder. The cloud is definitely gaining more and more importance. They do that because their customers want it. As I am working for such a customer, I can tell you that there's a lot of money on the table here. Yes, they also try to limit access to the app folder. Why? For security reasons, of course! Users messing around with it are one of the biggest causes of system instability. Especially in an enterprise environment this can be a major problem if clueless employees try to outsmart the system. Let's not forget that it is even more locked down on macOS, where the entire application bundle is signed, so its containing folder is entirely off limits. Quote They're seemingly working on their own ARM CPU Why not? ARM with its low energy footprint will only gain importance as time goes on. Apple has already proven that it is viable. Quote and I bet they'd secretly love to kill off Win32 Seconding Graf Zahl's response: they cannot afford to do that, plain and simple. Win32 is critical infrastructure for the worldwide economy. Not only would they incur the wrath of their entire customer base - from big, often globally oriented mega-corporations to any kind of smaller outfit - they'd also stir up politicians the world over, whose national economies would be impacted. Edited May 8, 2023 by Professor Hastig
Edward850 Posted May 8, 2023 (edited) 11 hours ago, Azuris said: They're seemingly working on their own ARM CPU, and I bet they'd secretly love to kill off Win32. Microsoft already has a version of Windows that doesn't have Win32; it's used on the Xbox. Seeing as they've had it since Windows 8, and Win32 compatibility has actually been reinforced in the Windows Store (it's notably how games/the GDK work now, scrapping the old XDK/UWP standard, and you can now use Win32 for apps too), it's all but certain that Win32 support is going to stick around for a long time to come; otherwise they'd already have gotten rid of it when they had the opportunity. Edited May 8, 2023 by Edward850
Kinsie Posted May 8, 2023 11 hours ago, Azuris said: Microsoft is marching more and more towards the cloud and their store, which will hinder you from accessing the app folder. 4 minutes ago, Professor Hastig said: Yes, they also try to limit access to the app folder. Why? For security reasons, of course! Users messing around with it are one of the biggest causes of system instability. Especially in an enterprise environment this can be a major problem if clueless employees try to outsmart the system. Let's not forget that it is even more locked down on macOS, where the entire application bundle is signed, so its containing folder is entirely off limits. They already tried this with the Windows Store and UWP, and it was so aggressively unsuccessful that they had to sheepishly backport certain enhancements to Win32 and let Win32 software use the store. More recently, they've even walked back restricting the app folder, with the "Advanced Installation" option available for certain games.
yum13241 Posted May 8, 2023 > The LTS model in Linux is essentially the same as how Windows works. (Although now that Windows 10 and 11 are the only versions supported for normal users it's a little harder to see now, but LTSC is a thing. Still the normal release channel is not unlike Ubuntu's 6 month releases.) No, Windows and its apps update independently. On an LTS, all your apps are bogged down by the kernel not updating. > Asking to fork the library to make the application work is a nonsensical approach. By that logic, if I made a fork of GZDoom that is semi-backwards-compatible, I should call it GZDoom 5, causing zillions of people to launch my custom mod for it with GZDoom 4 thinking it should work, or vice versa with a different mod.
Professor Hastig Posted May 8, 2023 55 minutes ago, yum13241 said: > Asking to fork the library to make the application work is a nonsensical approach. By that logic, if I made a fork of GZDoom that is semi-backwards compatible I should call it GZDoom 5, causing zillions to launch my custom mod for it with GZDoom 4, thinking it should work, or vice versa with a different mod. GZDoom is not a library but an application. Don't compare apples with oranges.
yum13241 Posted May 8, 2023 (edited) Then replace GZDoom with ZMusic and it'll make sense. By that logic, if I made a fork of ZMusic that is semi-backwards-compatible, I should call it ZMusic 2, causing zillions of people to build my custom fork against vanilla GZDoom (it was meant for a fork of GZDoom) thinking it should work, or vice versa with a different port. You're either: 1. Completely backwards-compatible. 2. Completely backwards-incompatible (even if only by artificially bombing out with an error). 3. Going to document what is and isn't backwards-compatible. Anything else does not work out. GZDoom is also a game engine. I can't expect a commercial mod made for someone's fork of GZDoom to be compatible with stock GZDoom. (It's nice if it is, but they may have added additional features to their fork.) Even if it loads, it might work improperly, due to the fork's modified/changed features. In that case, it should bomb out with an error, since it's not exactly a library. Linux already does that when something soname-bumps (it bombs out with an error). Soname bumps happen when a new version breaks backwards compatibility. Symlinking will not work nicely; the app should update, and/or the distro should provide the older version. Edited May 8, 2023 by yum13241
Graf Zahl Posted May 8, 2023 Well, that's a lot of talk missing the forest for the trees. Yes, you can fork the library. But then you've got a new package. I already experienced with ZMusic how that goes with Linux users - you can guess it: if it's not in the distro's repo, it's suspicious by default; some people cannot wrap their heads around getting such software installed, etc., etc. Whereas a proper distribution format would eliminate all these issues at the expense of having a few more files on the hard drive. No need to document anything, no need to consider version incompatibilities, and no need for the customer to think about these kinds of intricacies.
Blzut3 Posted May 9, 2023 17 hours ago, yum13241 said: No, Windows and its apps update independently. In LTS all your apps are bogged down by the kernel not updating. By the kernel not updating? While I'm sure there are exceptions, the kernel being old is not usually what people complain about (unless we're talking new hardware). Windows also doesn't update its kernel (and equivalent things) within a release cycle. They even distribute stuff like curl and then don't update it in a timely manner. As for apps updating independently: for the applications I care about keeping updated (i.e. the stuff I use daily) I subscribe to third-party repos that update them (or download standalone deb packages if it's not that important). Yes, the powers that be don't recommend this, for very well-thought-out reasons, but that just circles all the way back to the irony of the model having a lot of similarities to the walled gardens. One might wonder why I don't just use a rolling distro: I like being able to set updates to run automatically and not have to worry about them breaking anything. It's been years and I have yet to be burned by a bad update. On 5/6/2023 at 2:16 AM, yum13241 said: If you rely on a modified version of your library, make it an official fork and change the library name. That way, dependency hell doesn't happen as much. If you rely on a specific version, add the version number to the library name. That way, people/package managers won't blindly update the library under your nose. Changing the name isn't enough to satisfy the strictest distributions (looking at Debian). They have tools that look for similar code to catch attempts to hide embedded libraries. This behavior is the crux of the problem Graf has, since if the application does have a specific reason not to use the mainline (or distro-configured) version, you have a no-win situation where the Debian maintainer will be forced by policy to cripple your app if the changes can't go upstream. 
If my memory serves, this caused problems when libav forked from ffmpeg, for example. I absolutely do see both sides of the argument. The Debian maintainers aren't wrong that it would be better for everyone if downstream developers worked with upstream to come to a solution that works for everyone. Embedded forks also place a maintenance burden on the distro, since security vulnerabilities need to be addressed for every fork. However, sometimes you just don't have the luxury of the time or willpower to make a proper solution. Then there's the case where the customizations are specific to the particular program, in which case separating them out into an individual package isn't helpful in any way. To me, delivering the proper user experience is more important than purity. I don't particularly care about getting my software into the distro repositories. When I build my packages, I do try to follow the guidelines as closely as makes sense, though. (If anyone picks apart the debs I currently maintain, do note that many were built following Ubuntu's old third-party software distribution guidelines, which have long since been retracted (probably because of snap). I just keep building them that way since there hasn't been a particular reason to move things around on users.)
Amarande Posted May 9, 2023

On 5/4/2023 at 2:01 AM, Blzut3 said: People will find a conspiracy in everything won't they? Having Adobe dictate what platforms can access content was clearly giving big tech less control than adopting open standards that made using Linux, BSD, what have you more viable! (Not to mention flash dying was also somewhat key to making ARM laptops/desktops viable.)

Sure, it was Apple who forced the move away from Flash by intentionally blocking it on iOS, but Flash's overuse was a huge problem for alternative platforms in its day. Not to mention a security issue in general, as was Java. If I recall correctly, Flash actually got a slight renaissance in its dying years, even despite Apple's action (which, I seem to remember, got a fair amount of negative reception), because browser makers got together to kick Java to the curb, to the point of killing off old-style plugins altogether to do it. Whether that was a good idea is debatable, especially since browser makers seem to be trying to kneecap addons in general, presumably because of Microsoft's and Google's reliance on advertising and the fact that "addons" these days mostly seems to mean "uBO"... LOL.

The only sad thing about it is that Java and Flash both had to die for the same reason: you just can't have a language capable of full-featured applications, also use it as a web applet language, and expect not to have the galloping security disaster both chronically experienced. (Surprise Pikachu face time here, really: you have a language with full access to the local system context that is also expected to run code off of web pages you don't control! Might as well have browsers run embedded .EXE files on web pages!)
That is indeed sad, though, because it suggests that client-side webapps are actually impossible; they HAVE to be server-side, and thus further empower the Blankety-as-a-Service mentality that pushes everything into The Cloud, takes away our autonomy, and puts Big Tech in control every time, all the time :(
Blzut3 Posted May 9, 2023

2 hours ago, Amarande said: That is indeed sad, though, because it suggests that client-side webapps are actually impossible; they HAVE to be server side, and thus further empower the Blankety-as-a-Service mentality that pushes everything into The Cloud and takes away our autonomy and puts Big Tech in control every time, all the time :(

For a while now, JavaScript has had the ability to do everything you could have wanted a Java or Flash program to do. This includes local file I/O and sockets (to some extent). The timing correlation is coincidence.
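A brief browser-side sketch of the capabilities mentioned above: local file I/O via the File System Access API and socket-style networking via WebSocket. This is a hedged illustration only; the function name and the `wss://example.com/echo` endpoint are made up, the File System Access API requires a user gesture, and its availability varies by browser.

```javascript
// Browser-only sketch; nothing here runs outside a browser context.
// The endpoint URL is a hypothetical placeholder.
async function openFileAndSend() {
  // Ask the user to pick a local file (must be triggered by e.g. a click).
  const [handle] = await window.showOpenFilePicker();
  const file = await handle.getFile();
  const text = await file.text();

  // Socket-style networking from the page via WebSocket.
  const ws = new WebSocket("wss://example.com/echo");
  ws.addEventListener("open", () => ws.send(String(text.length)));
}
```

Unlike the Java/Flash model, both capabilities are consent-gated and sandboxed, which is how the security disaster described above is meant to be avoided.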
yum13241 Posted May 9, 2023

> By the kernel not updating?

I meant that because the kernel updates less often, they double down and ship ancient userspace tools too.

> One might wonder why I don't just use a rolling distro: I like being able to just set updates to run automatically and not have to worry about them breaking anything. Been years and have yet to be burned by a bad update.

I've used a rolling distro for years and I haven't been burned by a bad update yet.

Random text-formatting rant (spoiler): Why does Doomworld REQUIRE you to use their toolbar to format text? At least BBCode let me keep my hands on my keyboard. Having to switch to the mouse is annoying.
Edward850 Posted May 9, 2023 (edited)

9 minutes ago, yum13241 said: Why does Doomworld REQUIRE you to use their toolbar to format text? At least BBCode let me keep my hands on my keyboard. Having to switch to the mouse is annoying.

That's just how the forum software works; it's built around more general devices and rich-text formatting, meaning formatting can be copied between programs. Likewise, if you remember your schooling, the formatting hotkeys work like they do in Microsoft Word, so your hands never have to leave the keyboard. You can also still type bbcode manually if you remember the syntax, and the software will automatically convert it to rich text when you submit.

Edited May 9, 2023 by Edward850
yum13241 Posted May 9, 2023 (edited)

Quote: You can also still use bbcode manually if you remember the syntax, and the software will automatically convert it to rich text when you submit.

That doesn't work.

Quote: Likewise, if you remember your schooling, formatting hotkeys work like they do in Microsoft Word so your hands never have to leave the keyboard.

Good to know.

Remember: Shit like this is why you don't buy games on a DRMed store. Your games get [c]Sprite TNT1 Frame A Rotation 0[/c]ed. GOG ftw.

HOW DO I FIX THE GODDAMN BROKEN LINK BY A ROGUE URL BBCODE?

Edited May 9, 2023 by yum13241
Gez Posted May 9, 2023

26 minutes ago, yum13241 said: HOW DO I FIX THE GODDAMN BROKEN LINK BY A ROGUE URL BBCODE?

The Tx button turns everything selected into plain text. Note that there's no subscript button, so I had to use the sub bbcode tag manually to replicate the look of this button.
Blzut3 Posted May 10, 2023

12 hours ago, yum13241 said: I've used a rolling distro for years and I haven't been burned by a bad update yet.

Not sure what distro you use, but I do know Arch posts notices every now and then that updates may require additional care. I'm not sure I'd trust doing unattended/automatic upgrades on Arch, but maybe other rolling distros do better? Although it would be a fairly minor issue, since the communities around rolling distros tend (somewhat ironically) to be more open to side-by-side library installations, I also would expect the proprietary software I use to randomly break on arbitrary release cycles. At least with Ubuntu I know exactly when that will happen (although apt will often keep the old package around anyway for various reasons) and can be prepared to reinstall any old packages I still need.

12 hours ago, yum13241 said: I meant that because the kernel updates less often, they double down and ship ancient userspace tools too.

I don't think there's any causation here. As I said, pretty much all software can be compiled to run on quite old kernels. RHEL has recently started dipping its toes into shipping more recent software with AppStream, and on the flip side Ubuntu has its HWE kernels/graphics stack. LTS distros just want to apply one blanket policy to the whole world because it's simpler for them that way.
dasho Posted May 10, 2023 (edited)

5 minutes ago, Blzut3 said: Not sure what distro you use, but I do know Arch posts notices every now and then that updates may require additional care. I'm not sure I'd trust doing unattended/automatic upgrades on Arch, but maybe other rolling distros do better?

For me, openSUSE Tumbleweed is the gold standard for a rolling-release daily driver (i.e., just update whenever updates come along).

Edited May 10, 2023 by dasho