Graf Zahl derails another thread by making up an excuse to troll Linux users, episode #875,482,348,234


Graf Zahl

Recommended Posts

10 hours ago, AlexMax said:

It seems like only GNU/Linux distributions insist on having a single shared set of libraries as the happy path, and there's nothing either programmatic or ideological that would prevent us from moving towards a more sane software distribution system... except folks who insist that there's nothing wrong with the way things are, despite mountains of evidence to the contrary.

 

GNU/Linux is unique in a lot of ways, but those ways are also the reason it has such a big userbase and is also the only open-source OS to pose a considerable threat to proprietary OSes; no other OS even appears in market share graphs with Windows and Mac OS to begin with. So it definitely did something right, compared to, say, FreeBSD.

 

That said, I'm not that big a fan of copyleft, and I already stated my reasons before.

 

As for the library path: that's been the conventional place to put libraries since Unix, and it reduces library redundancy. The only redistribution pain, the library differences across distros, goes away if you simply release the source. It's not like your average Joe gamer is going to use Linux anyway, and we don't expect any such person to.

 

Also, once you release the source code, it's up to distros to get the binaries working. Don't bother trying to package your program prebuilt for Linux. Just don't. Please. It's a headache because it's not what you're supposed to do. Either distros do that, or the user does. And if a distro is incompetent, let it be. There is a shit ton of Linux distros out there and you can't speak for all of them with sweeping statements like you do all the time. If it's a pain to build something for Linux, then let the other distros bear that pain. Linux does not need any such prospects of "unification".

 

8 hours ago, AlexMax said:

Even when I attempt to appeal to people's selfish desire for the continued well being of their environment, I'm told that they would rather it be fucked over by Microsoft as long as it was an example for posterity.

 

That last part is incredibly troubling.  What kind of supposed Linux advocate would say such a thing?


It's not selfish. Quite the opposite – they want you to forfeit the responsibility of shipping the software, because that is a responsibility of a Linux distribution. Allowing third parties to ship prebuilt packages is very risky to Linux for a multitude of reasons:

  • It can allow the introduction of malware from untrusted or un-PGP-verified third parties, which Linux has no defense against because it has never needed such a defense (there's no such thing as an open-source virus, as there are eyes peering into the source code at all times);
  • It also makes it very easy to install closed-source software, which undermines free software, as people are demotivated to contribute to the source code of the software they have. People getting arsed to contribute is the only reason free software is such a large ecosystem. If people sit back and start giving fuck all, free software will stagnate, and in a rapidly changing and evolving world, stagnating means falling behind and bleeding out your userbase till you die.

 

This is why Linux is expressly against third parties having the liberty of software distribution; it is dangerous to users and dangerous to free software.

 

Are you starting to see now why maintaining a software ecosystem like GNU/Linux is an inherently political affair?

 

Distro maintainers are generally well-respected people who are entrusted to maintain repositories, verify packages for vulnerabilities, and submit audited updates every now and then. Even Arch Linux, a rolling-release distribution known for its very fast repository update cycle and therefore up-to-date packages (you'd like it!), has constant auditing and always pushes immediate repository updates when a security vulnerability or zero-day exploit is patched. This is necessary.

 

Also, they often apply patches to the software if that is necessary to make it compatible with the distribution. Again, it should be the distribution's responsibility to provide prebuilt software, because thanks to Linux's emphasis on freedom and flexibility, only the developers of the distro itself know best how to make software suitable for it. And, especially in a sea of so many distros, it would be a Sisyphean task to try to develop for all of them. You, as a software developer, must not bother. Just develop. :)

 

If a neutral third party were to read the entirety of this thread from start to finish, they'd probably start liking Linux... a lot. Then again, I don't know what poor insane lost soul would read this thread in its entirety. Oh wait, nevermind, we hit the lost soul limit already, didn't we.

 

7 hours ago, Graf Zahl said:

That'd be the kind of people for which everything is black and white. And since the package management is great, anything to undermine it is bad, therefore self-contained apps are an Evil to avoid.

 

Nothing is black and white. Except boolean values, but the world - even software development, especially its political aspects - is not made of bools. :)

 

It's not that having third-party installs would undermine package management. It doesn't, and that's exactly why /opt exists: it's where self-contained apps are usually installed so as not to interfere with /usr.

 

It's both a security concern and a political concern. And by politics, I don't mean "inconsequential things outside the realm of software". Maintaining a software ecosystem is inherently political, especially if you're a Linux distribution. I've said this a great many times and it seems to go in ONE ear, and out the OTHER.

 

Why do you think Richard Stallman calls his very own creation, free software, a political movement? He's comparable to Karl Marx in this regard, which includes both the snobbery (of thinking you have everything figured out) and the intelligence. (And the controversy, though in his case it has nothing to do with free software.)

 

7 hours ago, Graf Zahl said:

There's also this strange belief that if there's a new version of some library, it's a great feature that all apps will benefit from it when it receives an update. But in reality it's just one of those things where the lofty ideals clash with the darker side of software development, but this is a concept that just gets persistently ignored because it's a gray area.

 

But it is! I mean, it's the only way to get patched against the most recent zero-day exploits; oftentimes, cyberattacks spring up the very same day the 0-day is publicly announced and patches are first provided.

 

(If I recall correctly, the cybersecurity institutions that announce those exploits usually provide patches that fix them in the same announcement, right?)

 

It's not a "strange belief". I wish you gave those people and their blogs genuine, impartial reads, and were more open-minded about them. Instead of selfishly writing them off as selfish. Huh.

 

-------

 

Yes, you may have realized there are some things I'm saying again and again, some also in bold. Those emphases are very important points I'd love to see addressed. Sadly, all I see is the rhetorical tactic of dodging. Narcissism can go to quite some lengths in order to maintain its shield.

Edited by Gustavo6046


I was going to do a bigger response, but I saw this:

 

1 hour ago, Gustavo6046 said:

Sadly, all I see is the rhetorical tactic of dodging.

 

That's enough.

 

This isn't debate club.  You are not owed a point-by-point rebuttal, and I'm not here to debate the merits of the GNU/Linux ecosystems, partly because I've held and believed many of your positions in the past and partly because I'm too busy trying to warn you that the cathedral you hold so dear is on fire.

 

The enemies of GNU/Linux are on your doorstep and you don't seem to realize or appreciate it.

Edited by AlexMax

1 hour ago, Gustavo6046 said:

 

But it is! I mean, it's the only way to get patched against the most recent zero-day exploits; oftentimes, cyberattacks spring up the very same day the 0-day is publicly announced and patches are first provided.

 

 

(If I recall correctly, the cybersecurity institutions that announce those exploits usually provide patches that fix them in the same announcement, right?)

 

It's not a "strange belief". I wish you gave those people and their blogs genuine, impartial reads, and were more open-minded about them. Instead of selfishly writing them off as selfish. Huh.

 

The fallacy here is to think that every single user of a library can deal with the update and that each update is rigorously tested for side effects. And even then there's no guarantee, because the consumer of the library may inadvertently depend on what just got 'fixed'. Not even Windows and macOS get every update right, and they are developed by professional teams as homogeneous products. And now consider the ramifications of this in an ecosystem consisting of hundreds of libraries developed by hundreds of independent teams. I could never properly test my app on something this volatile!

 

What this does in reality is create an untested package. Most software is not written in a logical fashion. It contains bugs and unintended side effects, and every swapped library can bring the house down.

 

And here's the real issue for a developer: If stuff breaks it won't be the library developers that get the brunt of complaints but the developer of the application, who most likely is the one person who cannot do anything about the problem!

 

GZDoom ran into this as well recently with Ubuntu shipping a broken version of Fluidsynth. We are in no position to fix the problem because the only way around it would be to include our own copy - which is - guess what - not a well supported scenario, made worse by a nightmare chain of dependencies.
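For context on "include our own copy": the conventional mechanism for bundling a private library on Linux is an $ORIGIN rpath, sketched below (file names and flags are illustrative, not GZDoom's actual build). The pain described above is that every transitive dependency of the bundled library has to be handled the same way.

```
# Hypothetical layout: ship the app with its own Fluidsynth next to it.
#   gzdoom
#   lib/libfluidsynth.so.1
#
# At link time, tell the dynamic loader to search the app's own lib/ first:
g++ -o gzdoom main.o -L lib -lfluidsynth -Wl,-rpath,'$ORIGIN/lib'
```

$ORIGIN expands at load time to the directory containing the executable, so the bundle stays relocatable wherever the user unpacks it.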

The Flatpak version some guy created works fine, btw. The reason should be obvious...

 

So your lofty ideal of always patched software ultimately backfires, resulting in unstable apps, frustrated developers and as a worst case scenario no app at all because its developers call it quits.

 

 

Edited by Graf Zahl

1 hour ago, Graf Zahl said:

So your lofty ideal of always patched software ultimately backfires, resulting in unstable apps, frustrated developers and as a worst case scenario no app at all because its developers call it quits.

 

The thing is, at this point I think that trying to appeal to empathy towards third party developers is a lost cause.

 

If you read Gustavo's most recent post towards the top he tips his hand in that he believes that forcing developers to release their source and forcing distros to package said source is necessary to preserve the free software ecosystem.  That really clued me in on where his priorities lie.  To make it more convenient for folks to package software for Linux means throwing open the gates to the closed source barbarians.  Therefore, the current system has to stay in place.

 

Of course, what he doesn't realize is that not only have large corporations been subverting Linux for as long as I can remember (NVIDIA graphics driver anyone?), but the nature of the threat has changed and their gatekeeping is no longer effective at preserving their free software utopia, with WSL in the picture.  Plus AppImage, Snap and Flatpak - although they aren't the happy path, they at least exist as a way to subvert the ideal.

Edited by AlexMax

15 hours ago, CBM said:

and some of them even have baffling design choices.. like android running everything with a type of JVM, adding unneeded complexity and slowdowns

 

15 hours ago, Graf Zahl said:

So even with its insane Java API [Android] was still the best choice.

 

 

I believe the rationale behind having a JVM or similar bytecode/VM technology on mobile devices was the appeal of "write once, run anywhere": in theory, apps should remain device- and CPU-architecture-independent forever. Of course, we all know this way of doing things quickly resulted in performance bottlenecks and also made porting existing code harder, so device-specific precompiled libraries showed up soon enough; otherwise we wouldn't have e.g. Doom source ports on Android, at least not before Mocha Doom ;-).

 

This approach worked OK while everyone was still using 32-bit ARM, but predictably  broke many older unsupported apps coded "in the old-fashioned way" down the line. Today it's considered normal (and expected) for "serious" performance apps to be coded in native code and carry device-specific optimizations, so nobody thinks much about it, nor expects everything to be coded in 100% pure, vanilla, portable Android Java that runs on anything from the first Android phone ever to the latest modern one.

 

Microsoft obviously pushed their own CLR and .NET/C# toolchain for similar reasons, while Apple was the oddball standout once again, in that they imposed the use of their own Cocoa API but with an Objective-C core language, aka not an interpreted/bytecode/JIT-compiled one. Then again they hated any kind of scripting in their apps and were (and are still..?) very anal about performance, plus they controlled the hardware much more closely than the competition.


It is pretty clear Gustavo has never run an open source project for Linux. The users of Linux do not go to their Linux distro if your package doesn't build on theirs. They complain directly to the project. Not only that, the distro packagers also complain directly to the project whenever it doesn't do things the way that distro wants.

 

Two simple examples of this: when the GZDoom project upgraded its C++ compiler requirements, users complained on the ZDoom forums that it wouldn't build. And when GZDoom statically linked the GLSL compiler, the distro packagers wanted GZDoom to adjust to their way of doing things.

6 hours ago, Gustavo6046 said:

But it is! I mean, it's the only way to get patched against the most recent zero-day exploits; oftentimes, cyberattacks spring in the very same day the 0-day is announced publicly and patches even provided to begin with.

This assumes that the only content of updates is security fixes, while retaining complete API and ABI compatibility.

 

Certainly we've never seen in the history of software that a library update would do things such as, for example, change the order of arguments in a function call, so that pitch and pan are swapped around, breaking the sound code.

4 hours ago, dpJudas said:

It is pretty clear Gustavo has never run an open source project for Linux. The users of Linux do not go to their Linux distro if your package doesn't build on theirs. They complain directly to the project. Not only that, the distro packagers also complain directly to the project whenever it doesn't do things the way that distro wants.

 

I'm guilty of this myself.  I've actually gone to complain about a GNOME Wayland screenshot crash directly to the GNOME help channel, only to realize that the fix had been in GNOME upstream for quite a while and Ubuntu simply wasn't up to date and hadn't bothered to backport the fix.

 

4 hours ago, dpJudas said:

And when GZDoom statically linked the GLSL compiler the distro packagers wanted GZDoom to adjust to their way of doing things. 

 

I've been on the receiving end of this as well.  Hopefully one day soon I can have an AppImage to point to in order to bypass them as gatekeepers.


That’s not because of Linux, it’s because Linux users are more used to reporting bugs (including wishlist bugs) than Windows users are.

I wish Windows users would report more bugs, not to bury Microsoft under a mass of whining, but so that it would listen to its users instead of forcing them to do what Microsoft wants. Bah, I’m sure Microsoft gets reports on what users do (but not on what they want).

13 hours ago, AlexMax said:

Of course, what he doesn't realize is that not only have large corporations been subverting Linux for as long as I can remember (NVIDIA graphics driver anyone?), but the nature of the threat has changed and their gatekeeping is no longer effective at preserving their free software utopia, with WSL in the picture.  Plus AppImage, Snap and Flatpak - although they aren't the happy path, they at least exist as a way to subvert the ideal.

 

I am aware of the corporate perversion, and yes, it has existed even on the board of directors of... uh, I think it was the FSF.

 

And it's not like I personally like free software anyway; I'm talking about Linux, not myself. I'm just saying that is why Linux does things the way it does. It acknowledges that maintaining itself is a very political thing, and that's why you have thick-skinned people. I myself would rather use OpenBSD if I could just run all my stuff on it. Heck, I should just give it a try.

 

Now, what are your priorities? You do realize that the Doom source release, a massive boon to the community, wouldn't exist without Linux, right? RIGHT?

 

10 hours ago, dpJudas said:

The users of Linux does not go to their Linux distro if your package doesn't build on theirs. They go complain directly to the project. Not only that, the distro packagers, also complain directly to the project whenever it doesn't do things the way that distro wants it to.

 

I already said this a million times -- those people are not part of the equation, those are just users being cunts and not getting how stuff works in Linux. Gosh, sometimes it's almost like you guys don't read 99% of what I say.

 

10 hours ago, Gez said:

This assumes that the only content of updates is security fixes, while retaining complete API and ABI compatibility.

 

Certainly we've never seen in the history of software that a library update would do things such as, for example, change the order of arguments in a function call, so that pitch and pan are swapped around, breaking the sound code.

 

Usually major version updates are put in their own packages. Gtk2 and Gtk3 are their own separate packages.

 

Any library that respects SemVer should have no breaking API or ABI changes unless it changes its major version.
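Concretely, this "separate packages per major version" convention shows up in shared-library sonames: an ABI-breaking major release gets a new soname, so both majors can be installed side by side, while minor and patch updates keep the soname stable (paths and version numbers below are illustrative):

```
# Two co-installable majors, each shipped as its own package:
/usr/lib/libgtk-x11-2.0.so.0      # Gtk2 ABI
/usr/lib/libgtk-3.so.0            # Gtk3 ABI

# Within one major, the soname stays stable across minor/patch updates,
# so dependents keep working without a rebuild:
/usr/lib/libfluidsynth.so.1 -> libfluidsynth.so.1.x.y
```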

 

Therefore, once again I absolve Linux of any fault here, and I'm not being easy on Linux, trust me. This is the library maintainers' fault. wxWidgets is particularly notorious for sucking at inter-version compatibility.

Edited by Gustavo6046

1 hour ago, Gustavo6046 said:

Now, what are your priorities? You do realize that the Doom source release, a massive boon to the community, wouldn't exist without Linux, right? RIGHT?

I don't think anyone here is saying that they regret that Linux exists.

 

1 hour ago, Gustavo6046 said:

I already said this a million times -- those people are not part of the equation, those are just users being cunts and not getting how stuff works in Linux. Gosh, sometimes it's almost like you guys don't read 99% of what I say.

They're not that easy to dismiss when you are the one who has to interact with them.

17 hours ago, Graf Zahl said:

GZDoom ran into this as well recently with Ubuntu shipping a broken version of Fluidsynth. We are in no position to fix the problem because the only way around it would be to include our own copy - which is - guess what - not a well supported scenario, made worse by a nightmare chain of dependencies.

Uhh, unless you're thinking of a different issue that I missed, this is false: the Ubuntu 20.04 fluidsynth2 works with GZDoom today just fine.  _mental_ fixed the issue in ZMusic.

 

And there was a workaround I implemented in the DRD Team repo.  Apt supports package alternatives, so what I did was provide a package called drdteam-libfluidsynth1 in our repo which "provides" libfluidsynth1.  Since GZDoom suggests that libfluidsynth1 be installed, when 20.04+ users installed GZDoom it would pick up our copy of fluidsynth (since 20.04+ doesn't provide the old library, it being a universe package).  Ubuntu 19.10 and earlier users simply used the libfluidsynth1 provided by the distro.
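For the curious, the Apt mechanism described above is the Provides field in the package's control stanza. A hedged sketch of what such a shim package might declare, with field values illustrative rather than the actual DRD Team packaging:

```
Package: drdteam-libfluidsynth1
Provides: libfluidsynth1
Depends: ${shlibs:Depends}
Description: Fluidsynth 1.x compatibility shim for Ubuntu 20.04+
```

Any package that depends on or suggests libfluidsynth1 can then be satisfied by installing this one instead, which is how the repo's GZDoom picks it up on releases that dropped the original library.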

 

If we ignore that fluidsynth is a universe package (community support), this is also something that would have been covered by my desire to see distros support older ABIs longer since the fluidsynth1 to 2 upgrade wouldn't have been forced.  But honestly I'm more surprised that GZDoom was changed to load fluidsynth2 when it was crashing with it.  Given what the issue was I'm not sure what other distros would have done differently to make it work, but presumably the submitter tested it?  I don't know.

7 hours ago, Gustavo6046 said:

Now, what are your priorities?


My priorities are to get my software into the hands of users.  To make this happen, I want control over the distribution channel.

 

On Windows and Mac, this is trivial: I just upload the packages that I assembled and I'm done.  Apparently on Linux all I can do is message the maintainers... plural, I guess... and hope they notice and update my software.

 

I think this is silly.  Apparently DRDTeam thought the same thing, and they've been running their own third-party APT repository so they can control their own distribution channel for Doom ports.  It's the smart play, and I don't think it's an accident that such repositories are so popular in the Linux world (EPEL, Remi, Chrome and LLVM), but of course it's an imperfect solution because of the distro divide.

 

7 hours ago, Gustavo6046 said:

This is the library maintainers' fault. wxWidgets is particularly notorious for sucking at inter-version compatibility.

 

Then it is malpractice for distros to package it in its current state.  Either yank it or split it per version.

Edited by AlexMax

6 hours ago, Blzut3 said:

Uhh, unless you're thinking of a different issue that I missed, this is false: the Ubuntu 20.04 fluidsynth2 works with GZDoom today just fine.  _mental_ fixed the issue in ZMusic.

 

And there was a workaround I implemented in the DRD Team repo.  Apt supports package alternatives, so what I did was provide a package called drdteam-libfluidsynth1 in our repo which "provides" libfluidsynth1.  Since GZDoom suggests that libfluidsynth1 be installed, when 20.04+ users installed GZDoom it would pick up our copy of fluidsynth (since 20.04+ doesn't provide the old library, it being a universe package).  Ubuntu 19.10 and earlier users simply used the libfluidsynth1 provided by the distro.

 

If we ignore that fluidsynth is a universe package (community support), this is also something that would have been covered by my desire to see distros support older ABIs longer since the fluidsynth1 to 2 upgrade wouldn't have been forced.  But honestly I'm more surprised that GZDoom was changed to load fluidsynth2 when it was crashing with it.  Given what the issue was I'm not sure what other distros would have done differently to make it work, but presumably the submitter tested it?  I don't know. 

 

The mere fact that the problem existed in the first place is enough to highlight the issue.

Yes, GZDoom was written against v1.0.

Yes, someone added support for 2.x because some Linux distros dropped 1.0.

Then some detail changed in 2.x that broke the music player. This version seemingly ended up in a package repo and chaos ensued.

Whether it got fixed in GZDoom or got updated in the package repo is not relevant; the only reason this caused problems is that, instead of shipping a complete set of binaries that were proven to work, we depended on the package management and ended up with a non-functional combination.

 

 

3 hours ago, AlexMax said:

I think this is silly.  Apparently DRDTeam thought the same thing, and they've been running their own third-party APT repository so they can control their own distribution channel for Doom ports.  It's the smart play, and I don't think it's an accident that such repositories are so popular in the Linux world (EPEL, Remi, Chrome and LLVM), but of course it's an imperfect solution because of the distro divide.

 

 

There's not only the distro divide but also those Linux users with such an unbroken belief in "the system" that they refuse to use such repos. So while it is definitely an improvement, it still doesn't fully shield us from the problems.


I’m using the DRD team Debian/Ubuntu repository… and sometimes I wish reportbug could report bugs against this repository's packages too.

22 hours ago, Gez said:

I don't think anyone here is saying that they regret that Linux exists.

 

I'm not saying that; I'm saying that it is thanks to Linux being so adamantly against closed-source software that we have Doom source ports other than Doom95 today. Gosh, imagine if all we had was Doom95; Doom's popularity wouldn't even be a minuscule iota of what it is today.

 

13 hours ago, Graf Zahl said:

Then some detail changed in 2.x that broke the music player. This version seemingly ended up in a package repo and chaos ensued.

Whether it got fixed in GZDoom or got updated in the package repo is not relevant - the only reason this was causing problems is that instead of shipping a complete set of binaries that were proven to work we depended on the package management and ended up with a non-functional combination.


Ubuntu is just highly incompetent and Canonical is shady AF. Canonical is half the reason corporate interests are buried deep within the world of GNU/Linux. They have much bigger problems, and this Fluidsynth mispackaging is just a symptom of that.

 

No, the fact that a lot of people use Ubuntu is not a direct measure of "quality". That's very naïve. It's the result of a lot of publicity, of how a lot of people get to hear about Ubuntu and often associate "Linux ⇒ Ubuntu" in their heads, because that's all they hear about. It's marketing, it's capitalism.

 

And, on a side note, it's funny that capitalism makes you (as in a general kind of "you", not you specifically) think that the most popular things are those that work best and have the highest quality. I mean, look at Windows! Or McDonald's!

 

22 hours ago, Gez said:

They're not that easy to dismiss when you are the one who has to interact with them.

 

You mean you can't just... dismiss them?

 

You're not a babysitter! If that's what you thought your job consisted of, I feel sorry for you, it is clear it has left you a bit grumpy over the years :P

 

Sit down and have a coffee. Would you rather cappuccino or latte?

 

On 9/30/2021 at 1:29 AM, AlexMax said:

This isn't debate club.  You are not owed a point-by-point rebuttal, and I'm not here to debate the merits of the GNU/Linux ecosystems, partly because I've held and believed many of your positions in the past and partly because I'm too busy trying to warn you that the cathedral you hold so dear is on fire.

The enemies of GNU/Linux are on your doorstep and you don't seem to realize or appreciate it.

 

I'm not here to debate, though. We're just failing to reach a consensus about GNU/Linux. An argument is just a way to reach a consensus about something; arguing is good, but of course arguing for its own sake is bad and defeats its own purpose. I'm not here for that.

 

I just have a hard time seeing Linux going down any time soon; there are always more people joining, and a lot of people I know online use it as well. It's a bit famous online, so more and more people might try it for the heck of it and end up really liking it. And the best thing a community-built ecosystem can have is more community.

 

So, how exactly will those corporate interests kill Linux? Embrace-Extend-Extinguish?

 

On 9/30/2021 at 2:24 AM, AlexMax said:

The thing is, at this point I think that trying to appeal to empathy towards third party developers is a lost cause.

 

I forgot to reply to this, but Linux is a lot more about its own community than the "third-party developers".

 

Most developers on Linux develop for Linux, but add compatibility with the other formats and standards that ordinary people use, for the sake of integration and accessibility; cross-platform is most often an afterthought. And if you develop for Linux, you already tailor your software for Linux.

 

At this point, "third-party software" is basically any software where the reverse happens: it isn't tailored for Linux, or Linux support is an afterthought. I think Wine programs fall under this category, but ironically I think it's superior to distribute prebuilt Windows programs and use Wine than to try to build Linux versions, even if system-wide integration might suffer, because with Wine the program is kind of contained, it's safer, and it's less likely to mess with library versions like you guys are concerned about. So I guess Linux can piggyback on Windows even though it doesn't need to. :)

 

On 9/30/2021 at 1:33 AM, Graf Zahl said:

And here's the real issue for a developer: If stuff breaks it won't be the library developers that get the brunt of complaints but the developer of the application, who most likely is the one person who cannot do anything about the problem!

 

[...]

 

So your lofty ideal of always patched software ultimately backfires, resulting in unstable apps, frustrated developers and as a worst case scenario no app at all because its developers call it quits.

 

I think the crux of the issue is the part where the application developers get the brunt of it. This is the real issue: flawed accountability.

 

When stuff goes wack, there should be some sort of system, or a compendium of troubleshooting facilities, or anything at all made specifically to figure out where precisely stuff went wack, or whose fault it is. The issue would be if this itself goes wack... so yeah, it's... complicated.

 

But this already kind of is a thing: usually, application maintainers will defer a ticket in the bug tracker to a downstream library if it turns out the issue lies in the latter, and this may happen recursively till the real issue is found. At least that's how it would ideally work. But it can be a bit stressful when every single issue in the whole ecosystem lands in the hands of application developers first. Maybe it is the best we can do, though, apart from utilities that help with the hand-off from project to project. How about a unified network of bug trackers? :o

 

16 hours ago, AlexMax said:

My priorities are to get my software into the hands of users.  To make this happen, I want control over the distribution channel.

 

On Windows and Mac, this is trivial: I just upload the packages that I assembled and I'm done.  Apparently on Linux all I can do is message the maintainers... plural, I guess... and hope they notice and update my software.

 

I think this is silly.  Apparently DRDTeam thought the same thing, and they've been running their own third-party APT repository so they can control their own distribution channel for Doom ports.  It's the smart play, and I don't think it's an accident that such repositories are so popular in the Linux world (EPEL, Remi, Chrome and LLVM), but of course it's an imperfect solution because of the distro divide.

 

APT repositories are only for Debian-based distributions. I personally don't like how Debian packaging works, and it's nothing to do with PPAs and such.

 

16 hours ago, AlexMax said:

Then it is malpractice for distros to package it in its current state.  Either yank it or split it per version.

 

"Split it per version" is how it is usually done, but only when breaking changes are reported, particularly in relation to libraries. A diligent distro will find the last version that works and set a rule somewhere that tells package managers not to upgrade this package and its dependents unless that specific older version is no longer provided – and it will only be taken down once the latest version fixes said breakage, if ever. Of course, dependents that adapt to newer versions of the library (poor app devs) would override this rule.
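On Debian-family distros, that kind of "don't upgrade past this" rule is typically expressed with apt pinning. A minimal sketch, using a hypothetical library package libfoo2 whose 2.1 series broke its dependents – drop a file like this under /etc/apt/preferences.d/:

```
# /etc/apt/preferences.d/hold-libfoo — package name and versions are hypothetical
Package: libfoo2
Pin: version 2.0.*
Pin-Priority: 1001
```

A priority above 1000 makes apt prefer (and even downgrade to) the pinned version; apt-cache policy libfoo2 shows which pin won.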

 

 

----

I just want to design my own package ecosystem now. It'd be way better than all the ones distros have today. I'm starting to see your frustration and I want to fix it. But I don't really want to let go of the status quo of (the relative lack of) third-party packaging. Then again, sometimes, conventions are just made to be broken.

Edited by Gustavo6046

On 9/30/2021 at 10:05 PM, AlexMax said:

I think this is silly.  Apparently DRDTeam thought the same thing, and they've been running their own third-party APT repository so they can control their own distribution channel for Doom ports.  It's the smart play, and I don't think it's an accident that such repositories are so popular in the Linux world (EPEL, Remi, Chrome and LLVM), but of course it's an imperfect solution because of the distro divide.

Indeed.  The problem with getting software into distros is that their one-size-fits-all policies tend to result in a sub-optimal user experience.  For example, someone wanted to get ECWolf into Debian, which required that my modified SDL_mixer be removed in favor of the upstream version.  Doesn't matter that SDL_mixer is a simpler piece of code than people think it is and that I added a few crucial features for mod authors: if I couldn't get my changes upstream, too bad – they'd rather distribute a version that can't play adlib sounds with sampled music.

 

I'm not trying to argue that Debian's policies are bad, since for the core operating system they're excellent policies for producing a more secure system.  But I'm pretty sure most people don't want their single-player game crippled because of a security policy.

 

To that end, my point of view is that the distros should refocus on establishing binary contracts so that third-party developers can distribute software themselves – basically adopt the kernel's policy of "we don't break user space."  I love stable distros like Ubuntu since I can just turn on auto updates and not have to worry about breakage; that's reserved for the semi-annual distro upgrade (and Windows and macOS sometimes break apps on their big upgrades as well, so this is no different.  Oh, by the way, the fluidsynth2 thing only happened on fresh installs of 20.04, since a distro upgrade wouldn't remove 18.04's or 19.10's fluidsynth1 while gzdoom was using it).  But I think most people are surprised when their non-system apps are out of date because of distro policy.  (Fortunately even the stable distros have figured out that this policy doesn't work with some apps, like web browsers.)

 

Running an apt repo is me doing my tiny part in making the change that I'd like to see.  That also makes it closer to a perfect solution for me, since I'm less concerned with supporting Linux as a whole than with supporting mainstream Linux by making it the path of least resistance.

9 hours ago, Gustavo6046 said:

Ubuntu is just highly incompetent and Canonical are shady AF. Canonical is half the reason corporate interests are buried deep within the world of GNU/Linux. They have much bigger problems, and this Fluidsynth malpackage is just a symptom of that. 

Fluidsynth is packaged by the community and not Canonical, but even that aside there's nothing wrong with the package.  Validating Graf's claim that earlier versions of fluidsynth2, which were never in an Ubuntu release, worked is not worth my time, but ultimately the issue at hand would have appeared on any distro.  Effectively, at a high level, the bug was GZDoom saying "I support Fluidsynth v2" and then being surprised when it was handed v2.  The technical detail of why that happened is actually fairly interesting, and probably a result of Linux defaulting to a global symbol table.  (Even though Linux does support Windows-like symbol tables, changing the default now would break pretty much all the software written for Linux, so I guess we just get to put up with it.  This is why I have to do that preload-libstdc++ hack to get the old UT binary running.)  This is where things could get interesting if one were to redesign the user space with a clean slate.

 

Linux using a global symbol table also makes it more susceptible to graphics driver issues, since different drivers pollute the global table in different ways (and containers can't save you here: if you put the driver in the container, you can't support new hardware; if you put the driver outside the container, you have potentially incompatible symbols).  There is a project called libcapsule that tries to work around this, but it's incredibly complex to get working since the assumption of a global table runs deep within X11/GLX.

22 hours ago, ducon said:

I’m using the DRD team Debian/Ubuntu repository… and sometimes I’d like that reportbug would report bugs of this repository.

Would definitely be cool if reportbug could send an email to the actual package maintainer (it probably could query that information, though I could see privacy concerns being raised about potentially sending crash reports to random parties), but alas it's just the Debian version of "send to Microsoft."


I just realized, that there are other things yet to be brought to the table here.

 

You can have control over distribution if you use AppImage. While it is a tad bigger than your usual prebuilt package, as it has to bundle its libraries, it's not as bloaty as Snapcraft or Flatpak, it's still very portable, and it doesn't involve any compilation on the user's side of things. I've heard it's easy to integrate with existing build systems, though you'd have to try it out yourself. And if you already know about that, there is no reason to complain about distribution on Linux – the problem is already fixed.
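For reference, an AppImage is essentially a self-mounting directory tree that appimagetool packs into a single executable file. A rough sketch of the layout (names like myapp are placeholders):

```
MyApp.AppDir/
├── AppRun              # entry point (script or binary)
├── myapp.desktop       # app metadata
├── myapp.png           # icon
└── usr/
    ├── bin/myapp       # your actual executable
    └── lib/            # the .so files you bundle alongside it
```

Anything you don't bundle under usr/lib/ gets resolved from the host system, which is exactly the dependency-corralling work discussed later in the thread.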

 

On the grander scope of Linux in general, I think Valve will make Linux viable for average people, what with Steam already having a standard set of libraries that games can link against. I guess that mostly doesn't cover existing Doom source ports, since those rely a bit more on system paths and, in the case of Linux, on reading configuration and other files from the conventional XDG directories. And with other releases, they'll…

 

Also, the corporate influence in Linux... is mostly confined to the Linux Foundation, and will hardly affect GNU/Linux itself in the long run. Plus, even if the Linux Foundation were to someday completely disappear, or be overtaken by evil, unscrupulous Big Tech overlords, GNU/Linux isn't going to suffer even a tiny bit, or go anywhere. It's not built by any authority or central figure; it's built by people.

 

Fedora Kinoite and SteamOS 3 are good distributions for absolute "noobs", and Pop!_OS and Garuda (and maybe Debian) are a decent step up for people who are slightly more acquainted with Linux. Right above that is Arch Linux (a few steps easier now that it has an actual installer and such), which will also motivate learning more about Linux. By this time you may be able to figure out where you want to go next, if anywhere. Do keep in mind that, while "steps" implies going up, that isn't required. All of those let you use your computer well enough for common tasks and for gaming, and if you're happy and comfortable, that's what matters!

 

And even if distro-hopping does happen, learning will as well. Linux really only looks daunting; it's not as bad as it may seem, and it will take away your command-line-o-phobia pretty quickly once you follow one of the tutorials that involve the command line – I mean, command lines are easy, and for many things just as easy as the familiar DOS commands! :)

 

Just a quick word of advice for those planning to install Linux: don't forget to put your /home on a separate partition, maybe a btrfs one, so you can always keep it around each time you distro-hop! :D
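A sketch of what the corresponding /etc/fstab entry might look like once the partition exists (the UUID is a placeholder for your actual partition's):

```
# /etc/fstab — keep /home on its own btrfs partition (UUID is a placeholder)
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  btrfs  defaults  0  0
```

When you reinstall, you point the new install at the same partition without formatting it, and your files and per-user settings carry over.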

 

Edited by Gustavo6046

2 hours ago, Blzut3 said:

Would definitely be cool if reportbug could send an email to the actual package maintainer (it probably could query that information, however I could see privacy concerns being raised about potentially sending crash reports to random parties), but alas it's just the Debian version of "send to Microsoft."

 

There is an option in reportbug: upstream, if the bug seems to come from the program and not from the package.

On 9/29/2021 at 11:23 AM, ducon said:

Why does China develop its own Linux distribution? I hate this dictatorship, but if they do this, it's because they do not want to depend on Microsoft, which is an American company. I won't use their distribution for the same reason that I won't use Windows on my machine.

That, plus an OS you can modify is an OS you can use to spy on your citizens, with deep-baked protections to try to thwart people getting through the Great Firewall.

12 hours ago, Gustavo6046 said:

And, if you already know about that, there is no reason to complain about distribution in Linux – the problem is already fixed. 

 

I know about it, and I've got an AppImage in progress for Odamex.

 

However, for quite some time, I didn't even know that AppImage existed.  I thought that every non-trivial form of distribution involved either deb/rpm, distro-specific images like snap/flatpak, or creating your own poor man's image by using RPATH at which point you were on your own.
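For what it's worth, the poor man's RPATH approach mentioned above can be sketched in a couple of CMake lines – assuming you ship your bundled .so files in a lib/ directory next to the executable:

```cmake
# Make the installed binary search ./lib (relative to itself) for shared
# libraries before the system paths. $ORIGIN expands at load time to the
# directory containing the executable.
set(CMAKE_INSTALL_RPATH "$ORIGIN/lib")
set(CMAKE_BUILD_WITH_INSTALL_RPATH ON)
```

You can confirm the result with readelf -d on the built binary and look for the RUNPATH entry.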

 

AppImage is a great idea, but because everybody is running their own agenda the happy path rarely leads in that direction.  I actually did some google searches last night for how to distribute Linux software and almost none of the available resources covered AppImage, preferring distro-specific solutions.  How are outside developers supposed to figure this out when most resources don't know about it and the ones that might won't cover it for ideological reasons?

 

In context, it's not surprising to me that third-party repos are so popular - most people think Linux software distribution begins and ends with apt-get or yum or pacman or whatever, because distros have been blurring the lines for years.  That also explains why images seem to be slow on the uptake: why install a snap that has a 50:50 shot of being broken when the repo has it anyway?

Edited by AlexMax

Share this post


Link to post
21 hours ago, Dark Pulse said:

That, plus an OS you can modify is an OS you can use to spy on your citizens and have deep-baked protections to try to thwart people getting through the Great Firewall.

A distinct case of China being China. As usual.

On 10/2/2021 at 3:43 PM, AlexMax said:

In context, it's not surprising to me that third-party repos are so popular - most people think Linux software distribution begins and ends with apt-get or yum or pacman or whatever, because distros have been blurring the lines for years.  That also explains why images seem to be slow on the uptake: why install a snap that has a 50:50 shot of being broken when the repo has it anyway?

 

Yup!

 

Although, isn't the whole point of a distro being able to tailor packages for itself, or of third-party repos being somewhat distro-specific, that the packages can then be tailored for the distro's own idiosyncrasies?

19 hours ago, Gustavo6046 said:

Although, isn't the whole point of a distro being able to tailor packages for itself, or of third-party repos being somewhat distro-specific, that the packages can then be tailored for the distro's own idiosyncrasies?

 

You know how I tailor my programs for different versions of Windows or macOS?  I don't have to; I just build my software.  Set it to be a release build, shove that and any dynamic libraries you need into a zip, and you have a portable ZIP distribution.  On Mac, just generate an app from Xcode, shove it in a dmg, and you're good to go.  The hardest thing on either platform is multi-arch: on Windows you just build two executables, and on Mac you tear your hair out with lipo on your libraries for a day or two to get a universal build, but that's about it.

 

On Linux, trying to corral all of the dependencies of Odamex for an AppImage is quite frankly driving me crazy.  I want to ship my own versions of SDL2 and SDL2_mixer, and due to the numerous dependencies of the latter it's slowly driving me mad, especially once I got to fluidsynth.  These are complete non-issues on Windows and the Mac due to the easy availability of pre-built libraries, but because of the tyranny of distro distribution and a culture where compiling from source is considered acceptable, this is by far the hardest part of distributing standalone software.

 

Thankfully, I think I've found a solution that can handle building all of the dependencies for me.  And the funny part is...it comes from Microsoft.

Edited by AlexMax

On 10/5/2021 at 9:50 PM, AlexMax said:

Thankfully, I think I've found a solution that can handle building all of the dependencies for me.  And the funny part is...it comes from Microsoft.

 

What about conan.io, though?
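To illustrate the suggestion: Conan fetches pre-built (or locally built) C/C++ libraries from its own package index, independent of the distro. A hedged sketch of a conanfile.txt for the dependencies discussed above – the recipe names exist on Conan Center, but the versions here are illustrative and may not match what it currently serves:

```
[requires]
sdl/2.28.5
sdl_mixer/2.8.0

[generators]
CMakeDeps
CMakeToolchain
```

Running conan install in the project directory would then produce CMake config files for those libraries without touching the system package manager.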

 

On 10/5/2021 at 9:50 PM, AlexMax said:

this is by far the hardest part of distributing standalone software.

 

Then don't.

1 hour ago, Gustavo6046 said:

Then don't.

 

I must.  Linux users deserve better than the Linux distro distribution model, and I hope at the very least I can find a path of least resistance that I can write up in a blog post to encourage more developers to release standalone applications.  And those gatekeeping jabronis that call themselves "distros" can do two things about it: nothing, and like it.

 

...

 

the-rock-if-you-smell.gif

 

(music plays: https://www.youtube.com/watch?v=qIDuh9Z_RWc)

Edited by AlexMax

2 hours ago, AlexMax said:

I must.  Linux users deserve better than the Linux distro distribution model, and I hope at the very least I can find a path of least resistance that I can write up in a blog post to encourage more developers to release standalone applications.  And those gatekeeping jabronis that call themselves "distros" can do two things about it: nothing, and like it.

 

Well, that's the thing: they generally do well with their own programs. I do dislike that there is always a tendency to be an alternative to the mainstream, rather than to integrate into it.

 

But I think that, instead of bridging to traditional third-party distribution, a completely new way of distribution should be worked on – one that is distro-agnostic, not bloaty, and package-manager-friendly. Maybe even one that is cross-platform!

 

An idea I have is a common API that package managers use, one that is not reliant on a Linux environment specifically (let alone on a particular distro or set of directory trees – i.e. relocatable bin, etc, etc (pun not intended)). While the underlying packages may still very well have to keep the multiple platforms (Windows/macOS/Linux) in mind, you wouldn't have to add a lot of variation. Maybe add source packages into the mix if you feel like attending to more advanced users' needs as well, e.g. static compilation with musl rather than glibc. I like tinkering with stuff like that lol, even though most of the time it's just a flop. Go me.
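Just to make the idea concrete, here's a toy sketch of what such a distro-agnostic API could look like. Every name is invented for illustration – this is nobody's real project, just one adapter interface per native package manager with a tiny in-memory stand-in:

```python
from abc import ABC, abstractmethod
from typing import Optional


class PackageBackend(ABC):
    """One adapter per native package manager (apt, pacman, winget, ...)."""

    @abstractmethod
    def install(self, name: str, version: str) -> None: ...

    @abstractmethod
    def installed_version(self, name: str) -> Optional[str]: ...


class InMemoryBackend(PackageBackend):
    """Toy backend standing in for a real adapter, for demonstration only."""

    def __init__(self) -> None:
        self._db: dict[str, str] = {}

    def install(self, name: str, version: str) -> None:
        self._db[name] = version

    def installed_version(self, name: str) -> Optional[str]:
        return self._db.get(name)


def ensure(backend: PackageBackend, name: str, version: str) -> bool:
    """Install `name` at `version` unless that exact version is present.

    Returns True if an install was performed."""
    if backend.installed_version(name) == version:
        return False
    backend.install(name, version)
    return True


backend = InMemoryBackend()
print(ensure(backend, "sdl2", "2.28.5"))   # True: freshly installed
print(ensure(backend, "sdl2", "2.28.5"))   # False: already at that version
```

The point of the abstract base is that tools written against ensure() never care whether the backend shells out to apt or pacman underneath – which is the "common API" part of the idea.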

 

So maybe we just need to think outside of the box like that. Any ideas? I kind of don't want to do it the traditional way, is all!


What's really there to think about? Just look at the other platforms to see what works well and what does not.

If it comes to fully contained distributions, the macOS model is really what should be attempted, but as usual with Linux the end result is always the same: endless discussions, thousands of people with hundreds of opinions, a significant number of whom resist everything that alters the status quo – and then nothing happens.

 

Yes, such a model could be integrated into package managers as well, but considering the number of different package managers, good luck getting them all behind your model.

It also won't work if installations cannot be done ad-hoc from a downloaded binary package because you need that to overcome the existing inertia.

14 hours ago, Graf Zahl said:

What's really there to think about? Just look at the other platforms to see what works well and what does not.

 

I don't think the other platforms do things well. A lot of people in the GNU/Linux community see many dealbreaking flaws in other OSes that end up being overlooked by regular Joes.

 

14 hours ago, Graf Zahl said:

If it comes to fully contained distributions, the macOS model is really what should be attempted, [...]

 

Oh no, not walled gardens.

 

14 hours ago, Graf Zahl said:

[...] but as usual with Linux the end result is always the same: endless discussions, thousands of people with hundreds of opinions, a significant number of whom resist everything that alters the status quo – and then nothing happens.

 

If things were always like that, we wouldn't have, say, PipeWire. There are people who actually do things, who want to make it all better. What you said there is a bit of a sweeping statement.

 

14 hours ago, Graf Zahl said:

Yes, such a model could be integrated into package managers as well, but considering the number of different package managers, good luck getting them all behind your model.

 

It could be a separate package manager – a third-party one, if you will – that provides explicit support for other package managers.

 

14 hours ago, Graf Zahl said:

It also won't work if installations cannot be done ad-hoc from a downloaded binary package because you need that to overcome the existing inertia.

 

By ad-hoc you mean simply downloading a package and installing it, rather than needing a utility to do it?

 

I think that is simply impossible to do well, due to how different each system is – unless you somehow merge every Linux distro into a single standard, which, by virtue of the freedom of choice that users and distros are empowered with, is but a utopia. It would be much more feasible to use a third-party package manager that provides explicit compatibility/integration with existing system package managers where applicable.

 

A lot of people would also have objections to having a binary package format where every single package includes procedures for installation.

 

And even if a million people have an objection to this model, I might still be arsed to implement it anyway. There will always be insufferable kids complaining and objecting to anything that dares change the status quo, much like you said, but then there are people who are genuinely interested in seeing the Linux ecosystem improve, and who are willing to concede to the unavoidable subjectivity of matters like this. And of course, no words can stop you from going ahead and implementing something if you really believe in proving them all wrong. And I do.

 

And once you do prove them all wrong, you might as well start getting more contributions from people who are now convinced your solution works. That's the nature of free software :)

 

10 hours ago, Maes said:

8 (crap)

 

Are we still fussing about that? o.o

This topic is now closed to further replies.