I'm not surprised that GNOME would contemplate this: it seems like a very GNOME-y thing to do. (And maybe this is a good thing for GNOME users. But this kind of thing is also why I use KDE.) But why would Firefox feel the need to touch this? Surely this is a DE-level setting, and Firefox should simply go along with the DE behavior so that its behavior is consistent with the rest of the apps running in the desktop session.
Middle-click paste is faster than Ctrl-C-then-Ctrl-V. And if you are a developer, you often find yourself copying long strings from one place to another. And it has been standard behavior on Linux since the 1990s. This just seems to me like more of GNOME's "simplicity at all costs" run amok. I'm a power user. I want power user features. So I use KDE, a project run by people who care about power users and their needs. I'm happy for you if GNOME meets your needs.
But in all seriousness, why are we moving away from usability features ingrained into muscle memory after 30 years across all desktop computing platforms?
Have we not learned in the past 10 to 20 years that pandering to the lowest common denominator destroys good software and turns it into an enshittified mess?
We need to stop removing features because people are too stupid to use them.
If they’re too stupid to use the features they’re too stupid to use the product.
Shouldn't the first sentence on that website describe what GNU Unifont actually is? I guess it's a single copyleft font designed to have coverage of all (or nearly all?) Unicode code points?
Well, the second and third sentences describe very precisely what Unifont is:
"This page contains the latest release of GNU Unifont, with glyphs for every printable code point in the Unicode Basic Multilingual Plane (BMP). The BMP occupies the first 65,536 code points of the Unicode space, denoted as U+0000..U+FFFF."
This is suitable as a last resort font, which should display any character for which no match was found in the other available fonts.
This is normally preferable to a last resort font that just displays the number of a character not available in your preferred fonts.
> Designed for GUI interfaces, terminals, or print?
Given it’s a last resort font, I think it doesn’t make too much sense for print (unless you’re printing something that could be in any possible language).
Saying a font is designed for print doesn't mean it's for literal professional printing.
It just indicates that the x-height isn't increased the way it often is for fonts designed specifically for screens, and that you can have finer details like serifs and thinner strokes. In other words, it's intended for high-resolution viewing.
Yeah I thought maybe the "uni" in "unifont" meant it was a single font that would morph between serif and sans somehow. I guess it stands for "unicode", from an era when Unicode support was not table stakes.
Unfortunately I’ve often seen such things in tech: the more “purist” or deep or nerdy something is, the worse the explanations and the UX/UI.
A GitHub readme for some software that sells a subscription (or is meant for “average” users) will have way more explanations and screenshots than something that’s more technical. HN has a “leaner” (worse for mobile) interface than old reddit, while both are way better than new reddit.
And god help you if you want to understand the chain of context on a Linux mailing list (email?) thread. “What, you’re not savvy enough to know the arcane and totally unintuitive stuff we use to format and can’t make sense of it? Too bad, sounds like user error.”
Yeah, this turned into a rant, but seriously, a little polish goes a long way in usability.
Note that "nearly all" isn't "all". I have some side project that require rendering of very uncommon CJK characters, and Unifont does not display them as expected. (For that project, I used https://kamichikoichi.github.io/jigmo/ which was the font that was most complete in terms of CJK glyphs )
Unifont seems to have about the same glyph coverage as my system default CJK font (unfortunately I don't know what it is).
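If you want to check a font's coverage yourself, fontTools makes it easy. A quick sketch (the font path is a placeholder, and the two codepoints are just examples I picked: a very common ideograph and a rare one outside the BMP):

    from fontTools.ttLib import TTFont  # pip install fonttools

    font = TTFont("unifont.ttf")  # placeholder path to the font under test
    cmap = font.getBestCmap()     # maps codepoints to glyph names

    # U+4E00 is a very common CJK ideograph; U+20BB7 is a rare one in plane 2.
    for cp in (0x4E00, 0x20BB7):
        print(f"U+{cp:04X}: {'covered' if cp in cmap else 'missing'}")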
Do you know if those characters are in supplemental planes? The BMP would only be glyphs from U+0000 through U+FFFF (though the first 32 and last two aren't printable, and wouldn't be included in this font).
Another example would be emoji, which would probably now be considered "basic" by most people but have always been in a supplemental plane.
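A quick way to check which plane a character sits in, using nothing but ord() (the sample characters are arbitrary):

    # The BMP is plane 0 (U+0000..U+FFFF); emoji like U+1F600 live in plane 1.
    for ch in ("A", "你", "𠮷", "😀"):
        cp = ord(ch)
        plane = cp >> 16
        print(f"U+{cp:04X} {ch}: plane {plane} ({'BMP' if plane == 0 else 'supplementary'})")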
> GNU Unifont is part of the GNU Project. This page contains the latest release of GNU Unifont, with glyphs for every printable code point in the Unicode Basic Multilingual Plane (BMP)
Still doesn't exactly say what it is? I get that it's glyphs for printable characters, but honestly it could be a PDF, video, collection of PNGs or SVG files, an Adobe Illustrator file, a linux distribution, a web browser, or pretty much any other piece of software or data format. I presume it's a TTF or OTF font file?
no.
as others have stated too, the following should be mentioned:
- what BMP stands for
- it's designed as a monospaced (or proportional?) bitmap font
- designed in a single 16x16 size only (or also 8x16? it's a bit unclear)
- provided in an OTF/TTF font format, which can be scaled by most font rendering engines to other sizes, but you need antialiasing to make it look smooth (this is mentioned, but only under the download section)
- its use as a "last resort" default font, according to Wikipedia at least
So, first off, it seems like Jonathan Riddell not working on KDE anymore is a huge loss for the KDE community.
But I'm not sure I really understand what happened. AFAICT, Valve had a contract with Blue Systems, specifically a subunit of Blue Systems that does KDE development. Blue Systems decided to sell that subunit to Techpaladin, Nate Graham's company. Riddell was unhappy about this, and proposed that... I guess that Blue Systems not sell to Techpaladin? Or that Techpaladin reorganize itself into a worker-owned company? And then when Graham declined to do this, stuff happened, and eventually Riddell got fired from Techpaladin, or not hired by Techpaladin, and now Riddell is not getting paid to work on KDE. So Riddell has (not unreasonably) decided to stop working on KDE. And the other people who once worked for Blue Systems and now work for Techpaladin have decided to keep working for Techpaladin.
This is my takeaway as well. Having brushed up against a few things like this, I assume there was more back-and-forth about reorganizing Techpaladin than what was published. However, I am certainly not suggesting an escalatory exposé of communications, as it would likely do nothing to change the two bottom-line facts: Riddell was not included in the new company, possibly over the disagreement regarding company structure, though this isn't something we can know for certain; and following that, Riddell decided to end his contributions to KDE.
My PhD research was actually studying the leech nervous system. They're still an important 'model' organism in neurobiology. Probably not as important in the field at large as they were in, say, the 1970s, but still. They're also a good system for neurophysiology education, because they are cheap and easy to obtain, have large-ish neurons that are identifiable from animal to animal, and their nervous system has a relatively simple organization.
Strongly endorse. That paper is really wonderful. It seems to me that MVS is the solution to the version selection problem, and now we just have to wait for awareness of this to fully percolate through the developer community.
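For anyone who hasn't read the paper yet, the heart of MVS is small enough to sketch. Here's a toy version of the rough-build-list construction as I understand it (the dependency graph is made up): each (module, version) pair lists the minimum versions of its dependencies, you walk the whole requirement graph, and the answer is just the newest minimum seen for each module. No SAT solver needed.

    # Each (module, version) lists the *minimum* versions of its deps.
    deps = {
        ("A", 1): [("B", 1), ("C", 2)],
        ("B", 1): [("D", 1)],
        ("C", 2): [("D", 4)],
        ("D", 1): [],
        ("D", 4): [("E", 1)],
        ("E", 1): [],
    }

    def mvs(target):
        # Visit every (module, version) mentioned in the requirement graph,
        # then keep the newest minimum version seen for each module.
        seen, work = set(), [target]
        while work:
            node = work.pop()
            if node not in seen:
                seen.add(node)
                work.extend(deps[node])
        build = {}
        for name, version in seen:
            build[name] = max(build.get(name, 0), version)
        return build

    print(mvs(("A", 1)))  # -> A 1, B 1, C 2, D 4, E 1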
The post title implies that all French villages have no more drinking water, which is not the case. Sounds like it's 16 villages, all near each other. Still a big deal, but not nearly as bad as if it was all of France.
Basically, he argues that application distribution outside of the distro (a la flatpak, snap, appimage) is just a bad model. The right model is the one distros have been using for years: You get software through the distro's package manager, and that software is packaged by people working on behalf of the distro. As he says: "Software distributions are often volunteer-run and represent the interests of the users; in a sense they are a kind of union of users."
The other issue, of course, is that in practice flatpaks/snaps/appimages never seem to 100% work as well as distro packages do.
I disagree with that. IMHO the best possible people to create a package for an application are the original developers of that software. If the software is proprietary, they also happen to be the only party that can legally do so anyway, since packaging typically requires access to the source code, and redistribution requires permission.
So, the model you mention only works for open source packages. And I would argue that even in the case an app is 100% open source it's a bad idea for somebody not affiliated with the core development team to be second guessing a lot of things about that application.
It results in a lot of issues that aren't necessary. Like needless lag between developers releasing new software and some third party doing whatever uninvited tweaks they think are necessary, or adding their own bugs and new issues.
It's why I always install Firefox in tarball form straight from Mozilla, for example. It updates itself as soon as the developers OK some patch. This happens a lot, mostly for security and stability reasons. I want those patches when they release them. The things external distribution maintainers do are redundant. I trust Mozilla to do the right thing and to be the most clued in about any issues regarding their own software. With proprietary stuff, I just want things to run with a minimum of hassle.
Flatpak is trying to do too many things. It's trying to emulate an app store. I don't necessarily like app stores. They are gatekeepers.

What do developers on Windows and Apple platforms do? They put binaries on their own website. You download them. You install them. And then they run. Downloaded apps have the same rights as apps provided via app stores. The app stores don't repackage the app, they merely distribute it. It's an add-on service. An optional extra. All the essentials that provide security are baked into the OS and the application package.

There are a few mechanisms that Windows and macOS provide to make things secure: binaries are signed, and the OS has a permission model for things that need it (screen sharing, directory access to certain things, using the webcam, etc.). That's the right model. That could work for Linux as well. It shouldn't require taking control of distribution or packaging by some third party.
Flatpak is more a set of tools and a framework. I wouldn't consider it a store but a distribution system. Flathub is a repository, Fedora has its own repository, and anybody can create their own repo (I wouldn't call it a store as there is no concept of monetisation).
I also wouldn't consider Flatpak a gatekeeper, as there is no "team" going through some arbitrary process to allow/disallow an app.
I also disagree that macOS and Windows did the right thing. What I found in my experience managing laptops in a company that is roughly 1/3 Windows, 1/3 Linux, 1/3 macOS is that:
- What Windows is teaching users is to download random stuff and bypass the warning screens if something is not signed. Unless there is a company policy and third-party software to update what is installed, by default the installed software is a mix of up-to-date and out-of-date versions.
- macOS users do not install operating system and software updates unless third-party software is installed and forces them to.
- Linux users have things up to date; only distribution version updates (e.g. Fedora 41 to Fedora 42) are inconsistent.
So my take is that, even if things are not perfect with Flatpak, rpm/dnf, fwupdmgr and the package manager, this is much better than having to pay for third-party tools on macOS and Windows because of the lack of a good way to distribute and maintain apps at the operating system level.
Only Fedora can put stuff in their Flatpak repository, presumably. That makes them a gatekeeper. Why is a repository needed? If it made no difference, Mozilla would be able to put a flatpak file for Firefox on their website and it would be the preferred way to install Firefox.
Of course everybody (including Mozilla) can create their own repository, and then you can install from any repository you like. But how is that different from just downloading whatever and installing that? And that's more of a hypothetical: Mozilla doesn't do that, and doing such things is not common.
What Apple and MS enforce via signatures is that what you install and run was produced by somebody with a valid certificate that passed their screening.
The problem Flatpak hasn't solved is that the likes of Mozilla still have no good way to distribute the most recent version of their application to all Linux users. So they put a tarball on their website instead.
Mozilla publishes Firefox on Flathub and anybody can install it from there. That said, I'm not sure why they don't advertise it; most apps distributed this way just have a button that does so.
Fedora has its own repo and manages it; I don't see the problem there. And it doesn't prevent adding others like Flathub, and the experience from a user's point of view is the same.
You can also provide a flatpakref file that users can use to install.
Signing an app doesn't mean much apart from the fact that someone paid and went through a process, IMO. There's not much value in it from the user's POV, especially when the first thing a Windows user learns is to ignore signature warnings.
I think that you are right about not depending on one open source OS to provide the proper dependencies, customization, and training wheels for every app. I have been running Linux on my desktop for about 20 years: about one decade of Mint followed by the same of Fedora so far. Being a curious but fussy guy who installs lots of software to see what works, I find that I need to install a fresh OS about every 18 to 24 months.
There are, I suppose, always a few programs that don't get updated by 'sudo dnf update' but do get bothered by updates to the shared libraries via the same. Perhaps there are some config files that get damaged by software bugs or power outages or system crashes or my own mistakes and carelessness. I also found out that if one loses the dnf program, one will discover just how impossible it is to pull oneself up by one's own bootstraps.
Mint was a very similar situation. Maybe not so bad for one who follows all the rules, but in those bygone days there were people suggesting that updating Mint programs with newer versions from the Ubuntu or Debian repos was a good idea. And because Mint was slow to get updates, I would attempt to update some apps by downloading source and building and installing here.
Last year, when I upgraded Fedora from 39 to 41, was the first time I got any OS upgrade to work instead of wiping the disk, doing a fresh install of the new OS version, and then spending a week or month trying to get my data for the installed apps (e.g. web browser and email) from backups. But the upgrade took much longer than it should have, because once I started running the upgrade process, I did not know that the computer sitting there dead silent with no action on the screen for about 30 hours was a sign that all was going well. Computers are evil.
The problem is that now you have to package for N distros. And the people who run the distro may not want to spend time on it, so you have to do it yourself.
It doesn't have to be gated by "the people who run the distro". I started packaging a few pieces of software for a distro I use because I wanted to use that software, and I don't "run" the distros in any capacity. Package maintainers aren't born that way, they become that way by volunteering, just like most everything in Linux.
If you don't have even one user willing to do that for the distro they use, you probably weren't going to have users on that distro anyway.
> Package maintainers aren't born that way, they become that way by volunteering, just like most everything in Linux.
I feel like there's a constant tug of war on this issue. If you leave it up to app developers then they have to package their app for N distros. If you leave it up to the distro maintainers then they have to compile N apps for their distro. I don't envy either group given how different distros are and how varied apps are in quality, methodology, etc.
I look at Podman. In my opinion it could be (could have been?) a huge disruptor, but its Red Hat (or Fedora or CentOS or whatever the hell those guys do now) versions are way ahead of the versions for other distributions, which creates for me (just a home user) an interoperability problem between all my different Linux boxes. Red Hat, if anybody, has the resources to fix this, but I guess they'd rather use it as a way to force adoption of their distro? I don't even know.
Both the apps and the distros are volunteer-heavy. App packaging is a big job for either side. I'm still hopeful that Flatpak can help with that job.
If you are unwilling to use tools like Flatpak, then that limits what distros you can make. e.g., in a world without Flatpak, only distros with X users can exist. In a world with Flatpak, distros with X/10 users can exist.
Another way to think about it: if you want to make/use your own distro, then using Flatpak will cut down the amount of work you have to do by some large multiple. You're free to not use it, just like you're free to install custom electrical sockets in your house and make custom adaptors for every single appliance you buy.
Standardization/centralization exists for a reason.
You're saying the exact opposite of the original point, which is: you should not package for distros, distros should package for themselves. You just distribute your sources.
You are a good candidate to package for your distro, so there's that. And then for a random distro, if nobody feels like packaging for it, then it's just not there. Either there is not enough interest in your project, or there is not enough interest in the distro itself.
> distros should package for themselves. You just distribute your sources.
Is Devault basically saying that the application developer should just throw their source code over the wall and hope that other parties notice and figure out how to build it correctly? I would find that model of software distribution unsatisfying as a developer because merely distributing a source tarball and leaving the rest to middlemen makes it difficult for me to predict how my users will experience the final product. Even if my product is fully open source and free to fork, it's my reputation on the line when things don't work as intended. I would prefer to establish a more direct relationship with my users; to personally build and test my software in all environments that I support; and to hear directly from users whenever they experience problems.
> Even if my product is fully open source and free to fork, it's my reputation on the line when things don't work as intended
I think that everyone who is worrying about that wants to apply corporate thinking to the open source model, meaning they want to be a special thing in a world where everything is supposed to be interchangeable. Just yesterday, I was compiling a program that hard-depends on the GNU C library for just 2 functions, and not even critical ones. To be fair, the author said that they only test on Debian.
While the Linux world may be fragmented, the true differences are mostly minimal (systemd vs other init systems, glibc vs musl, network manager, …). So it’s possible to decouple yourself from these concerns if you want to. But often the developer hard-depends on decisions made by their preferred distro team, and creates a complicated build script that only works there.
I don't know what Devault says, but here is my opinion: do not ship something you don't understand/test/use yourself.
Distros should not package random open source projects they don't use/understand, and developers should not package their project for distros they don't use/understand. For both, it's like shipping untested code and the conclusion is always going to be "you should all run the same system I do" or "we should all have the exact same system, let's implement Flatpak".
Developers should package their project for the distros they support (often that's just Ubuntu). Random people should package the open source projects they want to use in their distro of choice (the more popular the distro, the higher the chance that someone else has done it already). All that under the supervision of distro maintainers.
> distros should package for themselves. You just distribute your sources.
That's how you ended up with Erlang being split into 20+ packages on Ubuntu/Debian in the past. Because it was packaged by people who knew little about Erlang, and probably had too much time on their hands.
And that is the main issue: you want distro maintainers to compile and package every single piece of software under the sun, but they can't possibly know every piece of software, how it works, or how it's supposed to work. Multiply that by the number of distros.
> you want distro maintainers to compile and package every single piece of software under the sun
No. I want people who will actually use the package to package the software they need, and distro maintainers to supervise that.
> Because it was packaged by people who knew little about Erlang
Yep, people who won't use Erlang shouldn't package Erlang. But on the other hand, developers who won't use Erlang on platform X shouldn't package Erlang on platform X.
The "we absolutely need flatpak because otherwise it fundamentally doesn't work" philosophy is, to me, very close to saying "we must consolidate everything under one single OS. Everybody should use the exact same thing otherwise it doesn't work". That's not what I want. I want to have freedom, and the cost of it is that I may have to package stuff from time to time.
If you don't want to contribute to your distro, choose a super popular distro where everything is already packaged (and used!). Or use macOS. Or use Windows. You don't get to complain about Alpine Linux not having a package you want: you chose Alpine, that was part of the deal.
Alpine is a great litmus test for programs that unnecessarily depend on glibc and systemd. More often than not, it’s easy to take the Arch build script and create a package for Alpine. When that fails, it’s usually for the above reason.
> I want people who will actually use the package to package the software they need, and distro maintainers to supervise that.
Erm... Your original comment said "you should not package for distros, distros should package for themselves. You just distribute your sources."
> Yep, people who won't use Erlang shouldn't package Erlang. But on the other hand, developers who won't use Erlang on platform X shouldn't package Erlang on platform X.
So... Who's gonna package it if you say that distros should package it?
> The "we absolutely need flatpak because otherwise it fundamentally doesn't work" philosophy is, to me, very close to saying "we must consolidate everything under one single OS.
Bullshit.
What you advocate for is "why bother with ease of use and convenience, everyone should learn how to compile and package everything from scratch"
> If you don't want to contribute to your distro
The user of a package doesn't necessarily know how to package something, and shouldn't need to.
> Erm... Your original comment said "you should not package for distros, distros should package for themselves. You just distribute your sources."
Yes. I said "distros", not "the distro maintainers". The distro is the maintainers + the packagers, and packagers can be random contributors (I package stuff for my distro when needed, but I am not a distro maintainer).
> So... Who's gonna package it if you say that distros should package it?
People who will use Erlang on that particular distro. Under the supervision of the distro maintainers. There is typically some kind of hierarchy where there are the "community" packages that are just "untested" (sometimes they can get promoted to a more trusted level), and the "core" packages that are handled by the distro maintainers.
> What you advocate for is "why bother with ease of use and convenience, everyone should learn how to compile and package everything from scratch"
Not at all, but it seems like you don't know how it currently works in traditional distros, and you don't understand what I'm saying (probably I'm not being clear, that's on me).
What I advocate seems like absolute common sense: "the package maintainer(s) should understand and use the package on the distro for which it is packaged".
The vast majority (probably almost the totality) of users of Ubuntu or Arch have never had a need to package anything, because everything is already there. Because those distros are very popular. Depending on your choice of distro, it may happen that a package hasn't been contributed or even that it doesn't compile (e.g. if you use musl). In that case, if you want it, you need to contribute it. But if you use musl, you implicitly accept this and are supposed to know what you are doing.
> The user of a package doesn't necessarily know how to package something, and shouldn't need to.
That's your opinion. I would say that a Gentoo user is expected to have some idea about compiling packages, otherwise they should not use Gentoo. Ubuntu is targeting people who don't want to know how it works; that's fine too. Diversity is good.
What I don't like is Windows-minded people ("I shouldn't have to understand how my computer works") who come to Linux and push for everybody to become like them. "We should all use systemd and Flatpak, and pay one team of 50 people who know how that works, and the rest of us should just use it and not know about it" -> I disagree with that. Those who think that should just use Ubuntu/Windows/macOS and leave me alone. And those who use Ubuntu should remember that they don't pay for it next time they say "it's shit because it doesn't do exactly what I want".
So who's going to maintain the packages? Who's going to test them against other packages? Against distro upgrades? Who's going to fix issues?
> Not at all, but it seems like you don't know how it currently works in traditional distros
I do. A small number of people are doing the thankless job of packaging, maintaining, fixing, testing a multitude of packages.
And their efforts are needlessly duplicated across several packaging systems.
> What I don't like is Windows-minded people ("I shouldn't have to understand how my computer works") who come to Linux and push for everybody to become like them
What I don't like is people assuming ill intent behind "you know what would be great? If we didn't assume that every user has to package their own packages across 15 different incompatible packaging systems".
> So who's going to maintain the packages? Who's going to test them against other packages? Against distro upgrades? Who's going to fix issues?
I feel like you're not reading what I'm writing. The community.
That's how open source works: if you use an open source project and it has a bug, you can fix it and open an MR. If the upstream project doesn't want your fix, you can fork. Nothing forces the upstream project to accept your contributions. When they do, they take the responsibility for them (to some extent, as in: it is now part of their codebase).
If your distribution doesn't have a package you want, you can make it for yourself, locally. You can contribute it to a community repo (most distros have that). Maybe at some point, the distro maintainers will decide to take over your package in a more official repo, maybe not. Even if you are not the official maintainer of a package, if you use it and see a problem, you can contribute a fix.
In the open source world, most people are freeriders. A (big) subset of those feel entitled and are simply jerks. And a minority of people are not freeriders and actually contribute. That's the deal.
> And their efforts are needlessly duplicated across several packaging systems.
No! No no no no! If they don't want to put effort into that, they don't have to. They could use Ubuntu, or Windows, or macOS. If they contribute to, say, Alpine or Gentoo, it's because they want to. I am not on Gentoo in the hope that it will become Ubuntu; that would be weird. But you sound like you want to solve "my Gentoo problems" by making it look more like Ubuntu (in spirit). Don't use Gentoo if you don't want to, and leave me alone! Don't try to solve my problems, you're not even a Gentoo user.
Funny how in reality that's not how open source works. Packages are packaged en masse and maintained by a very small number of maintainers doing a thankless job. Not by some "community" where "a person who uses the package" suddenly wakes up and says "you know, I'm going to package this piece of software".
This is literally the reason for my example with Erlang in my original comment.
> In the open source world, most people are freeriders.
I'm getting tired of your rants and tangents
> No! No no no no! If they don't want to put efforts into that, they don't have to. They could use Ubuntu
You're deliberately missing and/or ignoring the point.
How many package managers and package formats are there? Packaging some code for each of them is wasted/duplicated effort, because it's doing the same thing (packaging) for the same code (for example, Erlang) for literally the same operating system (Linux), just because someone has a very subjective view of "the one true correct way".
So now you have someone packaging, say, Erlang, for dpkg, flatpak, nix, pacman, rpm, snap and probably a few others, because "people are not freeloaders" or "non-Windows-minded people" or some other stream of consciousness.
> Don't use Gentoo if you don't want to, and leave me alone! Don't try to solve my problems, you're not even a Gentoo user.
I've said all I had to say. You deliberately chose to talk only to the voices in your head. Sorry, I'm not privy to those voices.
> Funny how in reality it's not how open source works.
Let me copy the full sentence, with the part that you conveniently truncated: "That's how open source works: if you use an open source project and it has a bug, you can fix it and open an MR. If the upstream project doesn't want your fix, you can fork. Nothing forces the upstream project to accept your contributions. When they do, they take the responsibility for them (to some extent, as in: it is now part of their codebase)."
Can you explain to me how this is wrong?
> I'm getting tired of your rants and tangents
How is that a rant? That's almost by design: I make my code open source so that people can benefit from it for free under some conditions. Take the billions of computers running Linux. Which proportion of those are run by people who actually contribute to Linux, do you think? As a good approximation, it's ~0%. Almost all users of Linux don't contribute to Linux. It's a fact, not a rant.
Nowhere did I say that people should contribute.
> How many package managers and package formats are there?
Who cares? If I want to create a new package manager with a new package format, why would you forbid me from doing it? That's my whole point: people are free to do whatever they want. Are you saying that I must use Flatpak instead of my favourite package manager because you have decided that it was better for everybody?
Why do you stop at package managers? In your view, isn't having different OSes wasted/duplicated effort? Should we all just use Windows because it's your favourite and you don't understand why other people may have other preferences?
> Sorry, I'm not privy to those voices.
My point is that whenever somebody says "it's stupid, we should all use X", my answer is always "If Y, Z, A, B, C, ... exist, it's because other people, for some reasons, don't want X. Because you like X doesn't mean that everybody should like X. I see how it would be convenient for you if everybody used exactly your favourite system, but the reality is that we can't all love exactly the same things. Hence there are alternatives. Diversity is good".
Diversity is good. I don't say that Flatpak should not exist. I just say that whoever wants me to use Flatpak is fundamentally missing something.
Application developers should be able to package and distribute their apps. See how easy it is for a casual user to download and install any application on Windows. Maintainers cannot scale, and depending on them will just hold back desktop Linux.
Flathub is not unvetted. Every submission goes through human review. If a piece of software requires an unnecessary permission (e.g. if someone submits an alarm clock program that requests home folder access and internet access), it will get rejected. If a developer updates their software and changes the required permissions, the update won't get pushed to users until it goes through human review.
Besides this, for open source packages, the code gets built on Flathub's build servers without internet access. The source code associated with a given Flathub package version must be either a specific Git commit (verified with a commit hash) or a release tarball (verified with a sha256 hash). This means that it's always possible to verify that the code a developer publishes corresponds to the binaries being shipped to users. Closed source packages get a big warning on their Flathub pages saying that the program's code is proprietary and not auditable.
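That hash check is conceptually simple; a sketch of the idea (the file name and pinned hash here are made up, and the real check lives in Flathub's build tooling, not in a script like this):

    import hashlib

    def sha256_of(path):
        # Stream the file in 1 MiB chunks so large tarballs don't fill RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    pinned = "d2a84f4b8b650937ec8f73cd8be2c74a"  # hash from the manifest (placeholder)
    if sha256_of("app-1.0.tar.gz") != pinned:
        raise SystemExit("checksum mismatch: refusing to build")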
With the traditional distro packaging model, the requirements to become a maintainer are stringent and there's human review when a package is added, but there's typically no review after that point. If you'd like a recent example of the drawbacks of this system, see here: https://security.opensuse.org/2025/05/07/deepin-desktop-remo... . After the OpenSUSE security team rejected certain components of the Deepin DE for containing major security problems (including multiple root privilege escalation vulnerabilities), the Deepin maintainer smuggled them in anyway through an innocuous looking package called "deepin-feature-enable" and nobody in the security team noticed for several years. I'm not trying to call out the OpenSUSE security team here, I'm sure they don't have the resources to vet random packages. I'm saying that expecting maintainers to never ship malicious code because they went through the process to become a maintainer is a weakness of the traditional distro packaging model.
They do to some extent in the larger distros, but for proprietary/binary packages they don't have much chance anyway unless they are willing to do some pretty time-consuming forensics.
Plus the app developers at least have some level of accountability. Like when JWZ got into it with Debian (can't link here). You might think you are running XScreensaver from the great Zawinski, but no you are actually running some weird fork from godknowswho, hopefully not Jia Tan.
You got downvoted but yes, it's quite sad when distros release a package under the same name as the original but with their own set of patches. I think they should rename the package when they do that, even just a prefix/suffix with the distro name would be nice.
I'm glad flatpaks are getting more adoption. Application distribution needs to move away from distributions because they suck at it, through no fault of their own. Developers should have the option to distribute their apps without middlemen.
In fact I’d say they are perfect for making distributions more stable. E.g. my issue with Debian has always been that you couldn’t (easily; I know backports exist) have a stable system AND fresh software. With Flatpak, you can.
Now I can run the latest user software on a stable distribution. That’s pretty cool.
There are still UX issues. Especially when the app you are using doesn’t have enough permissions to do the job: you get no information about it, and when you do figure it out by yourself, changing it is hard.
I’d expect Flatpak to allow apps to ask for specific permissions in real time, when they try to access external resources, like on macOS: just expose the APIs but make them wait for user approval.
> Now I can run the latest user software on a stable distribution. That’s pretty cool.
I'm at a bit of a loss. Isn't the entire point of a stable distribution _not_ having cutting edge userspace? It's an inherently double edged sword.
If you just wanted to mix and match, you were always able to run (for example) a Debian testing chroot under Debian stable. Something like Nix is the more extreme version of that. The point of something like Flatpak, then, is either sandboxing or the distribution model (i.e. getting software from the original author).
These days, I’m tempted with Debian stable because of people playing cowboys with software updates, breaking workflows right and left. There’s always VMs for bleeding edge.
There are certainly some aspects of it that are inelegant, in the interests of backwards compatibility, but otherwise I don't know what you are talking about. Matlab supports >2d arrays just fine, and has for at least 20 years.