YouTube is crippling Firefox on Asahi Linux? (treehouse.systems)
229 points by RMPR on Dec 12, 2023 | 85 comments


So they are serving lower resolutions to ARM devices with no GPU acceleration. Doesn't sound like deliberately crippling Asahi Macs to me... more like a reasonable default assuming those devices are something like a Raspberry Pi. I'm assuming it is just a default and you can select whatever resolution you like.

Detecting whether the device has 20 cores or whatever, like he suggests, would (if it's even possible) be more invasive of privacy. They would be criticized for that too. There's just no winning lol


Figuring out good defaults for this is really hard. In general, "Linux + aarch64 = lower performance = let's do lower resolution by default" seems like a reasonable thing, although not fool-proof obviously. Also note that the "is_arm()" check is followed by an "|| is_android()" bit in his screenshot.

Asahi Linux is a comparatively small project with few real users. It's pretty arrogant to assume Google is intentionally targeting them specifically.

If anything, the bug here is that Chromium reports x86 on ARM systems. For Asahi specifically that's probably the right choice, but it's a lot less clear to me that it's the right choice for all systems.


And the primary targets for that are probably ARM Chromebooks which tend to be underpowered by quite a bit.

(I'm guessing those report aarch64 correctly.)


Well, that and also when you set "Desktop site" in mobile Chromium it will report a user-agent which matches that of desktop Linux on ARM. So it is most likely trying to target those devices as well.


> It's pretty arrogant to assume Google is intentionally targetting them specifically.

The Asahi Linux developers have a history of taking everything personally. If Apple adds a new way to start an ELF in macOS 12.3, it's because of them. If someone criticizes them publicly on Hacker News and isn't immediately shamed, it's because the actual leadership at Hacker News hates them (not an exaggeration, considering their public stunt here where they blamed @dang for everything). If someone doesn't agree with their politics in perfect lockstep, they will publicly try to force that person to resign (look at what happened with Dlang). More recently, they threatened to keep changes downstream and out of the mainline Linux kernel as retaliation for what they call inappropriate conduct (and, well, if the Linux CoC hasn't been violated, I'm guessing the problem is probably them). And of course, if it's found that the M2 has a technical bug in audio processing that literally nobody noticed until now, it's proof that Apple is full of incompetent idiots, unlike them.

It's also why, if I were running a corporation, I would almost require that anyone using Asahi on their Mac use a corporate fork for their own protection. I wouldn't rule out retaliation.


I see you've met marcan :)

He's a hotheaded Spaniard who happens to live up to the stereotype perfectly. Except the part where he doesn't live in Spain anymore.

Very talented hacker. Best to watch from the back seats and enjoy the drama without getting any on yourself.


> corporation, I would almost require that anyone using Asahi on their Mac

Any corp worth their salt issues standard, locked-down machines and doesn't let employees install whatever they want, especially not an OS.


>Figuring out good defaults for this is really hard.

Just in general it's hard... for many things, I find. There's perpetually a bunch of seemingly reasonable "yeah, but"s for almost every default once you hit a certain number of variables / use cases / differing users / customers / etc.


You are assuming incorrectly. The bug report the user filed with Mozilla (linked in the thread) indicates that 4K resolution isn't available, even though once he changed the user agent string it was, and it worked without issue.


According to the follow-up toot, YouTube seems to assume Firefox + Linux + aarch64 = HiSense 65a67gevs. This was reported elsewhere too: https://bugs.launchpad.net/ubuntu/+source/chromium-browser/+... ; that issue was then forwarded to Chrome's bugtracker (https://bugs.chromium.org/p/chromium/issues/detail?id=150011...) which was closed with the comment "I'm hopeful they'll eventually get around to fixing the bug that you've reported".

It sounds less like "let's help the poor Raspberry Pi users" or "let's make Asahi's experience worse" and more like "workaround for a buggy TV that nobody bothered to implement correctly".


If you want to know the media decoding capabilities of a Web Browser, you can use the MediaCapabilities API. In Firefox, when flipping the "resist fingerprinting" flag, it's spoofed appropriately based on a study of what the most common results are. This is available on all engines, desktop and mobile.

Depending on the implementation, what it returns can be based on the presence of optimized software decoders for a platform, presence of hardware, resolution and other characteristics of the video, it can also be based on a decoding benchmark that the web browser runs, etc.

https://w3c.github.io/media-capabilities/

https://developer.mozilla.org/en-US/docs/Web/API/Media_Capab...
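
For illustration, a minimal sketch of a query against that API (the codec string and numbers here are just example values, not anything YouTube specifically uses):

    navigator.mediaCapabilities.decodingInfo({
      type: 'file',
      video: {
        contentType: 'video/webm; codecs="vp9"', // example codec string
        width: 3840,
        height: 2160,
        bitrate: 20000000, // bits per second, example value
        framerate: 30
      }
    }).then(info => {
      // Three booleans: can it decode at all, smoothly, and power-efficiently?
      console.log(info.supported, info.smooth, info.powerEfficient);
    });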


At the same time Chromium is not affected:

> Why does this not affect Chromium? Because chromium on aarch64 pretends to be x86_64


They already check the core count via the hardwareConcurrency API[1], check the second screenshot.

[1]: https://developer.mozilla.org/en-US/docs/Web/API/Navigator/h...
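
That one is just a property read, e.g.:

    // Number of logical processors available to the browser
    // (may be clamped or spoofed when resisting fingerprinting).
    const cores = navigator.hardwareConcurrency; // e.g. 8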


Could it not tell whether it's actually managing to play and keep up in real time? Similar to how (I believe) it'll degrade if the network isn't loading 4K fast enough: it'll switch (if set to 'auto') to 1080p or whatever.

> Detecting if the device has 20 cores or whatever like he says if possible would be more invasive of privacy.

What he claims, though, is that they already do that, at least when it's not aarch64:

> Quality 1080 by default. If your machine has 2 or fewer cores, quality 480. If anything ARM, quality 240.
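
For clarity, a rough reconstruction of that logic as described (this is a sketch based on the toot's wording, not YouTube's actual code; the regex is a stand-in for whatever user-agent check they really do):

    function defaultQuality(userAgent) {
      const isArm = /aarch64|arm64|armv7/i.test(userAgent); // stand-in ARM check
      if (isArm) return 240;                                // "If anything ARM, quality 240"
      if (navigator.hardwareConcurrency <= 2) return 480;   // "2 or fewer cores, quality 480"
      return 1080;                                          // "Quality 1080 by default"
    }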


> Could it not tell whether it's actually managing to play and keep up in real time? Similar to how (I believe) it'll degrade if the network isn't loading 4K fast enough: it'll switch (if set to 'auto') to 1080p or whatever.

Probably, but it's pretty poor UX if videos start super-choppy by default and slowly downscale to a playable resolution over several seconds. Having a good default is still important.


The test can be during the advert before the video that you want to watch.


Worse than the stuttering you get anyway when it's the network? And it could also scale up, not just down.


Have to agree with this. Not saying the check should be there, but the majority of non-mobile/server ARM devices are generally underpowered, especially with per-core performance.


They keep serving me 380p by default, which I have to change, and I've got gigabit fiber with a modern Ryzen and a monster GPU running the latest Windows. So whatever they're doing for auto quality has some screws loose.


They don't need to detect the number of cores (although hardwareConcurrency is right there); they can simply ask the browser what it supports:

https://developer.mozilla.org/en-US/docs/Web/API/Media_Capab...

That way you can fall back for anything too old or weird to support that API, and if the vendor complains there’s a simple response: implement the standard.
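
A sketch of that fallback shape, assuming you only resort to guessing when the API is missing (all concrete values here are examples):

    async function pickDefaultHeight() {
      if (!navigator.mediaCapabilities) {
        return 480; // too old or weird to ask: conservative default
      }
      const info = await navigator.mediaCapabilities.decodingInfo({
        type: 'file',
        video: {
          contentType: 'video/webm; codecs="vp9"', // example values throughout
          width: 3840, height: 2160,
          bitrate: 20000000, framerate: 30
        }
      });
      // Only default to 4K if the browser says it can decode it smoothly.
      return info.supported && info.smooth ? 2160 : 1080;
    }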


That sounds reasonable but it's weird that Chrome lies by pretending to be x86_64. The same performance concerns that apply to Firefox should apply to it.


Virtually all ARM devices do GPU decoding; they kind of have to.

This was a well-understood problem in the '80s: if two sides need to establish a commonality, one offers options and the other picks. Google controls both YouTube and the most common browser, making it uniquely positioned to handle this well.


Eh, there are some pretty famous examples that don't.

For instance, the RPi5 cut the H.264 decoders that previous gens had, and now only does H.265 decoding in hardware.


> So they are serving lower resolutions to ARM... Doesn't sound like deliberately crippling Asahi Macs to me.

If spoofing the user agent fixes the issue despite being on ARM, then this does sound like a legitimate issue.

> Chrome is not affected even if it claims to be aarch64.


Hmmm, serving low-resolution videos to ARM devices running Linux also makes it harder for a competitor to build an alternative to Android.


True, but any competing operating system could just falsely report that it is running x86-64 and therefore trick YouTube into serving high-res content.

Disclosure: currently (as a personal project) slowly building an alternative to Android


Can you share it?


[flagged]


Yes, I prefer to find the actual facts of a situation and pick sides (if at all) based on those facts, not on a predetermined "corporation bad, billionaire bad". Criticism loses all meaning if it isn't based on fact, and it actually hurts the cause, because then even meaningful criticism can be dismissed as baseless.


Gotta be fair about it. There's plenty of reasons to hate Google but this doesn't seem like one of them.


Drives me crazy how regressive Apple is with computing. While it seemed to shake out that GPU AI was the only real solution, earlier this year people were (and still are) putting resources into CPU-based approaches.

I can't imagine someone @-ing me because, when they were 18 years old and status-insecure, someone sold them a corporate identity, and they feel entitled to maintain their edge case.

I need to stop giving away my stuff for free.


Simping for Google in order to "own" Apple fanboys is just an embarrassing position for an adult.


It looks like it's not specific to Asahi or Firefox - YouTube is limiting resolution on any ARM system, but Chrome on ARM lies and claims to be x86.

To be fair, there are a lot of ARM Chromebooks and SBCs that can't really handle 4K video.


The author might have forgotten that Asahi is not the de facto Linux in the ARM world. Not everything is about them.

Raspberry Pi et al. have been there for many, many years.


Sadly I also get these "everything is about us" vibes about Hector sometimes. They seriously should chill a bit.


I have to admit that I had to withdraw my personal sponsorship because some posts are just too mean (e.g. the ones against Apple's speaker team), even if they are very skilled engineers.


The tech community does not have a culture where you can be openly critical of other people's social skills. If you say "you shouldn't do that", it's seen as an attack on their greatness instead of an invitation for self-reflection.


Well if their whole project is reverse engineering stuff that Apple could have just documented, it's understandable how they can get into an "us vs them" mentality.


> stuff that Apple could have just documented

I am not sure this is a valid point. Why would they publicly document something that is supposed to be used only by them?

Their business model goes directly against that. And it is not just Apple but any closed-source project: there is no point in wasting resources to publicly document something that would weaken your own business model. Not saying that serves the general good, but it makes no sense from a business point of view.

Anyway, if you document something well enough, you might as well make it open source instead.


How many of these devices cater to large screens? IMHO most such devices cater to small screens, where 4K is already overkill. Even 720p is overkill on many of the smaller devices.


On my 1080p phone I set the resolution to 4K, because 4K + downscaling makes for much better video quality than the butchered blur of pixels that YouTube's 1080p encoder produces.

I would've preferred a higher-bitrate 1080p stream, but YouTube only offers that on a limited number of videos (assuming you have Premium, of course).


The Pinebook and Pinebook Pro have 1080p screens and can have hardware video decoding support. I wrote 'can' because the drivers are still in staging for the mainline Linux kernel.


Take a quick look at your phone.


Yeah, this was likely done as a shortcut, with assumptions made about ARM, but it's also an example of how those assumptions can come back and bite you. ARM is absolutely a moving performance target!


Another Firefox-on-Linux web problem I've had: Twitch refuses to log me in, claiming I need to use a supported browser. Then they link to a page saying Firefox is supported. I have this happening on two different computers.

https://bbs.archlinux.org/viewtopic.php?id=289645 https://www.reddit.com/r/Twitch/comments/1118xgz/unsupported...

Someone solved it by disabling fingerprinting resistance https://www.reddit.com/r/archlinux/comments/100t2q3/comment/...


The last time I used Twitch, it also insisted that my 16-character password was too long for their 40-character limit, so that kind of thing seems to be normal there.


I can't say I've had an issue with Twitch on either macOS or Arch Linux - both Firefox, no extensions aside from containers.

YouTube, however, did stop working for me this morning. It's now demanding I sign in, which it didn't before.


The issue on Twitch only pops up when you enable the `resistFingerprinting` flag. If Twitch can't fingerprint you, they simply refuse your attempt to log in.


> Someone solved it by disabling fingerprinting resistance

A similar issue also happens in some Google services, e.g. Google Docs. They do not work at all (or everything renders at the wrong resolution) if you have fingerprinting resistance enabled.


And Slack, I believe.


Maybe using a user agent spoofing plugin can circumvent this? Just a guess though.


It doesn't specifically target Asahi Linux and/or Firefox; it reduces resolution on ARM devices generally. That behavior, and the assumptions made from a user agent string, may be debated, but that's another topic.


Exactly, the post even mentions that Chromium pretends to be AMD64 to get around the limitation.

In most cases this detection is "good enough", but with Asahi a new edge case has emerged. Ideally the device would be able to tell the server about its capabilities, but that could be seen as an invasion of privacy and could quickly turn into a new metric for profiling internet users across sites.


> Ideally the device would be able to tell the server about it's capabilities

It is not technically a privacy issue if the device "tells" the server about them, as long as it has the freedom to tell the server whatever it wants.

If the information is mandated via remote attestation, so that the server can force it and refuse to work properly otherwise, then it becomes a privacy issue.


I doubt YouTube cares enough about Asahi Linux to try to cripple Firefox on it.

It's more likely that guy is playing the victim card to build some PR.


Yeah, the situation is just 'Google Chrome gets the good settings and the rest can eat dirt'. It's not specifically Asahi Linux that Google/YouTube is acting antisocially toward here.


It’s literally tested and proven?


Interesting that on Macs with Apple chips (M1 Pro here), it seems both Chrome and Firefox report them as Intel:

    Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:120.0) Gecko/20100101 Firefox/120.0

    Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
Edit: even Safari itself:

    Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.1.2 Safari/605.1.15



To all the people saying this makes sense: why, then, do they specifically lie for Chromium?

> Why does this not affect Chromium? Because chromium on aarch64 pretends to be x86_64


Looking at other comment threads, that seems to be part of "User Agent Reduction": https://developers.google.com/privacy-sandbox/protections/us...


Didn't Google want to kill user-agent-based checks and remove the user agent from Chrome completely?


I think they wanted to replace it with a capabilities check, but I don't think there is an alternative just yet.


For me, writing comment replies on YT is completely broken. For about a week now, the text input field just doesn't appear. Everything else is there, but there's no place to actually write anything.

It happens in Firefox, but it happens with a Chrome user agent too, and it just stopped working without an update in between.


> Logic: Quality 1080 by default. If your machine has 2 or fewer cores, quality 480. If anything ARM, quality 240. Yes, Google thinks all ARM machines are 5 times worse than Intel machines, even if you have 20 cores or something.

Can't he just change the default? It's not locked to a maximum resolution of 240p, is it? He could click the video settings and switch the quality up to 1080p or similar.

Seems a bit excessive for him to call this "crippling Firefox" when it only takes a couple of clicks to sort out.


For me, on Asahi, 1080p is locked behind Premium.


Fun fact: if you go to `about:compat` in Firefox, you can see many of these kinds of fixes that get automatically applied for you.

e.g. for steamcommunity.com https://bugzilla.mozilla.org/show_bug.cgi?id=1570108

> Add the following UA override for desktop Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36


I remember YouTube being shady with Firefox for ages. I think there was some famous div being injected into pages served to Firefox that caused performance issues. Fuck Google.


It's like the sleazy old Microsoft tricks, where every update to FrontPage introduced some little new construct that would 'happen to' cause Netscape to crash. Just typical monopolist behavior.


YouTube is crippling Firefox everywhere.

I don't even bother trying to watch YouTube on Firefox, my main browser on Windows. I switch to Chrome for this task.


Apart from live-streaming where you want to see/interact with the chat, why not mpv?


Because starting external programs and copy-pasting links is annoying, especially when I have multiple video tabs open.


Get a browser plugin that lets you manipulate the OS signature in your User-Agent string.

It greatly simplifies working around the bugs in a sea of badly written JavaScript on the web.

Best of luck =)


Never attribute to malice that which is adequately explained by a lazy development team.


Why blame devs? What about a team whose management values shipping over quality? Gotta hit my KPIs. Of course it's fine to ship the new site without RSS; we'll find a solution "later", we just gotta hit the deadline.


From a business perspective, how would you write this? How many developer hours is it worth to serve a better default resolution to 0.01% of devices?


I don't feel like there is a good reason to expose this to websites by default. Just turn on resistFingerprinting and be done with most of this crap.


Asahi Linux might work for me if you could patch the drivers to lower the frequency of the high-pitched noises by an octave or two.


From one of the responses:

> Why are web services like this? Just let the client or user decide, dammit!

1) Because YT is paying real $$ to show me videos of cats (especially since I have a bunch of ad-blockers and haven't seen a YT ad in 15 years).

2) Because "it's your phone, dude, you don't need to watch a cat video in 4K!! Your friggin' screen doesn't even have 4K to begin with."

3) Because we can.

(The last one... it's the last one.)


The user agent should just be dropped altogether.

For the same reason all browsers' user agents are prefixed with Mozilla/5.0: all browsers end up lying anyway.

Let all machines claim to be an iPhone or a Windows NT/10 device and forget about this shit.


[flagged]


Did you even read the post before you commented? From the post:

> I believe this only triggers on software decoding, which is why it particularly affects us (and not, say, macOS which has hardware decoding).

It's completely sensible for YT to assume an ARM device without acceleration needs lower quality video.


The heuristics used were not sufficient to properly determine the capabilities of the hardware they were running on. Makers can fix this problem by giving users choices.

People like you need to learn that saying things you disagree with does not mean we're stupid or didn't read something. Maybe try understanding someone instead of being a dick. I doubt you have much to contribute to a tech conversation anyway.


I’m far from a Google fanboy. I haven’t owned an Android phone in a decade and I’m holding off switching back to Chrome until the last sensible moment. But Christ, every “YouTube is crippling browser / OS foo” that I’ve ever seen is complete BS. It’s usually someone with an esoteric configuration and a victim complex. This is no exception.


I'm really surprised they use heuristics like this to decide the best video to show.

If I were them, I would make a neural network to predict which video quality a user is least likely to switch away from. Then default to that.

Inputs to the network should be country, detected network speed, CPU speed, browser, screen resolution, etc. Also include the user's history of changing resolutions - a user who frequently decreases the resolution is probably conscious of bandwidth and therefore doesn't want to default to HD even if their device is capable.

The network can probably be tiny (perhaps just a few thousand weights), so could run on either the client or the server.
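
For what it's worth, inference for a network that small is only a few lines; a purely hypothetical sketch (weights assumed trained offline, feature encoding entirely made up):

    // Hypothetical tiny MLP: one ReLU hidden layer, argmax over quality buckets.
    const QUALITIES = [240, 480, 720, 1080, 2160];

    function predictQuality(features, W1, b1, W2, b2) {
      // features: normalized inputs (network speed, cores, screen height, ...)
      const hidden = W1.map((row, i) =>
        Math.max(0, row.reduce((sum, w, j) => sum + w * features[j], b1[i])));
      const logits = W2.map((row, i) =>
        row.reduce((sum, w, j) => sum + w * hidden[j], b2[i]));
      return QUALITIES[logits.indexOf(Math.max(...logits))];
    }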


TBH I would have thought the best default would be 'what resolution did they choose last time, and did it perform adequately?', and you'd only resort to guessing for users with no cookies.
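
That's trivial to sketch (names are hypothetical, and guessFromHeuristics() stands in for whatever UA/core-count guessing you'd otherwise do):

    const KEY = 'lastGoodQuality';

    // Default to whatever worked last time; guess only when there's no history.
    function defaultQuality() {
      const saved = localStorage.getItem(KEY);
      return saved !== null ? Number(saved) : guessFromHeuristics();
    }

    // Call this once playback at `quality` has kept up for a while.
    function rememberQuality(quality) {
      localStorage.setItem(KEY, String(quality));
    }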


And that is presumably what the network would predict in most cases.



