What the heck…? My CPU is none of their business.
Google chooses codecs based on what it guesses your hardware will decode (iPhones get HEVC, Android gets VP9, etc.). They just didn't put much thought into ARM-based home devices outside of a specific few like the Shield.
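YouTube's actual server-side logic isn't public, but the guess-from-the-device approach described above amounts to something like this hypothetical sketch (all the rules below are invented, purely to show why inferring capabilities from the user-agent string is fragile):

```javascript
// Hypothetical sketch of UA-based codec selection. Not YouTube's
// real logic -- the rules are made up to illustrate the failure mode.
function guessCodec(userAgent) {
  // Assume Apple hardware has an HEVC decoder.
  if (/iPhone|iPad|Macintosh/.test(userAgent)) return "hevc";
  // A short allowlist of ARM devices known to hardware-decode VP9.
  if (/SHIELD Android TV/.test(userAgent)) return "vp9";
  // Every other ARM device falls through to a conservative default,
  // even when its VPU could handle VP9 or AV1 just fine.
  if (/aarch64|arm64|armv7/i.test(userAgent)) return "h264";
  // x86 desktops are assumed able to at least software-decode VP9.
  return "vp9";
}

// An rk3399 board gets lumped in with "other ARM" despite having a
// 4K60-capable hardware decoder:
guessCodec("Mozilla/5.0 (X11; Linux aarch64; rv:115.0) Firefox/115.0"); // → "h264"
// ...while an old x86 box with no VP9 hardware still gets VP9:
guessCodec("Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Firefox/115.0"); // → "vp9"
```

Any device not on the allowlist gets the worst-case answer, which is exactly the situation ARM boards outside the Shield end up in.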
Why wouldn’t it be my browser asking for the codecs it prefers instead of the website trying to guess my computer’s hardware?
Lots of hardware lies about its useful capabilities.
Can you run 4k? Of course. But can you run more than 4 frames a second?
My by now rather ancient rk3399 board can hardware-decode both HEVC and VP9 at 4K 60Hz. Which has nothing to do with the fact that it’s aarch64, but that Rockchip included a beast of a VPU (it was originally designed for set-top boxes).
How about, dunno, asking the browser what kind of media it would prefer?
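Browsers can already answer that question directly. A minimal sketch (the codec strings and preference order below are my own choices, not anything YouTube publishes): probe each candidate with something like `MediaSource.isTypeSupported` and take the first one the browser accepts. The probe is injected as a parameter so the helper also runs outside a browser:

```javascript
// Candidate formats in descending preference (my own example list).
const CODEC_PREFERENCE = [
  'video/mp4; codecs="av01.0.08M.08"',   // AV1
  'video/webm; codecs="vp09.00.41.08"',  // VP9
  'video/mp4; codecs="avc1.640028"',     // H.264 fallback
];

// isSupported is a predicate like MediaSource.isTypeSupported,
// passed in so the function is testable outside a browser.
function pickFormat(isSupported, candidates = CODEC_PREFERENCE) {
  return candidates.find((mime) => isSupported(mime)) ?? null;
}

// In a real page:
//   const fmt = pickFormat((m) => MediaSource.isTypeSupported(m));
//
// The newer Media Capabilities API goes further and also reports
// whether decoding is expected to be smooth and power-efficient:
//   const info = await navigator.mediaCapabilities.decodingInfo({
//     type: "media-source",
//     video: {
//       contentType: 'video/webm; codecs="vp09.00.41.08"',
//       width: 3840, height: 2160,
//       bitrate: 20_000_000, framerate: 60,
//     },
//   });
//   // info.supported, info.smooth, info.powerEfficient
```

`decodingInfo` in particular addresses the "claims 4K but decodes 4 fps" problem upthread, since `smooth` distinguishes "can decode at all" from "can keep up".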
If you use any Google service, everything of yours is their business. You are their product, voluntarily.
this prolly wasn’t a bad decision early on… why push something to a population who can’t utilize it… but shit changes fast, google.
It seems somewhat damning that Google’s own browser had a workaround for this, though
was it ignorance or malicious intent?
if it was a person, i would try and assume ignorance… i’m not sure google the company deserves such respect
Or it’s a company so fuckoff huge that one department (Chrome on Android) couldn’t get a bug report escalated in another department (YouTube). Eventually they just put in a UA workaround while the bug rots in a backlog somewhere. Common enterprise bullshit.
Or the Chrome on Android team didn’t even bother reporting the issue to YouTube and just threw in a cheap workaround. Also common enterprise bullshit.
The weirder thing is Firefox on ARM being detected as a HiSense TV. I did a cursory search to see if HiSense ever used Firefox OS on their TVs, and it doesn’t seem like it. Panasonic seems to have been the only manufacturer using it.
YouTube has had a lot of totally-not-anticompetitive “bugs” these past couple of weeks
UA sniffing again? Whatever happened to feature detection and all that?
Does this include Apple Silicon Macs? That would be a bold move.
This issue was detected when running Firefox on Linux on Apple silicon. Firefox on Mac just identifies as x64.
It’s probably not on purpose by YouTube. It’s stupid that they gate things on heuristics to begin with, but maybe the reasoning is that otherwise people would think YouTube wasn’t loading properly, when really it’s software decoding on an underpowered ARM PC that can’t handle the resolution.