Google chooses codecs based on what it guesses your hardware will decode (iPhones get HEVC, Android gets VP9, etc.). They just didn't put much thought into ARM-based home devices outside of a specific few like the Shield.
My by-now rather ancient RK3399 board can hardware-decode both at 4K 60 fps. That has nothing to do with it being aarch64; it's because Rockchip included a beast of a VPU (the chip was originally designed for set-top boxes).
How about, dunno, asking the browser what kind of media it would prefer?
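That mechanism actually exists: the Media Capabilities API (navigator.mediaCapabilities.decodingInfo) lets a site ask the browser whether it can decode a given codec at a given resolution and frame rate, and whether it can do so power-efficiently (i.e. likely in hardware). A minimal sketch of codec negotiation built on it; the codec strings, resolution, and bitrate below are illustrative assumptions, not anything a real site is known to use:

```ts
// Candidate codecs in the order the site would prefer to serve them.
const candidates = [
  { label: "HEVC",  contentType: 'video/mp4; codecs="hvc1.1.6.L120.B0"' },
  { label: "VP9",   contentType: 'video/webm; codecs="vp09.00.40.08"' },
  { label: "H.264", contentType: 'video/mp4; codecs="avc1.640028"' },
];

async function preferredCodec(): Promise<string | undefined> {
  for (const { label, contentType } of candidates) {
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "media-source", // streaming playback via MSE
      video: {
        contentType,
        width: 3840,
        height: 2160,
        bitrate: 20_000_000, // 20 Mbit/s, an illustrative figure
        framerate: 60,
      },
    });
    // Take the first codec the browser claims to decode power-efficiently,
    // which in practice usually means a hardware decoder.
    if (info.supported && info.powerEfficient) return label;
  }
  return undefined; // fall back to whatever default the site ships
}
```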
Why wouldn't my browser ask for the codecs it prefers, instead of the website trying to guess my computer's hardware?
Lots of hardware lies about its useful capabilities.
Can you run 4K? Of course. But can you run more than 4 frames a second?
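That distinction is exactly what the Media Capabilities result separates out: `supported` means a decoder for the format exists at all, while `smooth` is the browser's own estimate that playback will actually keep up at the requested resolution and frame rate. A sketch, with the codec string and bitrate as illustrative assumptions:

```ts
// Distinguish "can decode 4K at all" from "can decode 4K60 without
// dropping to a slideshow". Codec string and bitrate are assumed values.
async function canPlay4k60Smoothly(): Promise<boolean> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/webm; codecs="vp09.00.50.08"', // VP9 profile 0, level 5.0
      width: 3840,
      height: 2160,
      bitrate: 20_000_000,
      framerate: 60,
    },
  });
  // info.supported alone is the "of course it runs 4K" answer;
  // info.smooth is the "more than 4 frames a second" answer.
  return info.supported && info.smooth;
}
```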