The problem is that so many browsers leverage hardware acceleration and offer access to the GPU. So yes, the browsers could fix the issue, but the underlying cause is the way GPUs handle data, which is what the attack leverages. Fixing it properly would likely mean not using hardware acceleration.
As these patterns are processed by the iGPU, their varying degrees of redundancy cause the lossless compression output to depend on the secret pixel. The data-dependent compression output directly translates to data-dependent DRAM traffic and data-dependent cache occupancy. Consequently, we show that, even under the most passive threat model—where an attacker can only observe coarse-grained redundancy information of a pattern using a coarse-grained timer in the browser and lacks the ability to adaptively select input—individual pixels can be leaked. Our proof-of-concept attack succeeds on a range of devices (including computers and phones) from a variety of hardware vendors with distinct GPU architectures (Intel, AMD, Apple, Nvidia). Surprisingly, our attack also succeeds on discrete GPUs, and we have preliminary results indicating the presence of software-transparent compression on those architectures as well.
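The core principle here is that any lossless compressor produces smaller output for more redundant input. A toy sketch of that idea, using Python's `zlib` as a stand-in for the GPU's undocumented compression scheme (the byte patterns are made up for illustration, not taken from the attack):

```python
import zlib

# A highly redundant pattern (think: a run of identical pixels)
# versus a high-entropy one (think: a noisy pattern). The secret
# pixel in the attack controls which kind of pattern gets rendered.
redundant = bytes(4096)              # 4096 identical zero bytes
varied = bytes(range(256)) * 16      # 4096 bytes cycling through all values

small = len(zlib.compress(redundant))
large = len(zlib.compress(varied))

# Lossless compression output size depends on input redundancy,
# so observing the "size" (via DRAM traffic / timing) leaks
# information about the data being compressed.
print(small, large)
assert small < large
```

In the attack itself the attacker cannot read output sizes directly; they infer them from coarse-grained timing, since bigger compression output means more DRAM traffic and cache occupancy.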
It sounds vaguely similar to some of the canvas issues, where acceleration creates different rendering artifacts that make it possible to identify GPUs and fingerprint browsers.
This only sorta works for today, and only if your friends never share images or videos online. With the ever-increasing number of people taking pictures, filming, and posting them online, the day is quickly approaching when you could be identified and tracked through other people’s content, security & surveillance cameras, etc.
If stores start adopting the tracking Walmart uses and Amazon’s biometric data collection, social media will be the least of your worries.
Ahh, Google’s tried and true method of throwing a million half-baked features at people before promptly cancelling them all. This will definitely work for them.
who tf subscribes to this?
I’d bet Mastodon saw an increase, but I haven’t seen the numbers.
It’s also hard to get a good count since it’s not centralized. So whatever numbers we do see could be wildly underreported.
The email in question says Abbott’s teams are working to “verify and confirm compatibility”, so it’s unclear whether this is an actual issue or just a precaution against what they think could be one.
Chrome lost its way years ago. I value not seeing ads or getting personalized content more than I value 99% of the chrome features.
Since Firefox finally fixed that weird memory fragmentation issue, it’s been pretty smooth sailing for me. Inspector & Debugger could use a few performance patches though.
I don’t think we know.
Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol
Exactly. The only other choices here are to buy used, with the risk that entails, or wait longer to upgrade.
In this particular case, it’s a bit more complicated.
I suspect the majority of 30x0 & 40x0 card sales continue to be for non-gaming or hybrid uses. If pure gamers stopped buying them today and held off for months, I suspect it wouldn’t make much of a difference to their bottom line.
Until there’s reasonable competition for training AI models at reasonable prices, people are going to continue buying their cards because it’s the most cost-effective thing – even at the outrageous prices.
Someone in another thread said he has a Mastodon account. Dunno if he posts there or not.
I’ve used Pi-hole and also just removed the network settings.
If you want to stream, I don’t know how useful any of these mitigations are. You’re giving them some data just by subscribing and using the service. Even if you share accounts, who knows what the apps collect.
Many sites have had to add a reveal-password option for people who have complicated passwords but don’t use password managers.
It’s low risk, but their numbers also come from fairly dated hardware, and it’s just a proof of concept. It can almost certainly be sped up significantly.