this post was submitted on 21 Jul 2023
927 points (100.0% liked)

The much maligned "Trusted Computing" idea requires that the party you are supposed to trust actually deserves that trust, and Google is DEFINITELY NOT worthy of being trusted. This is a naked power grab to destroy the open web for Google's ad profits, no matter the consequences: it would put heavy surveillance in Google's hands, eliminate ad-blocking, break any and all accessibility features, and obliterate any competing platform. It is very much opposed to what the web is.

[–] heliodorh@beehaw.org 33 points 1 year ago (2 children)

I'm a non-techie and don't understand half of this, but from what I do understand, this is a goddamn nightmare. The world is seriously going to shit.

[–] JVT038@feddit.nl 54 points 1 year ago (4 children)

My ELI5 version:

Basically, the 'Web Environment Integrity' proposal is a new technique that verifies whether a visitor of a website is actually a human or a bot.

Currently, there are captchas where you need to select all the crosswalks, cars, bicycles, etc., which check whether you're human, but bots can sometimes bypass these.

This new 'Web Environment Integrity' thing goes as follows:

  1. You visit a website
  2. Website wants to know whether you're a human or a bot.
  3. Your browser (or the 'client') will request an 'environment attestation' from an 'attester'. This means that your browser (such as Firefox or Chrome) will request approval from some third party (like Google), and that third party (the 'attester') will send your browser a signed message which basically says 'This user is a bot' or 'This user is a human being'.
  4. Your browser receives this message and will then send it to the website, together with the 'attester public key'. The website can use the 'attester public key' to verify that the message really came from an attester it trusts and hasn't been tampered with, and will then check whether the attester says you're human or not.
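The four steps above can be sketched as a toy model. Everything here is invented for illustration: a real attester would use hardware-backed public-key signatures (so the website only needs the public key), but this sketch fakes the signature with an HMAC so it stays self-contained.

```python
import hashlib
import hmac

# Toy stand-in for the attester's signing key. In the real proposal this
# would be an asymmetric key pair, and the website would hold only the
# public half.
ATTESTER_KEY = b"hypothetical-attester-secret"

def attest(environment_ok: bool) -> dict:
    """Step 3: the attester issues a signed verdict about the client."""
    verdict = "human" if environment_ok else "bot"
    signature = hmac.new(ATTESTER_KEY, verdict.encode(), hashlib.sha256).hexdigest()
    return {"verdict": verdict, "signature": signature}

def website_verify(token: dict) -> bool:
    """Step 4: the website checks the token really came from the attester,
    then reads the verdict."""
    expected = hmac.new(ATTESTER_KEY, token["verdict"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["signature"]):
        return False  # forged or tampered token
    return token["verdict"] == "human"
```

So the website never inspects your machine itself; it just trusts (or doesn't trust) whatever the attester signed. That delegation is exactly where the power concentrates.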

I hope this clears things up and if I misinterpreted the GitHub explainer, please correct me.

The reason people (rightfully) worry about this is that it gives attesters A LOT of power. If Google decides they don't like you, they won't tell the website that you're a human. Or maybe, if Google doesn't like the website you're trying to visit, they won't even cooperate with attesting. Lots of things can go wrong here.

[–] arthur@lemmy.zip 38 points 1 year ago* (last edited 1 year ago) (1 children)

And the attester will know where you're navigating, always.

[–] Lowbird@beehaw.org 19 points 1 year ago (1 children)

It sounds like VPNs would also get flagged as bots? Or could easily be treated as such.

[–] floofloof@lemmy.ca 25 points 1 year ago* (last edited 1 year ago) (2 children)

They could get rid of ad blockers, anonymity, Tor, VPNs, Firefox, torrenting sites, independently hosted websites, open-source servers and non-Google Linux clients all in one go. It would be a corporate dream come true.

Or we could stop using their tools and services and fork the internet run for people off from the internet run for profit. It doesn't need to be big or slick; it just needs to be there.

[–] Senex@reddthat.com 12 points 1 year ago

I like the idea of Internet 2.0. Kinda like what we are doing here on Lemmy. Corporate ruins it, we build it anew!

[–] Tau@sopuli.xyz 3 points 1 year ago

There are even alternative root servers, so we could escape from TLD hell too

[–] HarkMahlberg@kbin.social 19 points 1 year ago (2 children)

Your final paragraph is the real kicker. Google would love nothing more than to be the ONLY trusted Attester and for Chrome to be the ONLY browser that receives the "Human" flag.

[–] will6789@feddit.uk 11 points 1 year ago

And I'm sure Google definitely wouldn't require your copy of Chrome to be free of any Ad-Blocking or Anti-Tracking extensions to get that "Human" flag /s

[–] jarfil@beehaw.org 2 points 1 year ago

Too late.

Apple and most hardware manufacturers have been the ONLY trusted attester on their own hardware for years already.

Also Microsoft on most PCs.

[–] heliodorh@beehaw.org 3 points 1 year ago

Appreciate you and everyone else who broke this down for me!

[–] jarfil@beehaw.org 2 points 1 year ago
  1. You open an app...

The rest already works like that.

You can replace Google with Apple, Microsoft, any other hardware manufacturer, or any company's hardware attestation software.

[–] ricecake@beehaw.org 7 points 1 year ago

So, a lot of the replies are highlighting how this is "nightmare fuel".
I'll try to provide insight into the "not nightmare" parts.

The proposal is for how to share this information between parties, and they call out that they're specifically envisioning it being between the operating system and the website. This makes it browser agnostic in principle.

Most security exploits happen either because the user's computer is compromised, or because a sensitive resource, like a bank, can't tell whether it's actually talking to the user.
This provides a mechanism where the website can tell that the computer it's talking to is actually the one running the browser, and not just some intermediary, and it can also tell whether the end computer is compromised, without having direct access to that computer.
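The "not just some intermediary" property typically comes from binding the attestation to a fresh challenge from the website, so a captured token can't be replayed later. A toy sketch of that idea (all names invented; a real scheme would use a hardware-backed asymmetric key rather than a shared secret):

```python
import hashlib
import hmac
import secrets

# Toy stand-in for a key sealed inside the device's secure hardware.
DEVICE_KEY = b"hypothetical-device-secret"

def website_issue_challenge() -> bytes:
    """The website sends a fresh random nonce with each attestation request."""
    return secrets.token_bytes(16)

def device_attest(challenge: bytes) -> bytes:
    """The device signs the website's challenge, proving this response
    was produced for *this* request and not recorded earlier."""
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def website_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A response computed for one nonce fails verification against any other nonce, which is what stops an intermediary from replaying someone else's "this machine is fine" token.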

The people who are claiming that this provides a mechanism for user tracking or leaks your browsing history to attesters are perhaps overreacting a bit.

I work in the software security sector, specifically with device management systems that are intended to ensure that websites are only accessed by machines managed by the company, and that they meet the configuration guidelines of the company for a computer accessing their secure resources.

This is basically a generalization of already existing functionality built into macOS, Windows, Android, and iPhones.

Could this be used for no good? Sure. Probably will be.
But that doesn't mean there aren't legitimate uses for something like this, or that the authors are openly evil.
This is a draft proposal, still in the preliminary stage of discussion with the browser community.