The Privacy Sandbox Has Always Been a Farce


By deciding to keep third-party cookies in Chrome, Google delivered the latest punchline in its years-long comedy of errors. And no one is laughing. 

As the chief technology officer of an adtech measurement company, I’ve spent years in the trenches of W3C meetings about the Privacy Sandbox, poring over documents and even submitting an API proposal. Throughout, I’ve witnessed a masterclass in corporate hubris. 

But why exactly couldn’t the world’s fourth most valuable company get this right? 

The monopoly elephant 

First, why did Google even bother with the Privacy Sandbox at all? They said they wanted to make “the web more private and secure for users.” In reality, Google was spurred into action by its paradoxical position as both a dominant monopolist in digital advertising and a laggard in the even bigger race against Apple to be the true heart of consumers’ digital lives (and thus privy to the minute personal details that will power AI agents).

Google makes money from search, but people only search when there’s something to find—Google needs a healthy, searchable open web. Web publishers need advertising to make money, and effective advertising requires data. Seeing weakness, Apple seized the marketing moral high ground by loudly attacking Google on privacy.

Google needed a way to simultaneously fend off Apple, protect the searchable web and preserve its panoptic view of a billion-plus users across YouTube, Google Search, Gmail and Android.

The flawed foundation 

And so Google appointed itself the arbiter of web privacy standards, unilaterally declared that “privacy” means “preventing cross-site tracking,” and tried to repeat its long tradition of forcing the entire digital advertising industry to play by its rules. 

The core problem was that Google’s definition of privacy was philosophically bankrupt. It ignored key common-sense aspects of privacy, such as the notion that users’ expectations vary depending on what information they’re sharing and with whom (i.e., philosopher Helen Nissenbaum’s concept of contextual integrity), or that harm comes from how data is used, not how it’s collected.
