Google reminds us why you can’t trust any company with your data

'But the privacy team only found out about the 265.com data access after The Intercept revealed it, and were "really pissed," according to one Google source.' — The Intercept.

While people increasingly see Facebook as a threat to privacy and data security, Google has been largely immune to the same sentiment. Even after a long series of data leaks and abuses, many still trust Google with their data and regard the company as almost the flip side of Facebook. It's a testament to just how much goodwill Google has managed to accrue over the years. But is that trust reasonable, or even rational, any more?

The Intercept, writing about Google's controversial Chinese search engine project, Dragonfly:

Under normal company protocol, analysis of people's search queries is subject to tight constraints and should be reviewed by the company's privacy staff, whose job is to safeguard user rights. But the privacy team only found out about the 265.com data access after The Intercept revealed it, and were "really pissed," according to one Google source. Members of the privacy team confronted the executives responsible for managing Dragonfly. Following a series of discussions, two sources said, Google engineers were told that they were no longer permitted to continue using the 265.com data to help develop Dragonfly, which has since had severe consequences for the project.

Also:

The internal dispute at Google over the 265.com data access is not the first time important information related to Dragonfly has been withheld from the company's privacy team. The Intercept reported in November that privacy and security employees working on the project had been shut out of key meetings and felt that senior executives had sidelined them.

It's a stark reminder that you can't trust any company with your data. The only thing you can trust is a company that never has access to it in the first place.

Let's contrast this with Apple's policies:

On the topic of Apple product design, Cook reiterated that privacy is integral to the design process. Apple doesn't just think of new things to make, it deliberately makes sure those things are built with privacy and anonymity from the start and throughout the process.

I've heard about some of Apple's privacy team's work in the past, most recently with Face ID. It's safe to say nothing goes forward if Apple can't ensure privacy every step of the way. Don't collect the data unless you absolutely have to. Anonymize and encrypt the data if you absolutely have to collect it. Delete the data as soon as you possibly can.

Instead of "We can't build this product without violating privacy, so let's violate privacy", the company has long held and exemplified that it would choose "so let's not build this product."

In other words, the ends don't justify the means. And when companies tell you who they are, repeatedly, believe them.