
The Costs of Zuckerberg’s Digital Odyssey

I know I’ve made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal.

HAL, in 2001: A Space Odyssey

 

Last week Facebook CEO Mark Zuckerberg was on the Congressional hot seat, answering questions related to the Cambridge Analytica data breach. Many seemed shocked and appalled. “How could they sell my data?”

 

“How could they?” My smartass reaction: “Well, what did you expect?” Should we be surprised that Facebook or Google or any internet enterprise is selling our data?

 

To most of us, social media apps are a kind of magic. Install them and bang, you’re connected. We rarely think of the stuff behind them: data warehouses filled with countless servers, underground cables, towers and satellites. It’s engineers and programmers and algorithms. That’s a lot of expensive stuff and talent. How much do we pay for it?

 

Zero!

 

A Digital Utopia

Facebook, like Google and Twitter, seems to have invited us into a big, digital utopia. The first line of Facebook’s Data Policy reads: “We give you the power to share as part of our mission to make the world more open and connected.” From day one, digital gurus from Steve Jobs to Bill Gates to Tim Berners-Lee to Mark Zuckerberg have promised a democratic digital utopia where we are all connected. And we’ve fallen under their spell.

 

 

I remember when I first consciously encountered the digital magic. I had signed up for iTunes and bought an album. iTunes made recommendations for other albums I might like. The recommendations were good. iTunes really seemed to understand me and my musical taste. Was it magic? No – Apple was simply aggregating purchase data across its customers to predict my interests. And guess what – I’m predictable.
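That kind of aggregation can be surprisingly simple. Here is a toy sketch of one common approach – recommending whatever is most often bought alongside an album. The data and function names are invented for illustration; this is not Apple’s actual system:

```python
from collections import Counter

# Hypothetical purchase histories: each user's set of albums (invented data).
purchases = [
    {"Kind of Blue", "A Love Supreme", "Time Out"},
    {"Kind of Blue", "Time Out", "Mingus Ah Um"},
    {"Kind of Blue", "A Love Supreme"},
    {"Abbey Road", "Revolver"},
]

def recommend(album, histories, top_n=2):
    """Recommend the albums most often bought alongside `album`."""
    co_counts = Counter()
    for basket in histories:
        if album in basket:
            # Count every other album that appears in the same basket.
            co_counts.update(basket - {album})
    return [title for title, _ in co_counts.most_common(top_n)]

print(recommend("Kind of Blue", purchases))
```

With enough baskets, "people who bought X also bought Y" falls straight out of the counts – no mind-reading required.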

 

Algorithms Know

Every social media algorithm works on a similar basis: it collects data points, identifies patterns, and extrapolates from that information to estimate probabilities. In his fascinating and scary Homo Deus: A Brief History of Tomorrow, author Yuval Noah Harari describes research on the Facebook algorithm which found that, given 300 likes, Facebook could predict a person’s answers on a personality test better than their spouse could. Hmm. Flowers or chocolates? Should I just ask Facebook or Siri or Alexa?
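The collect-patterns-extrapolate loop can be sketched in a few lines. This is a toy nearest-neighbour illustration with invented likes and labels – a far cry from the models in the research Harari describes, but the same basic idea: people with similar likes tend to give similar answers:

```python
# Minimal sketch: guess a survey answer from "likes" by finding the most
# similar known user. All data here is invented for illustration.

def similarity(a, b):
    """Jaccard similarity between two sets of liked pages."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Known users: (pages they liked, their answer to one personality question).
known = [
    ({"hiking", "camping", "national_parks"}, "outdoorsy"),
    ({"hiking", "trail_running"},             "outdoorsy"),
    ({"chess", "sudoku", "math_puzzles"},     "analytical"),
    ({"sudoku", "crosswords"},                "analytical"),
]

def predict(likes):
    """Return the answer of the known user whose likes overlap most."""
    best = max(known, key=lambda pair: similarity(likes, pair[0]))
    return best[1]

print(predict({"hiking", "camping"}))  # → "outdoorsy"
```

Scale the rows to billions of users and the columns to hundreds of likes each, and the predictions stop feeling like a parlour trick.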

 

So, why the moral outrage over Facebook? I think it’s the fear that an algorithm knows us far better than we can imagine, and that certain actors or agents can use that knowledge to persuade us to act or buy or vote in a way that isn’t us.

 

The Elephant in the Brain

Can they? Probably. According to Kevin Simler and Robin Hanson in their book The Elephant in the Brain, we’re not very good at understanding our own motivations. Any third party, or any algorithm, with enough data points can likely predict our responses better than we can ourselves. Simply put, there’s a big difference between what we say we’ll do – and what we’ll actually do. And technology tracks what we do.

 

The Dark Cost

All of the interacting you do online – all of those likes and comments and purchases – has value to someone: to advertisers, to anyone in the persuasion business.

 

You see their ads online, on Facebook, on Google. You know they’re there. Somehow we’ve convinced ourselves that those ads don’t have an effect on us. But those are the ads we recognize. What about the ads that don’t even seem like ads, the messages or articles that are recommended by friends?

 

That is the dark cost of the internet: the ability to analyze, predict and persuade individuals in a way that all feels natural. The scary thing is that Facebook and Google and any number of social apps may know us a little too well. That is power – power that can be used for good or ill. That understanding is what we’re all waking up to.
