Me and the Algorithms
My wife laughed at me.
Why? Well, I’d mentioned reading an article online about an animation company; then I went to another newspaper site and found an ad for that same animation company in the sidebar.
“Wow,” I thought out loud, “what serendipity.” That’s when she laughed.
Geek that I am, I know that Google tracks our online behaviour and then provides that information to marketers so they can serve targeted advertising. I knew they did this to other people, but I just didn’t really think they did it to me.
There’s something inherently disturbing in the idea that an algorithm can predict your behaviour. We’re all unique individuals. Sure, I may be part of a group, but I’m still unique, right?
We hear a lot about “big data” these days. Through big data, we can tailor information delivery to a user and filter out all of the stuff that person isn’t really interested in. It’s far more effective than marketing the same thing to everyone. But do we want to be categorized by advertisers, corporations, or governments? Do you want to be labeled a conservative, a liberal, or a dissenter? How could that information be exploited? Is Big Brother knocking at the door?
These are the questions raised by whistleblowers like Edward Snowden. Our governments are listening. Is it for nefarious purposes? Probably not. Part of the reason for collecting information is to figure out efficient ways of delivering services. But how do we know?
A part of me loves algorithms. When I first got iTunes a few years ago and began buying digital downloads, I loved the recommendations iTunes delivered. Based on my choices, it would suggest music that other listeners with similar tastes had bought. These suggestions were amazing. They opened up a new world of music to me. Musicians I would never have heard of otherwise ended up on my playlists.
Now you see these services everywhere: on Netflix, Amazon, dating sites, and even on our own Volunteer site, GetInvolved.ca.
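For the curious, here’s a minimal sketch of the kind of “listeners who bought this also bought that” logic behind those recommendations. This isn’t Apple’s or Netflix’s actual system; it’s a toy item-based collaborative filter in Python, with made-up listeners and albums, that simply counts how often two albums show up in the same person’s library.

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase history: which albums each listener owns (entirely made-up data).
purchases = {
    "alice": {"Kind of Blue", "In Rainbows", "Blue Train"},
    "bob":   {"Kind of Blue", "Blue Train", "A Love Supreme"},
    "carol": {"In Rainbows", "OK Computer"},
    "dave":  {"Blue Train", "A Love Supreme", "Giant Steps"},
}

# Count how often each pair of albums appears together in one listener's library.
pair_counts = defaultdict(int)
for library in purchases.values():
    for a, b in combinations(sorted(library), 2):
        pair_counts[(a, b)] += 1

def recommend(my_albums, top_n=3):
    """Score albums I don't own by how often they co-occur with albums I do own."""
    scores = defaultdict(int)
    for (a, b), count in pair_counts.items():
        if a in my_albums and b not in my_albums:
            scores[b] += count
        elif b in my_albums and a not in my_albums:
            scores[a] += count
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

print(recommend({"Kind of Blue"}))
# e.g. [('Blue Train', 2), ('In Rainbows', 1), ('A Love Supreme', 1)]
```

The whole trick is co-occurrence: the system never has to understand the music, only the overlap between libraries. Which is also why it can only ever point you toward what similar people already own.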
So what’s the problem? I think part of it is prudishness. We’re sensitive about systems knowing our health history, sexual behaviour, and private thoughts. In a digital world, are we allowed to have private thoughts? Wouldn’t it be better to aggregate those private thoughts so systems could learn to tailor messages to our private selves? Starting to feel like you’re getting into a circular argument?
I think that at the heart of my issue with algorithms is the fact that they’re closed systems. They reference existing data. It’s a bit like the 1960s British show The Prisoner: the lead character is trapped in a thoroughly pleasant village but can’t seem to escape.
Sure, algorithms provide us with all these great choices, but you have to wonder whether they erect digital walls around our thoughts and actions. How can they provide us with outside-the-box thinking? Simply put, where will we find serendipity in a big data world?
Here is a link to a blog post on privacy from our series, PULL.
Curious to know the top 10 algorithms that have entered our lives? Check out The 10 Algorithms That Dominate Our World.