Last weekend, The Observer carried a story entitled ‘Should we be scared of the made-to-measure internet?’
I love digital scaremongering. I’ve enjoyed reading about it in all its forms over the years. In fact, the only thing I love more than digital scaremongering is huge, hokey PR pieces in national newspapers hawking books based around the latest media-friendly theories.
The author Eli Pariser’s central complaint is that, since December 2009, Google has been basing the results of any search you care to enter on an automatically constructed and ever-evolving model of who it thinks ‘you’ are, and what you tend to like.
The search for ‘you’
Google’s partiality is something that has been known to many of us for years. Heck, we like it so much, we even built an industry around manipulating its results to make our clients more visible. Ladies and gentlemen, I give you… PRs and SEOs! Don’t look directly at them, and never, ever feed them after midnight.
However, the issue here is something different: two people can enter the same search term and get two completely different sets of results, modelled on Google’s idea of what people ‘like them’ would be interested in.
As Pariser points out, the concept of tailored media designed to communicate views that are automatically pleasing to you is far from new. Fox News, golfing channels and The God Channel are all designed principally to speak to narrow audiences.
Pariser’s objections to Google’s personalisation service revolve around three main complaints:
1. Your ‘filter bubble’ is tailored to just one person. You. There is no ‘shared experience’, as with other forms of tailored media. It really is the ultimate solipsistic experience.
2. You can’t see it, nor determine the accuracy of its assumptions, upon which your net presence is founded.
3. You have no choice in whether or not it operates. It just does.
Going round in social circles
Most of us have, at some point, lost a few hours of the day to cycling through linked networks of our favourite ‘types’ of sites or videos. My guilty corner of the web includes The Daily What, Boing Boing and associated ‘nonsense’ websites that, thanks to the link economy, can refer to each other in an almost endless loop of irrelevance.
Well, Pariser argues, imagine a much bigger version of that happening, only with all the information on the web, and with the invisible hand of Google automatically steering you back to the things you like and are interested in.
Why recommend things that are new, scary, or frontier-expanding when you’re far more likely to click around sites Google already knows you like? It’s a kind of digital navel-gazing, whereby we’re all doomed to move in ever-decreasing circles of our own self-referential guff, becoming ever more limited in our views and less exposed to interesting or unpleasant new ones.
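To make that feedback loop concrete, here’s a toy sketch in Python. It isn’t Google’s algorithm (whose signals are undisclosed and far more complex); the topics, scores and weights are invented purely for illustration. The point is the loop itself: a ranker that blends fixed relevance with a learned interest profile, where every click sharpens the profile, and a sharper profile makes the same click likelier next time.

```python
# Toy illustration of a personalisation feedback loop. All topics and
# numbers are invented; this is not Google's actual ranking method.
from collections import Counter

TOPICS = ["politics", "science", "sport", "cat-videos"]

def rank(relevance, profile):
    """Order topics by fixed relevance plus the user's learned affinity."""
    clicks = sum(profile.values()) or 1
    return sorted(
        TOPICS,
        key=lambda t: relevance[t] + profile[t] / clicks,
        reverse=True,
    )

# A deliberately vague query: every topic starts equally relevant.
relevance = {t: 1.0 for t in TOPICS}
profile = Counter()

for step in range(5):
    results = rank(relevance, profile)
    profile[results[0]] += 1        # the user clicks the top result...
    print(step, results)            # ...which pushes it further up next time
```

After a single click, the self-reinforcing profile term keeps the same topic on top of every subsequent search: exactly the ever-decreasing circle described above.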
Searching for the truth
Whether Eli Pariser has simply seized on the latest clever-sounding scare story for the chattering classes, or actually has a point, remains to be seen. All I’ll say is this: my private-time news-media consumption has never been narrower or more Chinese-walled than it is today.
And all those predictions about the net limiting our attention spans seem to have come true, haven’t they? Hell, you probably haven’t even made it this far.
The real question I think this poses: will unbiased search results become the new ‘net neutrality’ debate? If so, it’s in danger of being equally stillborn.
In any case, an emerging understanding of how Google returns these results would ultimately benefit the industry, allowing us to better understand how to give our clients the best search returns across the widest possible range of individual profiles. From this point of view, greater transparency will encourage better engagement.