
Wednesday, 14 September 2011

How Google reads your mind

Regular readers of this blog may recall my mentioning Eli Pariser's work a couple of times. My review of Pariser's book The Filter Bubble: What The Internet Is Hiding From You has just been published in the autumn edition of 'The Author', the magazine of The Society of Authors. I have reproduced the review below for your interest.




On the face of it, the internet, search engines, and Google in particular have been a great boon to writers and others who do research as a necessary part of their work. Indeed, who would deny it? Remember the old days when you camped yourself in the library for hours on end, leafing through books that often delivered scant reward – insufficient or out-of-date information – or fiddled with spools of microfiche for that elusive news article? No question, we should be duly grateful for the luxury of sitting at home (before the same machine that we use to set down our work) with the ability to trawl the world’s resources rapidly and at no great expense. But are we being allowed to make the most of this wonderful opportunity?



If, like me, you had assumed Google’s sophisticated PageRank algorithm was offering you the best, most authoritative, most useful selection from the vast sources of information it holds, then you need to be aware of the corrective presented in Eli Pariser’s new book The Filter Bubble. The nub of Pariser’s argument is that Google, far from facilitating an expansion of our world view in proportion to the expansion of available resources, is in fact limiting us by dint of our previous choices. We are being drip-fed indoctrination by our own ideas.



Pariser has upended my assumption that the spread of results I’d get from typing a query into my Google search box would be exactly the same as... yours, for example, provided we both entered the same search terms. Not so. Apparently Google uses 57 signals – ranging from where the browser is located to what items one has searched for before – to decide what site-links it is going to offer a particular user. The process is concentric because the more we access the internet for everything from information gathering to purchasing, the more Google gets to ‘know’ about us individually, and the more bounded each of us becomes by the range of options it presents that are in easy reach. It’s a systematic process of personalisation by which Google builds a theory of identity based on ‘you are what you click’.
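The feedback loop Pariser describes can be sketched in a few lines of code. This is a toy model only – the topics, weights and scoring rule are my own illustrative assumptions, not Google's actual signals or algorithm:

```python
# Toy illustration of signal-based personalisation (NOT Google's real
# algorithm): each result's base relevance score is nudged upward in
# proportion to how often the user has previously clicked its topic.
from collections import Counter

def personalised_ranking(results, click_history):
    """results: list of (url, base_score, topic); click_history: list of topics clicked."""
    clicks = Counter(click_history)
    total = sum(clicks.values()) or 1  # avoid division by zero for a fresh user
    def score(result):
        url, base, topic = result
        affinity = clicks[topic] / total      # share of past clicks on this topic
        return base * (1 + affinity)          # prior choices boost familiar topics
    return sorted(results, key=score, reverse=True)

results = [("news.example/darfur", 0.9, "world-news"),
           ("cooking.example/soup", 0.8, "recipes")]
history = ["recipes", "recipes", "recipes", "world-news"]
ranked = personalised_ranking(results, history)
```

With three past clicks on recipes and one on world news, the soup article overtakes the more relevant news story; a user with no history would see them in their base order. That, in miniature, is how prior choices narrow what surfaces first.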



As a profit-seeking organisation, Google is ultimately not in the business of providing us with the best information; it is in the business of delivering us to its advertisers, sponsors and funders – the ones who pay the piper. The most obvious example is the sponsored links that appear at the top of one’s results page. Perversely, I avoid these, but Google has much more subtle ways of using the data I have previously provided to get me to places based on its commercial imperatives rather than my intellectual curiosity or professional need. Because Google’s behavioural marketing is invisible, and because I’m generally unaware that my choices are being made for me in this way, I am off my guard to a degree that I would not be if I was, say, reading a newspaper with a known political viewpoint, or speaking to a salesperson who I know has a vested interest in selling me her wares. Google can fool and flatter me into thinking I’m making the decision without such influences, and that makes me (and you) the greater fool.



Of course it’s not only Google doing it. Amazon, iTunes and many others are sirens of sycophancy, calling our names and casting out ‘recommendations’, drawing us through filters into a subterranean model of our world where increasingly what we see are distorted or caricatured versions of our own reflection inside a grotto of product.



As information-gatherers we may be grateful for a shortcut through the morass of available data. As consumers we may welcome some guidance to our purchases based on known preferences – it can save time and effort, and help us get to things we might want or didn’t otherwise know existed, though it also reduces the opportunity for serendipitous pleasures – but as Pariser argues, what might be good for consumers is not necessarily good for citizens.



Nor are we reaching out fully to the global promise of the internet, which has the potential of widening our horizons, stimulating creativity, and putting us in instant touch with treasures of learning previously unobtainable. Pariser’s frustration with the way centralised entities have lassoed this promise is evident and well articulated in his thought-provoking book. His disappointment must be shared by all internet idealists who believed that information could be freed by the new technology, only to learn that it has merely been transferred to savvy opportunists who, appreciating the old adage that information is power, have steadily, almost invisibly, salted it away for redistribution on their own terms.



Tuesday, 19 July 2011

Are we Liking ourselves into La-La Land?

In an earlier post I referred to Eli Pariser’s new offering The Filter Bubble: What the Internet is Hiding From You. (Look out for my review of the book in the next edition of The Author). Last time, I focused on Google’s mission to build a theory of identity for each user based on ‘you are what you click’, and how that tends over time to narrow rather than widen one’s choices. Today, following another of Pariser’s themes, I want to focus on Facebook and its alternative theory of identity: ‘you are what you share’, and how that leads to promotion of the trivial but entertaining at the expense of the serious but important.





Unlike Twitter, which has famously been used to rally protest, to aggregate political concerns and to promulgate initiatives - and which often aims to reach out to an audience beyond family, friends or fans – Facebook is almost exclusively social lite. It’s typically fluffy and cuddly, or sassy and bantering, engaging in what linguists call phatic interaction, with the emphasis on the social rather than the informational aspects of communication; and the images exchanged on Facebook are generally supportive of that social purpose. As Pariser says rather misanthropically: ‘The creators of the Internet envisioned something bigger and more important than a global system for sharing pictures of pets.’



But the fact is many millions of us are using the medium for just that sort of activity, and even more so now that mobile, hand-held and handy devices are becoming common. If we are what we share, then what we are sharing is on the whole pretty frothy stuff.

      

By making it fun and easy to do, the Facebook providers have encouraged us to entertain each other in this way, and some would argue they have added to the store of human happiness and fellowship as a result. Maybe so. But have they tilted our attention away from some of the harder realities of life? Are we in danger of becoming like the Eloi in H G Wells' The Time Machine, frolicking like children in the sunshine, unwary of the Morlocks waiting in the shadows, or rather in denial of them and the threat they pose?



Pariser points up one neat little device that may be contributing to a skewed, rose-coloured view of the world. Facebook has made it possible to press the Like button on any item on the Web. With one quick click we can let our Facebook friends know what we are enjoying, and by the same action we increase the likelihood of that particular item being seen by others, because our Liking it improves its ranking.
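This rich-get-richer dynamic can be illustrated with a toy simulation (my own sketch, not Facebook's actual News Feed algorithm): stories are ordered by Likes, higher positions attract more new Likes each round, and so an early lead in likeability compounds.

```python
# Toy model of the Like feedback loop (illustrative only): the feed is
# sorted by Like count, and each round the stories in the top slots
# earn more new Likes than those further down, widening the gap.
def run_feed(stories, rounds=5):
    """stories: dict of title -> Like count; returns titles by final visibility."""
    for _ in range(rounds):
        ranked = sorted(stories, key=stories.get, reverse=True)
        for position, title in enumerate(ranked):
            # top of the feed gains the most Likes, bottom gains none
            stories[title] += max(len(ranked) - position - 1, 0)
    return sorted(stories, key=stories.get, reverse=True)

feed = {"Onion soup recipe": 5, "Darfur report": 4}
final_order = run_feed(feed)
```

Starting just one Like apart, the soup recipe finishes the simulation well clear of the Darfur report: likeable stories gain visibility, and visibility gains them further Likes.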





Now, what sort of thing are we likely to be Liking? Or, to put it the other way round, what are the stories that would seem inappropriate to Like? To use Pariser’s examples: ‘It’s easy to push Like and increase the visibility of a friend’s post about finishing a marathon or an instructional article about how to make onion soup. It’s harder to push the Like button on an article titled, “Darfur sees bloodiest month in two years.”’



The Facebook team that developed the Like button originally considered a number of options, including stars and a thumbs-up sign (rejected as a stand-alone because it’s an obscene gesture in some countries); they even considered Awesome, but chose Like eventually because it was more universal. That apparently minor design choice may have had major unintended consequences, for it has almost certainly determined that we push the button on stories that are more friendly, less challenging, more emotional perhaps but less troubling, more likeable. So these are the stories that get more attention on the Web and subtly, steadily alter our world view. Like the Eloi, we prefer to face the sunshine.



Pariser asks us to imagine that next to each Like button on Facebook was an Important button. You could tag an item with either Like or Important or both. This one simple development could be a very useful corrective, could help to restore the balance to a certain degree. Not entirely, for it seems to be part of our nature to look for the things we are likely to enjoy - the entertaining, the humorous, the titillating. We will always want to share gossip and to seek out the stories of celebrity, of scandal and success. But we need to be aware of what else is around us, and to share that too. We cannot ignore those things that are important to sections of humanity who may not be part of our immediate social network (our comfort zone), because we can be sure that one day soon those things we've chosen to ignore will force their importance on us.




Thursday, 16 June 2011

Google is a whirlpool looking for a new fool

On the face of it, the internet, search engines, and Google in particular have been a great boon to writers and others who do research as a necessary part of their work. Indeed, who would deny it? Remember the old days when you camped yourself in the library for hours on end, leafing through books that often delivered scant reward – insufficient or out-of-date information – or fiddled with spools of microfiche for that elusive news article? No question, we should be duly grateful for the luxury of sitting at home (using the same machine that we use to set down our work) with the ability to trawl the world’s resources rapidly and at no great expense. But are we being allowed to make the most of this wonderful opportunity?



Until recently, I had assumed that Google’s stupendous PageRank algorithm was giving me the best, most authoritative, most useful selection from the vast sources of information on offer, provided I got my search terms right. That was before I became aware of the argument presented in Eli Pariser’s new book The Filter Bubble, which reveals that Google, far from facilitating an expansion of my world view in proportion to the expansion of available resources, is in fact limiting me by my own previous choices. Pariser calls this a bubble; I prefer to think of it as a whirlpool, sucking me into its ever-decreasing circles.



Until recently, I had assumed that the spread of results I got from typing a query into my Google search box would be exactly the same as another’s results, provided we both entered the same search terms. It has taken Pariser to explain to me that this is not true. Apparently Google uses 57 signals – ranging from where the browser is located to what items I have searched for before – to decide what site-links it is going to offer me. The process is concentric because the more I use the internet for everything from information gathering to purchasing, the more Google gets to know about me, and the more bounded I become by the range of options it presents that are in easy reach.



Moreover (and this I suppose I did know, but never really thought through the implications) Google is not in the business of providing me with the best information; it is in the business of delivering me to its advertisers, sponsors and funders – the ones who pay the piper. The most obvious example is the sponsored links that appear at the top of one’s results page (perversely, I avoid these); but Google has much more subtle ways of using the data I have previously provided to get me to places based on their commercial imperatives rather than my intellectual curiosity or professional need. Because their behavioural marketing is invisible, and because I’m generally unaware that my choices are being made for me in this way, I am off my guard to a degree that I would not be if I was, say, reading a newspaper with a known political viewpoint, or speaking to a consultant who I know has a vested interest in selling me his product or service. Google can fool me into thinking I’m making the decision without such influences, and that makes me the greater fool.



And of course it’s not only Google that’s doing it. For me, Amazon is another major whirlpool, as are iTunes and a number of others whose filters and ‘recommendations’ are drawing me into a subterranean version of my world where increasingly what I see are distorted or manipulated versions of my own reflection.



As a consumer, I may welcome some guidance to my purchases based on my known preferences – it can save time and effort, and help me get to things I might want or didn’t otherwise know existed, though it also reduces the opportunity for serendipitous pleasures – but as Pariser argues, what might be good for consumers is not necessarily good for citizens.



Nor does it make the best use of the global promise of the internet, which has the potential of widening our horizons and putting us in instant touch with treasures of learning previously unobtainable. It’s a misuse of the most powerful instrument of our age. We have embarked on a world-wide adventure but we have put ourselves in the hands of a navigator we can’t really trust. There are oceans to explore, but the more we sail on them, the greater the perils of the whirlpool.