According to Eli Pariser, author of ‘The Filter Bubble’, the 21st century is about technology and how it controls our attention. The book is dedicated to what Pariser coins the ‘filter bubble’: a state in which certain information on the internet is kept invisible from us, deterring us from encountering things we do not already know. Chapters range from how our information and data are gathered, stored, filtered and shared on the Internet to the search algorithms that enable targeted marketing and advertising. He also warns us about the future of this online world, addresses its potential benefits for the creation of a civil society, and maps out the history of the press and journalism with regard to freedom of speech.
Pariser introduces some vocabulary unfamiliar to this reader: persuasion profiling, Acxiom, the Adderall society, sycophantic personalization, the flash crash, website morphing, AugCog, identity loops and digital environmentalists, besides contextualising concepts and terms such as responsible journalism, sentiment analysis, augmented reality, the tyranny of default and confirmation bias. He even elucidates how Google trained its translation software by applying falsifiability, with a nod to Karl Popper and the black swan fallacy. All of this serves the book’s central subject, the ‘filter bubble’, in which ‘personalization’ keeps parts of the internet hidden from us.
The day the world changed, for Pariser, was December 4, 2009, when Google announced personalized search for all its users. Its algorithms have developed to the point that they can gather, extract, filter and monitor our online behaviour, and the resulting personalization has become the currency of the online market in our data. According to Pariser this produces a ‘distortion effect’, one of the challenges posed by personalized filters. ‘Like a lens, the filter bubble invisibly transforms the world we experience by controlling what we see and don’t see. It interferes with the interplay between our mental processes and our external environment. In some ways it can act as a magnifying glass, helpfully expanding our view of a niche area of knowledge. But at the same time, personalized filters limit what we are exposed to and therefore affect the way we think and learn. They can upset the delicate cognitive balance that helps us make good decisions and come up with new ideas. And because creativity is also a result of this interplay between mind and environment, they can get in the way of innovation. If we want to know what the world really looks like we have to understand how filters shape and skew our view of it.’
Pariser not only explains how personalization works but argues that it is changing our experience of news, and the economics that determine how stories get produced. In his account, journalists worry about content while technology worries about delivering it to the right audience: finding the best set of eyeballs for a story is precisely the problem personalization solves. He begins by comparing old and new media, social media, and how we get and interpret news. Drawing on the Latin root of media (medius, that which is in the middle), he shows that the articles spotlighted today are chosen by analysing search queries and how people respond to their content. In the ‘attention economy’, the pages that get read are frequently the most topical, scandalous and viral. And as your identity shapes your media, your media in turn shapes your identity: who you become. Facebook in particular tends toward the creation of a single personality or identity, which is where the internet as a whole is heading.
Furthermore, the middlemen of traditional media disappeared with the ‘disintermediation’ of the 2000s. The new curators are Google, Amazon, Craigslist and Facebook. These ‘gatekeepers’ determine the content of the internet and what we pay attention to, whether through recommendations from friends, cultivation by human experts, or decisions made by code: algorithms. Most people are grateful for such relevant results. But is it really so beneficial if you encounter only the views of, and reinforcement from, those who already agree with you?
Referencing John Dewey, Pariser emphasizes that communication builds a stronger community and that the filter bubble, i.e. personalization, does not lead to collaboration. It also inhibits creativity, because real invention comes from what is off to the side, not from the mainstream or the already repeated. Creativity is thinking outside the box; curiosity brings unknown things into view and combines disparate ideas. ‘Learning is by definition an encounter with what we don’t know, what we never understood or entertained as possible.’ Tied up with this is the idea of serendipity, which plays a crucial role in creativity, and which is why we need to be able to control our searches rather than be slaves to personalization. Pariser’s point, then, is that personalization has given us an online public sphere manipulated by algorithms.
The book also demonstrates how social media works and how third parties steer our consumption behaviour through marketing schemes built on personalization, which encourages answers but does not stimulate questions. Pariser leaves us with suggestions for getting a grasp on the power of personalization, or for altering its effects, such as ‘crafting an algorithm that prioritizes falsifiability… an algorithm that aims to disprove its idea of who you are.’ We are giving away our data for free; instead we need to protect it and retain control over it. He closes by advocating ‘digital environmentalists’ who would safeguard, as an urgent priority for all of us, the early vision of radical connectedness and user control.
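Pariser’s falsifiability suggestion can be made concrete. What follows is a minimal, hypothetical sketch, not Pariser’s actual proposal and not any real system’s code (every name in it is invented): a toy recommender that mostly serves whatever its profile predicts the user wants, but with some probability deliberately surfaces the item it predicts the user cares least about, giving the user a chance to disprove the inferred profile.

```python
import random


class FalsifiableRecommender:
    """Toy recommender that occasionally tries to disprove its own user model."""

    def __init__(self, probe_rate=0.2, seed=None):
        self.scores = {}              # inferred interest per topic, built from clicks
        self.probe_rate = probe_rate  # how often to test, rather than confirm, the profile
        self.rng = random.Random(seed)

    def record_click(self, topic):
        # Strengthen the profile for topics the user actually engages with.
        self.scores[topic] = self.scores.get(topic, 0.0) + 1.0

    def recommend(self, candidates):
        # Rank candidates by inferred interest, highest first.
        ranked = sorted(candidates,
                        key=lambda t: self.scores.get(t, 0.0),
                        reverse=True)
        if self.rng.random() < self.probe_rate:
            # Probe: show the item the model predicts the user cares least about,
            # so the user can falsify the model's idea of who they are.
            return ranked[-1]
        return ranked[0]
```

With `probe_rate` at zero this collapses into an ordinary filter bubble that only ever confirms its own profile; the occasional probes are what keep the model falsifiable.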
‘The Filter Bubble’ reminds us that the information age not only spews data but also creates a sense of deprivation. What it does not discuss enough is how the new technologies of post-Fordism invest in human subjectivity through social networks and user-generated content, and therefore differ from those of the industrial era. This shift from alienation to participation brings with it the marketing of affect and subjective expression, automated by rankings and personalization. Nor does the book speculate on the dialectic between human curators and software curators, or on the consequences should code eventually replace human filters altogether. Although he does not address the Turing test (can machines act like humans?) directly, Pariser points out that the ‘human ability of walking the tightrope, to adjust to the demands of different environments and modalities is one of human cognition’s most astonishing traits.’ Artificial intelligence is not even close: the computer has not caught up with humans yet, however amazing what it can already do. And ‘as technology gets better and better at directing our attention, we need to watch closely what it is directing our attention toward.’