

Screenshot from Eli Pariser's TED talk showing you in relation to the potential information sources around you.

This article is based on my presentation at a conference held by the Digital Cultures and Theories Research Group (http://digitalculture.hu/).

It seems that your browsing experience is a totally controllable activity – at least if you’re a naive user. You should know by now that your online presence may be tracked by some giant companies. This has drawn the attention of Eli Pariser of MoveOn.org, who has observed some crucial points about surfing the Web. In his book, The Filter Bubble: What the Internet Is Hiding From You, he notes that our search results are based on our previous searches; that is, the Web shows us all the information it thinks is relevant to us. These personalized results appear because different algorithms are at work. Pariser explains that we are all living in ‘filter bubbles’, which effectively choose for us what kind of information we get. This might sound promising, but Pariser is mostly concerned with the adverse effects of this personalized Web.
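To make the mechanism concrete, here is a minimal sketch in Python – my own illustration, not Google’s actual algorithm – of how weighting results by a user’s click history can make the very same query return different orderings for different people:

```python
# Toy personalization: re-rank a fixed result list using a user's click history.
# (Illustrative only; the real ranking signals and weights are far more complex.)

from collections import Counter

def personalize(results, click_history):
    """Re-rank results so topics the user clicked on before rise to the top.

    results: list of (title, topic) tuples in their 'neutral' order.
    click_history: list of topics the user has clicked on in the past.
    """
    topic_weight = Counter(click_history)
    # Stable sort: ties keep the original, non-personalized order.
    return sorted(results, key=lambda r: -topic_weight[r[1]])

results = [
    ("Egypt travel deals", "travel"),
    ("Protests in Egypt", "politics"),
    ("History of ancient Egypt", "history"),
]

# Two users issue the identical query "Egypt" but see different orderings.
print(personalize(results, ["politics", "politics", "history"]))
print(personalize(results, ["travel", "travel"]))
```

Even this toy re-ranker produces two different "front pages" for one and the same query, which is exactly the effect Pariser describes.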

But what are these ‘filter bubbles’? When asked by Brain Pickings’ Maria Popova, Pariser answered (brainpickings.org):

Your filter bubble is the personal universe of information that you live in online – unique and constructed just for you by the array of personalized filters that now power the web. Facebook contributes things to read and friends’ status updates, Google personally tailors your search queries, and Yahoo News and Google News tailor your news. It’s a comfortable place, the filter bubble — by definition, it’s populated by the things that most compel you to click. But it’s also a real problem: the set of things we’re likely to click on (sex, gossip, things that are highly personally relevant) isn’t the same as the set of things we need to know.

Screenshot from Eli Pariser's TED talk showing you and the information that is filtered for you, in relation to the information sources around you.

In a New York Times article (nytimes.com), he gives an example:

But increasingly, and nearly invisibly, our searches for information are being personalized too. Two people who each search on Google for “Egypt” may get significantly different results, based on their past clicks.

In his TED talk, he also expands on the ‘Egypt’ search example and goes on to describe how our political affiliations can shape our search results, or even which of our friends’ updates show up in our Facebook feed.

Screenshot from Eli Pariser's TED talk showing you and only the information that is filtered for you, because the rest of the data is deemed non-relevant by intelligent algorithms.

Pariser traces the change back to 2009, when Google altered its search policy and ushered in a new era of personalization. Google’s algorithms try to ‘foresee’ what interests us, the users, and the results they return are not necessarily the most popular ones. So, for example, if I teach English, I am more likely to get results from Google that offer me the chance to study English (even though I might have been searching for teaching materials for my classes). Or take 2010, when Facebook decided to ‘invade’ the Web with the ‘Like’ button as part of what it calls the Open Graph system. It means that you can ‘like’ whatever content you want and it will also be displayed in your Facebook feed. This alone wouldn’t be a problem, but Facebook can track your page visits even when you are logged out. Pretty scary, huh? This has been the case since 25 September, says ZDNet’s Emil Protalinski (here and its update), though Facebook denies the allegations:
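To see why an embedded ‘Like’ button can, in principle, observe page visits at all, here is a toy simulation – my own sketch, not Facebook’s code: every page that embeds the plugin makes your browser fetch it from the plugin provider’s domain, and that request carries the provider’s cookie together with the referring page.

```python
# Toy simulation of cross-site visibility through an embedded social plugin.
# (Illustrative only; names and values are invented for this sketch.)

from collections import defaultdict

visit_log = defaultdict(list)          # what the plugin provider could record

def load_plugin(provider_cookie, referring_page):
    """Simulate the browser fetching the 'Like' button from the provider's domain."""
    visit_log[provider_cookie].append(referring_page)

# The same browser (same cookie) visits three unrelated sites that all embed
# the plugin; the provider's log now contains a partial browsing history.
cookie = "datr=abc123"                 # hypothetical browser cookie
for page in ["news-site.example/article", "shop.example/shoes", "blog.example/post"]:
    load_plugin(cookie, page)

print(visit_log[cookie])
```

Whether and how long such data is kept is exactly what the statement below addresses.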

“Facebook does not track users across the web,” a Facebook spokesperson said in a statement. “Instead, we use cookies on social plugins to personalize content (e.g. show you what your friends liked), to help maintain and improve what we do (e.g. Measure click-through rate), or for safety and security (e.g. Keeping underage kids from trying to signup with a different age). No information we receive when you see a social plugin is used to target ads, we delete or anonymize this information within 90 days, and we never sell your information.”

Another, more recent example I find indispensable to mention is the latest F8 conference, Facebook’s annual developer conference held in San Francisco, where Facebook announced some new features, including ‘Timeline’, a sentimental, virtual lifeline stretching back to your birth. You can retroactively upload the milestones of your life, if you like. Another new feature is ‘Ticker’, which separates important from less important news in your feed, and that leads to the uncanny conclusion that it is Facebook, not us, that decides which pieces of information matter.
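As a thought experiment, the kind of scoring such a feature might rely on could look like the sketch below; the signals and weights are invented for illustration, since Facebook has not published its formula.

```python
# A hedged sketch of how a feed *might* split "important" from "less important"
# stories. The signals (interactions, closeness to the poster, recency) and the
# weights are assumptions made for illustration only.

def importance(story):
    score = (story["likes"] + 2 * story["comments"]) * story["affinity"]
    return score / (1 + story["age_hours"])

stories = [
    {"id": "close-friend-wedding", "likes": 40, "comments": 15, "affinity": 0.9, "age_hours": 3},
    {"id": "acquaintance-lunch",   "likes": 2,  "comments": 0,  "affinity": 0.1, "age_hours": 1},
]

# The algorithm, not the reader, draws the line between top stories and ticker.
for s in sorted(stories, key=importance, reverse=True):
    print(s["id"], round(importance(s), 2))
```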

Visual kindly permitted by illustrator: Ilias Sounas, www.sounasdesign.com

These changes seem to conflict with the original idea behind the Web’s founding. Sir Tim Berners-Lee, the inventor of the Web, wrote an article on the Web’s neutrality [http://www.scientificamerican.com]:

The Web as we know it, however, is being threatened in different ways. Some of its most successful inhabitants have begun to chip away at its principles. Large social-networking sites are walling off information posted by their users from the rest of the Web. Wireless Internet providers are being tempted to slow traffic to sites with which they have not made deals. Governments—totalitarian and democratic alike—are monitoring people’s online habits, endangering important human rights.

If we, the Web’s users, allow these and other trends to proceed unchecked, the Web could be broken into fragmented islands.

I am still wondering whether Sir Tim Berners-Lee’s ‘fragmented islands’ are the same thing as Pariser’s ‘filter bubbles’. According to Berners-Lee, there is a good reason why the Web should stay neutral:

Why should you care? Because the Web is yours. It is a public resource on which you, your business, your community and your government depend. The Web is also vital to democracy, a communications channel that makes possible a continuous worldwide conversation.

The thing is, you can rightfully ask: is it good or bad to have these bubbles? Blogger Gabriel Weinberg sees this as a false dichotomy: personalization can be good if you are interested in local cinema listings or pizza delivery, but it can also have bad effects, such as ‘eliminating’ certain results from your Facebook feed or your browser. As with the English-teacher example, the question is how effective these algorithms really are.

So, my conclusion is the following: Pariser has pinpointed a crucial phenomenon in the way we browse the Web today; he has shown us that personalization requires data and brings privacy issues along with it. But I believe that, however important his idea is in theory and often in practice, it captures only part of the problem:

What if I search from another IP address? What if I buy a new computer, use private browsing, or use a fake account or multiple accounts? Or what if someone else uses my computer?
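These questions come down to which identity signals a personalizer can still key on. A toy illustration – all names and values invented – of why changing a single signal may not be enough to escape the profile:

```python
# Toy profile lookup keyed on several identity signals at once.
# (Hypothetical sketch; real systems combine far more signals.)

def resolve_profile(signals, known_profiles):
    """Return the first stored profile that matches any identifying signal."""
    for key in ("account", "cookie", "ip"):
        if signals.get(key) in known_profiles:
            return known_profiles[signals[key]]
    return None  # looks like a new, un-profiled user

known_profiles = {
    "user-42": "english-teacher profile",
    "10.0.0.7": "english-teacher profile",
}

# New IP address, but still logged in: same profile, same bubble.
print(resolve_profile({"account": "user-42", "ip": "203.0.113.5"}, known_profiles))
# Private browsing on a new machine, logged out: no match, 'fresh' results.
print(resolve_profile({"ip": "198.51.100.9"}, known_profiles))
```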

What do you think? Let me know!