Filter bubble

The filter bubble, or information bubble, is a term coined by the Internet activist Eli Pariser in his book of the same name. Pariser uses it to describe the phenomenon that websites use algorithms to predict which information is likely to be relevant to a given user, based on the information available about that user, for example the user's location, search history, and click behavior.

By applying these algorithms, websites tend to show the user only information that matches the user's previous views. The user is thus very effectively isolated in a "bubble" that tends to exclude information contradicting his or her previous views.

Prime examples of this are Google's personalized search results and Facebook's personalized news stream. In Pariser's opinion, the user is thereby less "burdened" by opposing views and becomes intellectually isolated in an information bubble.

Pariser gives an example in which one user searched Google for the keyword "BP" and received news about investment opportunities in British Petroleum, while another user entering the same query received information about the pollution caused by the Deepwater Horizon oil spill; the two searches thus yielded completely different results. According to Pariser, this isolating bubble effect can have negative consequences for civil discourse. However, there are opposing views holding that the effect is minimal and manageable.
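How the same query can yield diverging results may be illustrated with a minimal, purely hypothetical re-ranking sketch in Python; the tag-overlap scoring and all data below are invented and merely mimic the behavior Pariser describes, not any real search engine's method:

    # Minimal sketch of profile-based re-ranking; documents, tags, and
    # profiles are invented, and real search engines use far richer signals.
    def rank(results, interests):
        # Order results by how many tags they share with the user's interests.
        return sorted(results, key=lambda r: -len(r["tags"] & interests))

    results = [
        {"title": "BP shares: investment outlook",     "tags": {"finance", "stocks"}},
        {"title": "Deepwater Horizon spill pollution", "tags": {"environment", "oil"}},
    ]

    investor = {"finance", "stocks", "markets"}
    activist = {"environment", "climate", "oil"}

    print(rank(results, investor)[0]["title"])   # investment story ranked first
    print(rank(results, activist)[0]["title"])   # oil-spill story ranked first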

Concept

Personalization can be defined as follows:

According to Pariser, Google, for example, uses various "signals" (earlier search keywords, location, status updates from contacts on social networking sites, and so on) to adjust search results to the user. Facebook, on the other hand, observes a user's interactions with other users and filters posts from certain users. User activity (the click history) is thus translated into a single user identity, and on the basis of this identity certain information is filtered out. For this, Facebook used the so-called EdgeRank algorithm.
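How such an identity-based feed filter might weigh interactions can be sketched with EdgeRank's publicly reported ingredients (user affinity, edge-type weight, time decay); the constants and the exact decay function below are invented for illustration:

    import math, time

    # Sketch of an EdgeRank-style score as publicly described: for each
    # interaction ("edge") attached to a post, multiply user affinity, an
    # edge-type weight, and a time-decay factor, then sum over all edges.
    EDGE_WEIGHTS = {"comment": 4.0, "share": 6.0, "like": 1.0}  # hypothetical

    def edgerank_score(edges, now=None):
        now = now if now is not None else time.time()
        score = 0.0
        for affinity, edge_type, timestamp in edges:
            decay = math.exp(-(now - timestamp) / 86_400)  # fades over ~a day
            score += affinity * EDGE_WEIGHTS[edge_type] * decay
        return score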

Pariser describes his concept of the filter bubble more formally as "the personal information ecosystem that is created by these algorithms." Other terms used for the phenomenon are "ideological frames" and "the figurative sphere that surrounds you when you search on the Internet." The recent search history accumulates over time as an Internet user shows interest in certain topics by clicking the corresponding links, visiting friends' pages, placing certain films in a queue, reading matching headlines, and so on. To collect and analyze these data, website operators often use tracking services such as Google Analytics. Internet companies then use the information to tailor advertising to the needs and tastes of the specific user, or to place matching ads in a more prominent position in the search results.
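The accumulation described above can be pictured as a small aggregation step; the following sketch (with hypothetical topics and decay constant) turns a click log into the kind of interest profile that advertising could then be matched against:

    import time
    from collections import defaultdict

    # Sketch of turning a collected click history into an interest profile,
    # as the paragraph describes: recent clicks count more than old ones.
    # The half-life and weighting scheme are invented for illustration.
    def interest_profile(click_log, now=None, half_life=7 * 86_400):
        now = now if now is not None else time.time()
        profile = defaultdict(float)
        for topic, timestamp in click_log:              # one entry per clicked link
            age = now - timestamp
            profile[topic] += 0.5 ** (age / half_life)  # exponential decay
        return dict(profile)

    log = [("sports", time.time() - 3_600), ("finance", time.time() - 30 * 86_400)]
    print(interest_profile(log))  # "sports" dominates: that click is recent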

Pariser's concern resembles one expressed by Tim Berners-Lee in 2010 about the Hotel California effect (roughly: you can get in, but you can never leave), which occurs when social networking sites wall off and lock out content from competing sites in order to hold a larger share of the online community within their own network. The more you put in, the more you become locked in to the information within a specific website. The network turns into a "concrete bunker," and there is a risk of fragmentation of the World Wide Web, says Tim Berners-Lee. Users of Facebook, for example, are "trapped" in a certain way for good: if they decide at some point to leave the network, their user profile is deactivated but not deleted. All of their personal information and the log of all their activities on Facebook remain permanently on Facebook's servers. They can thus never completely leave the Facebook platform.

In his book The Filter Bubble, Pariser warns that a potential downside of filtered search is that it "excludes us from new ideas, subjects, and important information" and "creates the impression that only those things exist which our narrow self-interest knows." In his opinion this is potentially harmful both to the individual and to society. He criticizes Google and Facebook for offering content with "too many sweets and not enough carrots." He warns that "invisible algorithms doing editorial work on the web" expose us to only a limited range of new information and narrow our outlook. Pariser also believes that the adverse effects of the filter bubble extend to society at large, in the sense that it may "undermine civil discourse" and make people more susceptible and vulnerable to "propaganda and manipulation."

He writes:

Reactions

There are conflicting reports about the extent to which personalized filtering is actually applied and whether filtering is beneficial or harmful to users.

The analyst Jacob Weisberg, writing for the online magazine Slate, conducted a small, unrepresentative experiment to test Pariser's theory: five people with different political attitudes searched the Internet for exactly the same search terms. Across all five people, the search results were almost identical for four of the searches. From this he concludes that there is no filter bubble effect and that, consequently, the filter bubble theory exaggerates when it claims we are all "fed at the trough of a Daily Me." For his book review, Paul Boutin undertook a similar experiment with people with different search histories and came to a similar conclusion as Weisberg: nearly identical results. The Harvard computer science and law professor Jonathan Zittrain doubts the degree of distortion that Google achieves with its personalized filter; he says the impact of personalized search on the search results is small. In addition, there are reports that users can bypass Google's personalized search if they wish, for example by deleting the search history or by other methods. A spokesman for Google stated that additional algorithms have been integrated into Google's search engine "to limit personalization and to increase the diversity of the search results." To protect themselves against tracking by website operators, users can also install browser add-ons such as Adblock Plus for Mozilla Firefox.

Nevertheless, there are reports that Google and other search engine providers hold large amounts of information that would put them in a position to personalize the user's "Internet experience" even further in the future, should they choose to do so. One report suggests that Google can trace a user's past surfing habits even if the user has no Google account or is not logged in to one. Another report states that Google has collected data spanning ten years from various sources, such as Gmail, Google Maps, and other services offered alongside its search engine. This is contradicted, however, by a report that personalizing the Internet for each individual user is a major technical challenge for an Internet company, despite the huge amounts of available web data about users. The CNN analyst Doug Gross said that filtered search seems more useful for consumers than for citizens: it helps a consumer searching for "pizza" to find local delivery options, appropriately filtering out distant pizza providers. There are consistent reports that sites such as the Washington Post and the New York Times are seeking to build personalized information services that tailor content to users in such a way that they will probably like it, or at least agree with it.

One article deals more specifically with the problems of electronic filters. According to it, the user has no influence on the criteria by which filtering is done. The same holds for the signals evaluated by Google: the user learns neither which of these data are used nor how they may change. In addition, there is a general lack of transparency: the user knows neither how filtering is done nor that filtering is happening at all. Given the sheer amount of information on the Internet, however, filtering mechanisms are indispensable. Personalization is seen as the main problem of electronic filtering: the weighting of information is tailored to the individual user, who has no ability to switch the filter on or off or to control it by self-determined criteria. Finally, Pariser demands transparency and user control from the big filterers, such as Google and Facebook. A group of researchers at the University of Delft recommends that developers of filtering technology give greater consideration to user autonomy and transparency.
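What the demanded transparency and user control could look like can be sketched in a few lines; the class below is purely illustrative of the missing controls and does not correspond to any existing product's interface:

    # Illustrative sketch of the control the article finds missing: a filter
    # whose criteria are visible, adjustable, and can be switched off entirely.
    class TransparentFilter:
        def __init__(self, criteria):
            self.criteria = dict(criteria)  # user-visible weight per topic
            self.enabled = True             # the missing on/off switch

        def explain(self):
            # Show the user exactly which criteria are currently applied.
            return {"enabled": self.enabled, "criteria": self.criteria}

        def apply(self, items):
            if not self.enabled:
                return list(items)          # raw, unpersonalized order
            return sorted(items, key=lambda i: -self.criteria.get(i["topic"], 0.0))

    f = TransparentFilter({"politics": 2.0, "sports": 0.5})
    f.enabled = False                       # personalization can be disabled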

Critics consider the filter bubble thesis to be an observation made from a false perspective. Given the flood of information, there is no alternative to filtering techniques. Selection of information has always taken place, and it is a necessary consequence that other information is not selected. On the contrary, by creating digital spaces, the Internet actually makes fringe discussions easily accessible in the first place. Moreover, the theory is naive, because content is not simply either filtered or unfiltered; it is processed, enriched, or rearranged by many actors in many ways.

Paul Resnick, professor at the University of Michigan, summarizes the discussion about the filter bubble as follows: personalization should not be judged as bad per se. In his view, accurate personalization is less of a concern than not personalizing at all or personalizing poorly. Filterers have power and therefore a responsibility toward the public. Among the duties of filterers he counts, in particular, not carrying out hidden personalization and not manipulating personalization unilaterally.

Better personalization

Paul Resnick makes the following suggestions for better personalization:

  • Multidimensional preferences
  • Optimizing the balance between the exploration of user interests and preferences and their commercial exploitation (a minimal sketch of this trade-off follows the list)
  • Portfolio preferences
  • Delayed preference indicators
  • Impulses toward long-term preferences
  • Shared reference points
  • Features that bring other perspectives into play
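The exploration/exploitation balance in the second point can be illustrated with a standard epsilon-greedy rule; this is a generic sketch with invented parameters, not Resnick's own formulation:

    import random

    # Minimal epsilon-greedy sketch of the exploration/exploitation trade-off
    # named in the list above: usually show what the user is known to like
    # (exploitation), occasionally show something new (exploration).
    def pick_item(known_favorites, unexplored, epsilon=0.1):
        if unexplored and random.random() < epsilon:
            return random.choice(unexplored)    # explore: widen the bubble
        return random.choice(known_favorites)   # exploit: match known tastes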

Researchers at the University of Delft have dealt with the ethical issues of personalization and have worked out non-binding proposals on the subject.

Similar concepts

Relevance paradox

The concept of the filter bubble is similar to another phenomenon, described as the relevance paradox. Accordingly, individuals and organizations seek information that is held to be relevant from the outset but then turns out to be useless or of only partial interest, while information held to be irrelevant, but actually useful, is not taken into account. The problem arises because the real relevance of a particular fact or concept in such cases becomes apparent only after that fact is known at all. Before that, the very idea of learning a certain fact is discarded because of a misperception of its irrelevance. The information seeker is thereby caught in a paradox and fails to learn things he actually needs; he becomes a victim of his "intellectual blind spot." The relevance paradox has appeared in many situations throughout people's intellectual development and is therefore an important issue in science and education. A book entitled The IRG Solution addressed this problem in 1984 and proposed general solution approaches.

Echo chamber effect

A related concept is the echo chamber effect.

Search without personalization

Internet search engines that explicitly forgo personalized search in their default settings include, for example, Ixquick and DuckDuckGo.
