June 20, 2011 § 23 Comments
TED Talk on The Filter Bubble.
More on The Filter Bubble, including “10 Things You Can Do”: http://www.thefilterbubble.com/
The filter bubble can be helpful when it comes to searching for something you may want to see, but I also agree that it shields us from things that we are not interested in seeing but should see. For instance, Eli talks about how one friend received information on the protests in Egypt while his other friend received information on vacationing in Egypt. I agree that the filter bubble is more efficient and convenient, but I also see it keeping us from things that may be of importance. On Facebook I have almost 300 friends, and recently I have begun to notice that I see at most 50 of them on my home page. This can be convenient when I'm looking to see what my close friends are doing, but it also separates me from the friends I'm not in contact with as much. Eli says the Internet should connect us, and I believe the way Facebook filters my friends keeps it from doing that. The filter bubble is helpful, but also hurtful in the sense that it does not let me expand my horizons and it chooses what I get to see.
Before commenting on the information and argument Eli Pariser presents in his TED talk about the "filter bubble," I would like to give some background on Pariser that shows where his point of view lies and what his possible bias might be. Pariser is a progressive thinker, which he states in the talk, but he also created websites to push the liberal progressive agenda, such as MoveOn.org. Granted, he says he enjoys other views, but the whole reason Facebook got rid of his conservative friends' updates was that he wasn't clicking on their links. So wouldn't that make one conclude he may not be as open to other ideas as he claims? It's just a thought.
I find it interesting that search engines take in over 57 signals about a user to provide the most relevant information to them, but even he said this factoid came from a friend. The Egypt example could be relevant, but there were only two people involved, so as a study it would not hold water. Even though I am discrediting Pariser, I will agree these filters are out there: anyone who has used Facebook from the beginning will notice that ads are tailored to a user's likes, while friends' updates disappear if you don't click on them often. I find this filtering troubling, because to understand the whole picture, or close to it, you need all sides of the story; if information is filtered, how can you ever understand anything that isn't customized to the point of view the filter thinks you have?
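The signal-based personalization this commenter describes can be illustrated with a toy sketch. Everything here is hypothetical (the domain names, the single click-count "signal," the weighting): real search engines combine dozens of signals in ways they do not publish. The idea is just that two users issuing the same query can see different orderings because their click histories differ.

```python
# Toy sketch of click-based personalization. All names and the
# single "click count" signal are hypothetical -- this is NOT any
# real search engine's algorithm, just an illustration of how past
# clicks can reorder identical results for different users.

from collections import Counter

def personalized_rank(results, click_history):
    """Re-rank results so domains the user clicked most come first."""
    clicks = Counter(click_history)
    # Sort descending by this user's past clicks; ties keep original order.
    return sorted(results, key=lambda domain: clicks[domain], reverse=True)

results = ["egypt-news.example", "egypt-travel.example", "egypt-history.example"]

# Two users searching "Egypt" with different histories see different pages first.
vacationer_clicks = ["egypt-travel.example", "egypt-travel.example"]
activist_clicks = ["egypt-news.example"]

print(personalized_rank(results, vacationer_clicks))  # travel site ranked first
print(personalized_rank(results, activist_clicks))    # news site ranked first
```

The same query, the same candidate results, yet each user's bubble puts a different page on top, which is exactly the two-friends experiment Pariser describes.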
Argumentation: the use of algorithms diminishes the function of the Internet as a tool to connect everyone.
After watching this video, I started to realize that Facebook does filter some of the information that appears on my wall. Then I felt like I am being controlled in what I can and cannot see. If this is the case for all websites, it can lead to people only learning what they want and/or like to know rather than what they should know. As a result, people are bound to a particular kind of knowledge and belief and can't extend their knowledge. The filtering done by the algorithms is, to me, similar to the authority method of fixing belief. Filtering helps us connect to familiar information faster, but it also prevents access to important information.
Although I agree with what Pariser is saying, I don’t know how we can avoid the filter bubble. Who can determine what someone needs to see? What someone wants to see can be determined by the links they click on but needs aren’t as clear-cut. By his standards, this consists of relevance, importance, discomfort, challenging information, and other points of view, which are just as vague as the term needs. I feel as though he’s holding the Internet responsible for people’s lack of concern. Algorithms don’t block certain views for their own political efforts; they are based on what would be the most interesting. Even if algorithms can somehow determine need, they cannot force people to read the content. When there was a limited source of information, people relied on what was given to them but with the Internet, people have to take it upon themselves to go and look for it. How will we know to search for Egypt if we don’t know that there’s anything important happening there? Regarding Facebook, there’s an option to allow all friends to show up in your news feed; those who don’t realize that Facebook is hiding their other friends wouldn’t know to change it. In order for the Internet to show us what we need, we must first be exposed to the need and then want that need.
We are all trying to be as open to different ideas and perspectives on the world as possible, and we all believe that the Internet is a powerful tool to fill that gap. However, I am pretty shocked to learn that search engines are designed to customize results tailored to our tastes. Even though it sounds very businesslike, "satisfying each customer's personal likes," it is not ethically right, since they do not actually know what our motives are. And it is not right for them to narrow down our view without us knowing.
I feel this filter bubble system is quite ingenious. It seems that this system is used to direct advertisements as well, particularly noticeable in Google ads. I found a small and recent discussion regarding the filter bubble/this TED talk on the Google forums.
This brought to my attention the ethical implications of this technology. People may not care deeply about the content that is being filtered out, but they definitely won't care about it if they are never exposed to it. For example, I may not be concerned with unrest in the Middle East, but I could become sympathetic if exposed to the situation. I think there should be a filter-bubble checkbox when conducting a search, similar to Google's SafeSearch.
I feel there is nothing wrong with instant desire, which I feel is the idea behind the filter bubble; however, closing off knowledge to people is only going to encumber us as a society. To me the filter bubble is similar to the censorship in China and North Korea, except that it's not intentional. Take Eli Pariser's example of his two friends: one received information about the protests in Egypt while the other received travel advice for vacationing there. This idea of shielding society will not help us in any way. The filter is similar to the movie The Truman Show: someone creates a made-up reality, and because we do not know otherwise, we accept the imaginary reality presented to us. Eli was correct in saying that a balance has been lost. That loss is between truth and relevance, because not everything that is relevant is true, and what is true may not be relevant.
I think that Eli's claims are completely valid. However, if we take Google as an example, the filter bubble can be seen as a feature that provides a more effective search. When people use these social media and web search tools, most of the time they are looking for something specific, which is where the filter bubble becomes useful. Nonetheless, I agree that users should be the ones to decide what they want to see. There should be a middle ground where people still get the best of these tools (efficient search) without being excluded from relevant information and reality.
I agree with Kathy. I think algorithms that tailor search results to previous history are here to stay. I hope that I retain a choice in the matter as to what gets sent to me.
I thought the most interesting point was that in order to have a functioning democracy, we need a flow of good information to citizens. Then would that mean we did not have a functioning democracy before modern media outlets, or when we had lower literacy rates?
If Google and Facebook decide to filter my results, that is their choice, and as a consumer, if I don't want them to do this, I won't use them. This leaves me with two options. One, I can stop using the services these companies provide; it is possible to get by without them. The other is to demand companies that perform the same services without filtering results. If enough people demand it, then new companies will emerge to meet those demands.
I have always noticed a filter bubble on my Internet. Especially on Facebook, I notice that I only see a solid group of about 50 friends, as other people have mentioned in their posts. Even when I go to search a person's name on Facebook, I find that the pages of people I click on the most come up before the person I am actually looking for. I also find this on the website StumbleUpon. It is supposed to let you browse sites of interest by clicking on certain topics, yet even if I select both cooking and music, every time I stumble, music sites with similar music and cooking sites with similar recipes continue to pop up. While I understand that the Internet is supposed to be our own unique online experience, I do not find it to be that whatsoever. Just because I typed in how to make red velvet cake once doesn't mean that I want to know how to make red velvet brownies, cookies, or cheesecake. I want to have more control over what I am shown, possibly learning a thing or two along the way. Right now the Internet is boring because the same things are shown to me on a daily basis.
This presentation brings up an excellent point. The Internet was intended to be an unfiltered global resource of information for anyone, yet today more and more companies are filtering how we view it. I was aware that Facebook monitors what websites you visit to try to tailor advertisements to "your likings"; however, I was unaware of the scope to which companies are filtering our Internet experience. I do understand that technology is supposed to make our lives easier, and these companies are trying to help do that. Filtering helps me find what I am looking for quicker, and even if I don't find it in the first search, I am sure I will find it in the second. What Google is doing is not making any information less available, but trying to ensure you get the information you're looking for based on your past habits. I think this is a good thing, and I don't think this issue is threatening at all. This world has become very fast-paced, and companies like Google are trying to make sure you save as much time as possible.
Personally, I believe that Internet filters are necessary and essential to successful navigation of the Internet. There is a lot of information on the web, most of which is irrelevant to what you are really looking for. Some may argue that you can never be sure that the information filtered from you is irrelevant. To those people I would respond that you most likely would not explore outside your set patterns anyway, so it would not matter if it were. If you are one of the few who does prefer to view all sides of an argument or issue, then you realize it is important to use more than one search medium, e.g. Google, and more than one source of information, e.g. the Internet. If you truly wish to have all the information, you should, and will be required to, look for it and not just hope it pops up on your screen.
Eli is completely right about what he said. I have seen this firsthand from Google and Netflix. When I tried to direct someone to a website via Google, I would tell them something along the lines of "second or third link." However, it wouldn't be up there, but instead farther down the page or on the second page. And when I use Netflix on my Xbox, instead of being able to see all the categories, I can only see those based on movies and TV shows I have already watched. The Facebook issue he mentioned isn't a problem: through your profile, you can turn off the filter on your news feed. This is exactly the kind of solution I was looking for. Continue to filter, but give the option to turn it off.
I can understand the argument that by filtering results, we isolate ourselves. Yet his proposed repair appears to fall into the same problem: trusting algorithms to make a relevant yet diverse selection of material. If the current dilemma is that the algorithms close off our world, how can we expect more sophisticated algorithms to solve it? The truth of the matter is this: there have always been people trying to isolate our world, whether for political, monetary, or spiritual reasons. No matter how sophisticated or complex an algorithm may be, I believe it is always the individual's responsibility to expand their surroundings and encounter new ideas. We will always be personally responsible for opening up our worldview, and no algorithm can take away that responsibility.
I have had a lot of the same problems that Jordan has had. Last term I had a group project and we had to share links. I told my group member, who was writing the bibliography, to type a certain word into Google and use the second link. However, after they did this, I noticed that the link in the bibliography was completely different. It is the same with Netflix: once you watch one movie, you are recommended other movies that are similar. To me this is one instance of the lack of diversity that the filter bubble creates. There are both negatives and positives to the filter. One positive is that the technology generally gives you the feedback you are expecting. Isn't that what the world is coming to, finding the easiest and most efficient way of doing things? On the other hand, one of the negatives is that it creates a less diverse person. If you are constantly fed back what you expect from search engines, it won't open your eyes to anything new. An example of this is StumbleUpon.com, a website that gives you webpages it thinks you want to view based on your input (like/dislike) on previously viewed pages. After a few weeks I felt it was molded so much to me that it only showed me websites I wanted to view. After viewing the video, I realized that if anyone else went onto my StumbleUpon, they wouldn't find it as interesting. So basically this makes me a less diverse person, because I have my likes stored in a computer rather than sharing them with others, and I don't know their interests and likes either.
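The StumbleUpon-style feedback loop this commenter describes can be sketched as a simple simulation. The topics, weights, and update rules below are all hypothetical, not StumbleUpon's actual recommender; the point is only to show how a like/dislike signal, fed back into what gets served next, squeezes out everything the user does not engage with.

```python
# Toy model of a like/dislike feedback loop (hypothetical weights and
# update rules, NOT StumbleUpon's real system). Each round the
# recommender serves topics in proportion to accumulated likes, so
# liked topics gradually crowd out everything else.

import random

def recommend(weights, rng):
    """Pick one topic with probability proportional to its weight."""
    topics = list(weights)
    return rng.choices(topics, weights=[weights[t] for t in topics], k=1)[0]

rng = random.Random(0)  # fixed seed so the simulation is repeatable
weights = {"music": 1.0, "cooking": 1.0, "politics": 1.0, "science": 1.0}

for _ in range(200):
    topic = recommend(weights, rng)
    # Simulate a user who only "likes" music and cooking pages.
    if topic in ("music", "cooking"):
        weights[topic] += 1.0   # like: boost the topic's weight
    else:
        weights[topic] *= 0.9   # ignored: decay the topic's weight

# After many rounds, politics and science have nearly vanished from
# the mix -- the bubble has closed around the user's two interests.
total = sum(weights.values())
share = {t: round(w / total, 3) for t, w in weights.items()}
print(share)
```

After 200 rounds, nearly the entire recommendation mix belongs to the two liked topics, which matches the commenter's experience of the site becoming "molded" to one person after a few weeks.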
I actually researched and wrote a paper, as well as a presentation, on this topic in my organizational behavior class. The original paper that I had to read was about Facebook's endeavor to become a publicly owned company. It would do this by segmenting Facebook users into market segments based on their likes, dislikes, religion, culture, interests in movies/music/literature, etc. Even users' political viewpoints would be segmented, so that eventually each Facebook user would be part of their own type of network shared with many other users of the same interests. Facebook would then sell these market segments to advertising agencies and companies that want to target their product marketing to these specific users. When I researched the topic further, I found links stating that Google and other search engines such as Yahoo! want to do, or are currently doing, the exact same thing.
Filter bubbles are in fact the new way that Internet and website providers gain more capital for their organizations. Although this is ingenious for their own well-being, it creates a real negative impact for us as users of the Internet. Eli's proposal of using better algorithms to fix the filter bubble could, in my opinion, possibly work. In the end, however, I believe that to truly solve the dilemma we must stop using search engines and sites such as Google or Facebook altogether. As one of the comments above stated, no matter what the algorithmic solution might be, for us as users of the Internet to actually have the freedom of not being filtered by these companies, we must simply stop using these websites.
As a business major, I can see that filtering different things to different users creates more revenue for companies such as Google and Facebook. But in the end, I also think that doing so takes away our freedoms as users of the Internet. The Internet is then not so much a free way for us to gain knowledge and acquire insight, but rather a forced application of what corporations want us to see and learn for their own benefit.
I absolutely agree with what Eli said. However, it is necessary for us to think about why the filter bubble is so common. I believe the biggest reasons for its presence are the competition between Internet companies and the development of the technology. But we cannot judge something from only one angle. The presence of the filter bubble does create an environment where people can only see what they want to see, but since every company can build such a filter, each has to invent or update its technology to attract more people. This is an endless cycle. Maybe the filter bubble has taken away a lot of information, but we can find it in many other ways, such as searching with different keywords or using a different engine. Therefore, in order to have a better Internet environment, we need to improve the technology, but we also need to broaden our point of view.
I found this video, and the notion that I am surrounded by a “filter bubble,” to be quite alarming. Mr. Pariser mentioned that he used to envision the Internet as a way to bring everyone around the world together; for a long time I saw things the same way. However, how can we come together via an Internet that is designed to separate my knowledge of news and current events from that of my neighbor? Like almost everyone else in our class, I notice that Facebook often makes decisions on my behalf, but I never paid much attention to it. I never realized that Google did that as well, though, especially to the degree that Mr. Pariser described. The side-by-side screenshots of Google search results for his two friends were jarring and eye-opening.
I understand these companies’ attempts to bring us relevant information more conveniently. However, this is outweighed by the fact that they determine what “relevant” is:
1) while most people are unaware of the filtering,
2) without allowing users a way to see how the filtering is done, and
3) without factoring users’ offline, “real-life” experiences.
My life and my expectations of the Internet are greater than a collection of 57 parameters passed along to Google. In my opinion, the most important aspect of Mr. Pariser's presentation was his appeal to Internet gatekeepers to make the filtering process more transparent. If I am not seeing the whole picture, then I at least want to know why!
To filter or not to filter? As a result of the massive amount of information on the Internet, filtering has become a necessary evil. In addition, all that information has become more and more interwoven; features such as "like" and "share" allow programmers to see what we find interesting and how we actually use the Internet. Eli brings to light an argument which, as he said, is very similar to early news reporting in this country. When does filtering go too far and filter out the information we need or should be seeing? This is a very fine line, because as I said earlier, we need some filtering. The question I will ask you is: whose responsibility is it to determine what to filter? Can this even be done? I argue that the responsibility falls on our shoulders. Google may filter out the information it "thinks" you don't want or need to see, but it's still out there floating around on the web. I believe Eli is barking up the wrong tree. The Internet's lifeblood is commerce, and Google will filter the hell out of everything to get us to click on this or that in the hope that we'll buy something. So what makes you think they are going to feel some civic need to expose their users to the proper information? Again I say that the responsibility lands on us, just as it did in the past.
Personally, I think that the "web filters" Eli Pariser speaks about are not a problem and are simply a needed evolution of the way information is delivered to us as consumers. Every day many things take place that hypothetically could be written about in the newspapers: a leaf falls from a tree, cars break down, people get in fights, etc. Yet somehow the trivial incidents that are not newsworthy are not written about, and the SEPTA strike or the multi-car pileup on I-95 make the front page. Similarly, there are millions of websites on the Internet that are not relevant to us as individuals, or are so trivial and commonplace that they just don't warrant our attention and would only overwhelm us if we gave all of them our full attention. The algorithms that many websites have put in place have only made us more efficient in our search for content to consume on the Internet. As with all endeavors that Google undertakes, I am quite sure they have the statistical and quantitative evidence to prove that.
My point is that there is more in life than what is available to us on the Internet; our brains are highly capable filters that pick and choose what resonates with us and what we allow ourselves to forget. The algorithms in place, particularly on Google and Facebook, are designed to do the exact same thing. Unfortunately, as with most forms of artificial intelligence, they don't hold a candle to the thing they try to emulate, but that in itself does not make them useless or evil; of course there is always room for improvement, and that can be said about the algorithms. With or without them, we as people ultimately make our own decisions about what content to consume and what to ignore. The same way I skip straight to the sports section of any newspaper I pick up and then slowly filter my way back to other articles I find interesting, so too on the Internet do we go first to the things we consider most interesting and then, after consuming that, filter through the rest of the results to see if anything of relevance is left. The algorithms only facilitate that process and make it more efficient. That said, they aren't perfect and could be better, but that is no reason to get rid of them entirely.
This argument makes total sense to me! If I search for something on my own computer and do not get the same results someone else gets, that somehow does not seem right. For instance, in the Egypt example, one of the friends had links related to the protests going on there, while the other friend's screenshot had nothing to do with them. I really agree with the point he made: searches shouldn't be based only on relevance to the person searching; important information should also be a priority, because no matter what its relevance to the person, it needs to be part of the results. Hopefully the right people will see this clip and make the right changes, because now that the Internet seems to be the number one way people get their news, it is really that much more important.
Eli Pariser makes valid points on the effect that filtering is having, and will continue to have, on the information individuals get based on their preferences. He does mention the great progress represented by filtering via algorithms, but this improvement in technology is taking us back to a limited worldview, because access to facts and different opinions is hidden. We look to the Internet, especially search engines, to find answers to questions about everything from recipes to wars, and we assume these answers are unbiased, giving everyone the same information so that we can form our own opinions. Having algorithms give us information based on the opinions we already hold eliminates the chance for those opinions to change and for us to get a fuller view of any situation. When talking about Netflix, Pariser mentions our aspirational selves versus our impulsive selves and the problem that algorithms aren't yet able to understand the difference. The Internet needs to be a place where these aspirations can live on and there is full access to information regardless of our individual impulses. If algorithms could differentiate between the two, we would have a great mix of convenience and knowledge.
I feel that there are definite pros and cons to this filter-bubble concept. I completely agree that there is a filter bubble in this society, but whether or not it's a good idea is a different story. I've come to realize that on almost every website I go to, there is definitely a controlling of what I see on the screen as opposed to what other people see. For example, just on Facebook there is a definite difference between what I am seeing and what anyone else is seeing. They have decided, from what I click on and whose pages I go to, what I want to see. Now this can be a good thing, since it can filter out the people I don't want to see or information I don't need, but who are they to control me and my actions? I feel like this filter bubble keeps people very contained, and slowly but surely people are going to become more and more closed-minded, uninformed about some issues, because things like the Internet decide for them what they should see.
You are currently reading Eli Pariser at Tuesday Morning Blog.