Sunday, October 8, 2017

Are Your Search Results The Same As Mine?

One of the best aspects of being a scholar and educator who focuses much of their research on technology is that there is always something new to learn.  In other words, there is never a dull moment!  Have you ever wondered what results you receive from a Google search and how they compare to a friend’s results for the exact same search?  Go try it some time; you might be surprised.  The terms “filter bubble” and “echo chamber” might not mean much, if anything, to you.  In fact, they meant nothing to me three days ago either!  Before I discuss these two possibly foreign terms, let us all understand that the internet we view on a daily basis is shaped by many algorithms that sift its vast and diverse information, deciding what ends up in front of our eyes, on our computer screens, and ultimately what we view with each passing click.

When one attempts to understand both terms as they relate to our internet usage, it becomes clear that the internet’s information is being filtered into a kind of personal bubble, one that also involves a chamber echoing our searches, clicks, friends’ searches, and so on.  Specifically, a filter bubble is the use of algorithms to filter certain information into a personal bubble, while an echo chamber uses algorithms to echo your clicks, searches, friend selections, and friends’ searches back into the filter bubble in which you exist.  Essentially, filter bubbles and echo chambers work together to provide you with exactly what you “want” to see.  As Pariser (2015) states in regard to such algorithms, “After all, they mediate more and more of what we do. They guide an increasing proportion of our choices: where to eat, where to sleep, who to sleep with, and what to read. From Google to Yelp to Facebook, they help shape what we know.”  If this is your first time hearing about this aspect of our internet, yes, I too had some strong feelings about it.  However, it might not be all that bad, as we may have some say in these operations.
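To make the mechanism concrete, here is a toy sketch of how click-driven filtering can work. This is entirely hypothetical (no platform publishes its ranking code, and the function and data here are invented for illustration): a feed ranker scores each post by how often the user has already clicked on its topic, so every click narrows what surfaces next.

```python
from collections import Counter

def rank_feed(posts, click_history):
    """Toy filter-bubble ranker (illustrative only): score each post
    by how many times the user has clicked posts on the same topic,
    then sort so familiar topics rise to the top of the feed."""
    topic_counts = Counter(click_history)  # topic -> number of past clicks
    return sorted(posts, key=lambda p: topic_counts[p["topic"]], reverse=True)

# A user who has mostly clicked "sports" sees sports first, while
# unfamiliar or opposing topics sink toward the bottom.
posts = [
    {"title": "Local election recap", "topic": "politics"},
    {"title": "Championship highlights", "topic": "sports"},
    {"title": "New training regimen", "topic": "sports"},
]
feed = rank_feed(posts, click_history=["sports", "sports", "politics"])
print([p["topic"] for p in feed])  # → ['sports', 'sports', 'politics']
```

Notice the feedback loop: whatever the user clicks next is added to the history, which pushes that topic even higher the next time the feed is ranked.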

Critics such as Eli Pariser have made these internet operations known and, as they should, some of the “culprits” who created and use them have responded.  According to Moon (2015), “Facebook wants you to know that you’ve only got yourself to blame for the lack of diversity in views on your News Feed… ‘Filter bubble’ is what you call the situation wherein a website’s algorithm shows only posts based on what you clicked (or Liked) and commented on.”  In other words, Facebook responds to the allegations by blaming the algorithms’ behavior on the users themselves, saying the algorithms only respond to what users feed into them through their choices of media to view.  While Facebook is technically right, the fact that such algorithms alter what we see at all is an unsettling thought in itself.  Then again, without such algorithms, how would the vast and diverse amount of information on the internet be sifted?  By most recent creation?  Alphabetically?  I suppose we cannot be too upset with organizations such as Facebook, as they are, in a way, just doing their job as they think will satisfy the user.

Now that we have discussed a major part of what truly shapes the internet and how we view it, how has it affected us as a society, and what does this mean for our future usage?  When we step back and consider the results of these filter bubbles and echo chambers, it is reasonable to conclude that we have been exposed to basically only our own viewpoint.  Information with an opposing viewpoint has been filtered out, ostensibly to better our viewing experience.  Does this mean we have unknowingly been sifted into a society full of individuals who have difficulty understanding opposing thoughts and views on different aspects of life?  Could this have led to much of the hatred that has recently been present within our society?  These questions cannot yet be answered, as there has not been extensive research to back up such claims.  Thus, rather than looking into the past, we must look into the future.



In a TED presentation titled Beware online “filter bubbles,” Eli Pariser (2011), addressing software engineers, states, “…People who have helped build the web as it is, and I’m grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility.”  In other words, it is essential that we not only come across information chosen to satisfy our current views based on what we, or our friends on social media, click on.  We must have diversity in what we view, ultimately allowing us to become more understanding of, or at least educated about, society’s opposing views.  Although Facebook has released information on updates, both in progress and already deployed, to help combat the filter bubble, as Pariser (2015) suggests, Facebook still controls what information is released to the public, since studies must obtain Facebook’s permission in order to be published.  Hopefully, though, software engineers will see the value in not “trapping” their users in filter bubbles and echo chambers, ultimately offering our society more “hard” news, the news that is of true importance and unbiased toward our ideas, rather than “soft” news, the news that surrounds only our interests.  As current users, being informed and informing others about these internet operations can only help, as it might keep society from unknowingly falling into a self-centered world.  After all, as Pariser (2015) states, “The more we’re able to interrogate how these algorithms work and what effects they have, the more we’re able to shape our own information destinies.”
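One can imagine a simple way engineers might encode such civic responsibility. The sketch below is purely hypothetical (the function, parameters, and logic are invented for illustration, not any platform’s real approach): it reserves a few feed slots for topics absent from the user’s click history, so unfamiliar or opposing views still surface.

```python
def diversify_feed(ranked_posts, click_history, slots=5, reserved=2):
    """Toy diversity counterweight (illustrative only): fill most feed
    slots from the personalized ranking, but reserve a few for topics
    the user has never clicked on, so the bubble never fully closes."""
    seen_topics = set(click_history)
    familiar = [p for p in ranked_posts if p["topic"] in seen_topics]
    unfamiliar = [p for p in ranked_posts if p["topic"] not in seen_topics]
    feed = familiar[: slots - reserved] + unfamiliar[:reserved]
    # Top up with familiar posts if there were not enough unfamiliar ones.
    for p in familiar[slots - reserved:]:
        if len(feed) >= slots:
            break
        feed.append(p)
    return feed[:slots]
```

For example, a user whose history is all “sports” would still see a “politics” post in a reserved slot, nudging the feed toward the “hard” news discussed above.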

2 comments:

  1. How do you feel about Facebook blaming users? Do you think it is your fault that you only have certain friends with certain beliefs? I wonder if the Internet does need to be filtered based on our interests or perceived "likes". Would you rather discover these things on your own or have them curated for you electronically?

  2. I think Facebook did its part in responding after it was accused of what comes off as negative and unethical actions. I believe it did its job in creating an internet that allows its users to have a positive experience. Still, I hope they consider the possible negatives of such algorithms and rectify them in a more diverse manner. I understand there is a ton of information on the internet, but why do we have to have algorithms that chart our interests anyway? Why not just use tags, labels, and most recent creation dates to order what we view?

