Nobias - Bursting the Virtual Filter Bubble
Have you ever wondered how reliable and transparent online sources are, and how they may influence your perception of and attitude toward important topics such as politics, business, or race? Since 2009, Google has been customizing users’ search results for marketing purposes, which not only influences consumers’ purchase decisions but also shapes their virtual realities. Using Artificial Intelligence (AI) algorithms, websites are tailored to each individual, giving them a unique online experience: a personalized universe of information. Each consumer’s online experience shapes itself exponentially through every click, article, and video, until they have created their own personal online reality. Just as in real life, humans form their perceptions and attitudes by focusing on information and news that feel familiar and pleasant, as Eli Pariser, board president of MoveOn.org, argues in his book The Filter Bubble: “The more often we click on similar articles the more articles with the same biases appear in front of us, leaving us in the belief that they report the truth. That’s a problem because it leaves less room for opposition, other perspectives, creativity, innovation, and democratic exchange of ideas.”
In her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Cathy O’Neil, Ph.D., a former professor, hedge-fund analyst, and data scientist, discusses how algorithms increasingly regulate people’s lives and how biased most sources on the internet are. Pedro Domingos, professor, machine-learning researcher, and author of The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World, lays out how online data reflects the social, historical, and political conditions in which it was created, and how Artificial Intelligence systems “learn” from the data they are given. On a deeper level, the human brain is a pattern-recognizing organ: it tends to look for information it already knows and that supports its current belief system. Humans are already prone to bias by nature; algorithms reinforce this disadvantage and scale it to extremes, creating a worldview that may not be beneficial in the long run. The media denounces algorithms as “biased,” “racist,” and “sexist,” and furthermore highlights how algorithms on Google, Facebook, and Yahoo can be a serious disadvantage in business. It is essential for business leaders to remain objective and to make important decisions with all the facts at hand, looking at problems from different angles. They need to scan the world for upcoming competitors and understand different points of view in business and leadership. Additionally, algorithms even choose the sources that may influence business decisions such as investing in stocks or companies: users are presented with articles that emphasize one point of view, recommending either buying or selling certain stocks, and are never exposed to sources arguing the opposite.
Another problem with Artificial Intelligence and algorithms is their emphasis on outrageous content, regardless of its value or credibility. Because engagement and sharing rates for disturbing and emotionally charged content run roughly 20% higher than usual, algorithms pick up on it and “shape consumers’ moral emotions in ways that could ultimately make it harder to change society for the better” (M.J. Crockett, assistant professor, Yale). If people are only connected to those they agree with, the potential for political, racial, and gender-related dialogue is lost. Understanding the opposition’s point of view and reasoning, instead of staying trapped in feedback loops, is essential to exploring solutions that lead to social change.
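This feedback loop can be sketched in a few lines of Python. This is a toy model, not any platform’s actual ranking code: the “outrage” field, the click-rate formula, and the impression counts are invented for illustration. It only shows the general mechanism by which engagement-driven ranking can let emotionally charged content climb a feed and stay on top.

```python
# Toy model of an engagement-ranked feed (all numbers are invented).

articles = [
    {"title": "Measured policy analysis", "outrage": 0.1, "clicks": 100},
    {"title": "Outrageous scandal!!", "outrage": 0.9, "clicks": 100},
]

def rank(items):
    # Engagement (accumulated clicks) is the only ranking signal.
    return sorted(items, key=lambda a: a["clicks"], reverse=True)

def simulate_day(items):
    for position, item in enumerate(rank(items)):
        impressions = 1000 // (position + 1)            # top slots get more views
        click_rate = 0.05 * (1 + 3 * item["outrage"])   # charged content is clicked more
        item["clicks"] += int(impressions * click_rate)

for _ in range(5):
    simulate_day(articles)

# After a few iterations the emotionally charged item dominates the feed.
print([a["title"] for a in rank(articles)])
# → ['Outrageous scandal!!', 'Measured policy analysis']
```

Even though both articles start with identical click counts, the higher click rate on the charged item compounds day after day, so the feed locks into promoting it: the loop the passage above describes.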
The question becomes: how objective can one remain when browsing the world wide web? Is it still possible to gain access to actual facts without being influenced by biases? To stay in control of their decisions and opinions, consumers want to expose themselves to many different points of view and ways of thinking while avoiding “fake” news and other low-reliability sources. Nobias’s mission is to reveal facts and enable people to control the intellectual input they consume. It is the solution to staying objective and in control when consuming content online, the answer to escaping false virtual realities. Used as a Chrome extension, it helps individuals and organizations burst their “virtual filter bubble” by revealing an article’s biases and examining its credibility before they read it, giving them back control over their online reality. Further, it reflects back the biases users create for themselves. Other projects tackle the same problem. Civik Owl evaluates the quality of news stories, analyzing them for credibility and political diversity. The AI Now Institute at NYU aims to eliminate biased, inaccurate, and unfair outcomes: “AI Now researches issues of fairness, looking at how bias is defined and by whom, and the different impacts of AI and related technologies on diverse populations.” NewsGuard works with a rating system that uses nine criteria to determine whether sources are opinion-based or credible and transparent, such as not publishing false content, not presenting opinions as facts, and clearly labeling advertising. Deep News has the mission of “restoring the economic value of journalism by automatically submitting stories to the platform and scoring them on a scale from 1 to 5 based on their journalistic quality.”
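A checklist-style rating system of the kind described above can be sketched as a weighted pass/fail score. The first three criterion names below echo the ones mentioned in the text; the remaining criteria and all of the weights are hypothetical, chosen only to illustrate the idea, and do not reflect any real service’s internal scoring.

```python
# Hypothetical checklist-based credibility score (criteria and weights invented).

CRITERIA = {
    "does_not_publish_false_content": 3,
    "does_not_present_opinion_as_fact": 2,
    "labels_advertising_clearly": 1,
    "discloses_ownership": 1,
    "corrects_errors": 2,
}

def credibility_score(source):
    """Return a 0-100 score from weighted pass/fail checks on a source."""
    total = sum(CRITERIA.values())
    earned = sum(w for name, w in CRITERIA.items() if source.get(name, False))
    return round(100 * earned / total)

# A source that passes four of the five checks (misses "discloses_ownership"
# implicitly and fails "does_not_present_opinion_as_fact" explicitly):
example_source = {
    "does_not_publish_false_content": True,
    "does_not_present_opinion_as_fact": False,
    "labels_advertising_clearly": True,
    "corrects_errors": True,
}
print(credibility_score(example_source))  # → 67 (6 of 9 weighted points)
```

Weighting the checks lets severe failures, such as publishing false content, cost a source more than cosmetic ones, which is the intuition behind grading transparency and credibility separately from opinion.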
Remaining objective matters not only for important business decisions but also for social issues and opinions on race, gender equality, politics, and many other sectors of society. Users should have the right to access the full scope of information on the internet, not just one point of view that endlessly echoes back the same information. Staying trapped among false or limited information leads to inequality, a lack of education, and ultimately many different “virtual filter bubbles” that fail to present the truth. Nobias lets users define their own destiny instead of having algorithms control their reality.