FORBES: Veer Into The Other Side: How To Fight Implicit Bias In The Digital Age

The side-view mirror on the passenger side of your vehicle was once an add-on. But over time, what was once considered a luxury feature became a necessity. As the number of multilane roads and highways grew, it became harder to look around us as we drove. With progress came the need for new technology.

I can think of no better analogy for where we find ourselves today with regard to our online political discourse.

There are people to our left. There are people to our right. But more often than not, we’re cruising down the information superhighway, drafting closely behind whatever car looks the most like ours, seldom aware of our massive blind spots.

Then we’re shocked when a car veers into our lane (or we veer into theirs) and a catastrophic collision of ideas occurs. If only there were some way to know our political blind spots.

Our Implicit Blindness

Here’s the foundational problem: We all have blind spots. Technically, these blind spots are known as implicit bias. Emily Badger at the New York Times defines implicit bias as “the mind’s way of making uncontrolled and automatic associations between two concepts very quickly.”

Like any rule of thumb, this process can both help and hurt us. For instance, implicit bias lets us follow familiar patterns like navigating our morning commutes. But it can also reinforce bad intuitions like negative stereotypes, which can lead to harmful results down the road.

To make matters more frustrating, we’re blind to our blindness. As psychologist Daniel Kahneman wrote in Thinking, Fast and Slow, “We have very little idea of how little we know. We’re not designed to know how little we know.”

The Challenge To Interrupt

So, how can we reveal our blind spots?

1. Train yourself to detect website bias. 

The next time a divisive issue explodes on your social media timeline, consider this exercise in bias detection: collect screenshots of several news outlets across the political spectrum after news of a notable event breaks, then order the screenshots from most extreme right to most extreme left. The comparison gives you a better understanding of each website's implicit bias.

However, this is time-consuming. It's a helpful one-time study to conduct on your own, but it's not feasible for every major news story.

There are also some quicker options. AllSides is a website that reveals the bias of news articles; both AllSides and Media Bias Fact Check crowdsource their bias ratings. Using such sites can save time when surveying coverage of a news story, but it's important to ascertain the credibility of any particular rating system.
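If you want to automate the comparison once you have ratings in hand, here is a minimal sketch of ordering headlines by slant. The outlet names and scores below are illustrative placeholders, not actual AllSides or Media Bias Fact Check ratings:

```typescript
// Illustrative only: bias scores here are made up, not real ratings.
// Scale: -2 (far left) ... 0 (center) ... +2 (far right).
const biasRatings: Record<string, number> = {
  "Outlet A": -2,
  "Outlet B": -1,
  "Outlet C": 0,
  "Outlet D": 1,
  "Outlet E": 2,
};

interface Screenshot {
  outlet: string;
  headline: string;
}

// Order captured headlines from furthest right to furthest left,
// mirroring the manual screenshot exercise described above.
function orderBySlant(shots: Screenshot[]): Screenshot[] {
  return [...shots].sort(
    (a, b) => (biasRatings[b.outlet] ?? 0) - (biasRatings[a.outlet] ?? 0)
  );
}

const captured: Screenshot[] = [
  { outlet: "Outlet C", headline: "Officials announce new policy" },
  { outlet: "Outlet E", headline: "Policy hailed as long-overdue win" },
  { outlet: "Outlet A", headline: "Critics warn policy will backfire" },
];

for (const shot of orderBySlant(captured)) {
  console.log(`${shot.outlet}: ${shot.headline}`);
}
```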

2. Recognize how sites (even 'objective' ones) identify and respond to your biases. 

The effect of an implicit bias goes far beyond which articles you read; with every click, your choices inform which ads you see and what types of articles pop up at the top of your future Google searches.

Rather than working to expand your information horizons or identify questionable content, major sites and search engines function by doubling down on your implicit bias. If you click on three articles about your favorite sports team, you might only see positive coverage of that team in the future.

Don’t think of your internet activity as searching through a library catalog; it's more like having a constant conversation with assistants who only want to give you what you like.
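To make that feedback loop concrete, here is a minimal sketch of naive click-based personalization. The scoring rule is an assumption for illustration, not any real site's or search engine's algorithm:

```typescript
// A toy feedback loop: every click on a topic boosts that topic's score,
// so future rankings drift toward what you've already chosen.
const clickCounts = new Map<string, number>();

function recordClick(topic: string): void {
  clickCounts.set(topic, (clickCounts.get(topic) ?? 0) + 1);
}

// Rank candidate articles purely by how often you've clicked their topic.
function rankArticles(articles: { title: string; topic: string }[]) {
  return [...articles].sort(
    (a, b) => (clickCounts.get(b.topic) ?? 0) - (clickCounts.get(a.topic) ?? 0)
  );
}

// Three clicks on your favorite team...
recordClick("favorite-team");
recordClick("favorite-team");
recordClick("favorite-team");

// ...and coverage of that team now crowds the top of the feed.
console.log(
  rankArticles([
    { title: "Rival team signs star player", topic: "rival-team" },
    { title: "Favorite team wins again", topic: "favorite-team" },
    { title: "League announces rule change", topic: "league-news" },
  ])
);
```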

Needless to say, compounding your bias online can be unproductive and even dangerous. Here, the best advice is simple: Think before you click.

3. Confront your implicit bias and take control of your information. 

Collectively, we must fight our implicit biases every day (particularly in today’s political climate).

There are more traditional ways to check our biases: We can stop relying on just one outlet for our news, we can pay more attention to individual authors and we can go out of our way to fact-check our sources. But sometimes -- as with the advent of the side-view mirror -- a new technology can go a long way.

Just as we need multiple tools to keep us as safe as possible in our cars, new add-ons and sites can help us keep our minds on the right track. And when it comes to new tools, artificial intelligence (AI) and machine learning may hold the key.

How AI Could Thwart Our Biases

As Tomas Chamorro-Premuzic argues in a recent Forbes column, “AI has two big advantages over human beings: first, unlike humans, AI can learn to ignore certain things, such as gender (which, in effect, means unlearning certain categories); second, unlike humans, AI doesn’t care about maintaining a positive self-view (and failing to maintain a positive self-view will not make it depressed, for AI has no self-esteem to protect).”

Self-driving cars have no blind spots. Their AI constantly surveils the areas around the car, alerting the passenger to any possible dangers -- and even taking control of the car when necessary without human involvement.

Now it's time to see if AI can also help us address our blind spots online.

To that end, my company created a Chrome extension that places a discreet icon next to Facebook news posts and shares, Google News search results and links on your favorite news sites. Hovering over the icon reveals the source’s political slant and credibility rating, based on reliable data and a proprietary AI algorithm.
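For readers curious how this kind of tool works under the hood, here is a minimal content-script sketch of annotating links with a slant-and-credibility tooltip. It is a hypothetical illustration, not the extension's actual code; the getRating lookup and the domains and scores in it are assumptions:

```typescript
// Hypothetical content-script sketch: annotate each recognized outbound
// news link with a small badge whose tooltip shows slant and credibility.

interface SourceRating {
  slant: "left" | "center" | "right";
  credibility: number; // 0-100, higher is more credible
}

// Placeholder lookup; a real extension would query its own rating data.
const ratings: Record<string, SourceRating> = {
  "example-left.com": { slant: "left", credibility: 72 },
  "example-right.com": { slant: "right", credibility: 68 },
};

function getRating(hostname: string): SourceRating | undefined {
  return ratings[hostname.replace(/^www\./, "")];
}

// Place a badge next to each recognized link; details appear on hover.
for (const link of Array.from(
  document.querySelectorAll<HTMLAnchorElement>("a[href]")
)) {
  const rating = getRating(new URL(link.href, location.href).hostname);
  if (!rating) continue;

  const badge = document.createElement("span");
  badge.textContent = " ◆";
  badge.title = `Slant: ${rating.slant} · Credibility: ${rating.credibility}/100`;
  link.after(badge);
}
```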

Knowhere is another tech company looking to combat this issue by training AI to rewrite news stories so they're devoid of bias. According to SingularityHub, "The site uses AI to aggregate news from hundreds of sources and create three versions of each story: one skewed to the left, one skewed to the right and one that’s meant to be impartial."

So look over your shoulder and check every mirror. With help from AI tools, we're about to change lanes.


This article was previously published on Forbes.
