Bless your biased big data and brain!

by | Aug 24, 2015 | Blog


We humans aren’t the only ones who are biased.

The big data we consult may be biased too.

Or, more precisely, the complex software in which programmers have inadvertently embedded biases. The biases built into the code, as well as the results it delivers, can be difficult to catch and fix.

For example, independent academic researchers examining Google’s ad-targeting system recently discovered that males were six times more likely than females to see ads for high-paying jobs, as reported by the Wall Street Journal. Oops!

Military veterans often complain that HR software disqualifies them when they apply for jobs. That’s because the systems don’t recognize the military skills described in their resumes as being transferable to civilian jobs. Oops!

And photo-tagging software can make embarrassing mistakes, such as mixing up concentration camps with jungle gyms and African-American men with apes. Oops!

The WSJ’s article Social Bias Creeps Into New Web Technology has more examples, which can not only insult key stakeholders but also expose organizations to legal risks.

So what are we expected to do? This is a significant question, since neuroscientists have identified more than 150 types of biases, many of them unconscious.

As I wrote in Bless your biased brain a few weeks ago, our brain's biology makes it impossible for us to overcome our biases entirely.

Instead, we need to adopt specific processes that help us take countermeasures against acting in a biased manner.

However, some of these processes rely on technology, which is biased as well.

Remember your mother’s advice not to believe everything you read? Well, don’t believe or follow everything you learn from big data either.

Keep in mind that we humans have some special skills that aren’t found in computers—at least for now.

These human skills include empathy, critical thinking and common sense.

We have to deploy these skills carefully to question what the computers are telling us about the decisions we need to make. For example, some additional countermeasures we can take include:

  • Put yourself in others’ shoes to consider the effect of your probable decision. Is a worker really going to be that much less collaborative by working at home one day a week and avoiding a horrendous commute? Have some empathy and approve the request!
  • Ask whether the data makes logical sense. Yes, the team members may show high trust toward each other and their leader if they’re all male Stanford University graduates with engineering degrees. But how will that look to everyone else, especially if the organization is advocating diversity? And will the team’s similarities translate into better results? Not necessarily! Show some common sense, acknowledge the research and get a LEAF to join your STEM team! (See 3 ways to increase your team’s smarts.)
  • Cast a wide net. Take some time and ask a variety of team members, especially those with varied backgrounds, to weigh in not only on the decision you’re making but also on how you expect to implement it. A great decision poorly implemented can be just as bad as, or worse than, an awful decision regardless of how it’s implemented. The more brains, the better, to help you think critically!

As we work with our biased brains and potentially biased big data, we need both to embrace our humanness and to question everything.

What do you think?
