Say What? How Unconscious Bias Affects Our Perceptions

Estimated Reading Time: 5 minutes

By Eric Henkel

Resource Type: Risk eNews

Topic: General

You probably regularly encounter situations where you are convinced that you know exactly what is going on, only to find out that things are the exact opposite of what you thought. When it comes to how we view our environment and the people we interact with, there is often a disconnect between what we think should be and what actually is. This disconnect is usually the result of an unconscious perception error or bias. As you make decisions and take action based on your perceptions, there is a risk that these unconscious biases could negatively affect the outcome. Luckily, there are steps you can take to minimize any negative effect.

We all have general tendencies in how we process information and establish meaning, and these tendencies drive the dissonance between our perceptions and our reality. There are many types of cognitive bias, but two basic examples regularly come into play when we try to make sense of the world.

  • Confirmation bias: As we try to determine meaning in the world around us, we subconsciously give weight to information that confirms our existing perceptions and discount information that would force us to re-evaluate those perceptions. Common examples of confirmation bias are the recency effect (our tendency to give more weight to information we have encountered most recently) and the primacy effect (our tendency to seek out information that confirms our first impressions).

Confirmation bias might come into play in the workplace, for example, when you determine the strategic direction of your organization. Recent stories in the media or at conferences might convince you that your organization could serve a particular population, even though doing so would be a distinct departure from your typical operations. As you begin to outline a proposal for the new line of service, you unconsciously pay more attention to, and incorporate, information that supports your proposal rather than information that would convince you to stay your current course.

While it is often necessary to take risks and accept some degree of uncertainty when planning a new program or service, confirmation bias can make it harder for us to objectively identify and consider those risks. To reduce confirmation bias and ensure that your exciting new plans are truly logical and feasible, take a step back and reflect on the proposal from an outsider’s perspective. If you knew nothing about the proposal and were reading it for the first time, how would you react? You might unearth some risks or challenges that confirmation bias had shrouded.

  • False consensus effect: We overestimate the extent to which other people think and behave as we do. This error in perception may cause us to believe that other people agree with our decisions and actions, even when they don't. Because we tend to associate with people who share our opinions and views, we also assume those people see things the same way we do.

The false consensus effect comes to life at work in many scenarios. For example, as an executive director, you might be convinced that a new organizational initiative is the correct path to pursue and that your board will fully support your efforts. When you share your plans, however, you might find that the board holds the opposite position. An inability to anticipate a different point of view, or to reconcile two distinct perspectives, can lead to a deadlock that prevents your organization from moving forward. Our September 28, 2016 Risk eNews article offers three practical approaches for engaging team members with varying perspectives and reaching a reasonable compromise, rather than a false or too-quick consensus.

Many other biases cause complications when we look specifically at individual behavior. Errors often occur when we try to assess our own contribution, and that of others, to the outcomes of our decisions. When we attempt to attribute success or failure to particular decisions or behaviors, two key biases come into play:

  • Self-serving bias: When evaluating our own decisions and behaviors, we tend to attribute successes to our personal characteristics and to blame failures on external causes. For example, if a donor makes a significant contribution to your organization, you are inclined to take credit because your interpersonal skills helped cultivate the relationship that made the financial support possible. If a potential donor does not make the expected contribution, however, you will likely blame an external factor, such as the economy, rather than anything lacking in your own skill set.
  • Fundamental attribution error: Conversely, when we evaluate the outcomes of actions taken by others, we place more weight on their personal characteristics than on the external factors that may affect them. One example is the view that a volunteer is unreliable because she misses meetings, when the true culprit of her absence was that the meeting announcement didn't reach her in time. Similarly, when a director's program fails to produce the intended impact, a colleague's initial assessment may be that the director lacks leadership skills. The colleague might realize he attributed the blame incorrectly when he learns that the program lacked sufficient resources.

Whether it's a general inaccuracy in how we perceive the world around us or a more specific bias in how we analyze information and behaviors, there are steps nonprofit leaders can take to minimize the negative effects. The first step is acknowledging that these biases exist and can affect our decisions and actions. The next is to increase your awareness of the specific values, beliefs and perceptions you hold; self-awareness is key to approaching other people with empathy. Finally, increased interaction and communication with your colleagues will reduce the risk of misinterpreting their intentions and subsequent behaviors.

Working to identify and minimize the effects of perception bias can help you generate a more accurate picture of the world and people around you. Ultimately this helps improve your communication and decision-making with others.

Eric Henkel is a former project manager at the Nonprofit Risk Management Center. 
