Confirmation Bias and the Importance of a Second Opinion


Security Organizations Should Remember to Seek a Second Opinion, Which Can Bring Bias to Light

According to Wikipedia, “Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses.” Unfortunately, confirmation bias is a big problem in the information security field, and it often keeps organizations from reaching their full potential when it comes to their respective security postures.

How can organizations recognize when they succumb to confirmation bias, understand its effects, and work to reduce or eliminate it? I would like to take some of the topics discussed in the Wikipedia article and apply them to information security.

Manifestations of confirmation bias

● Biased search for information: Information collection is a vital part of any decision-making process. After all, decisions can’t be made in a vacuum; they need to be based upon evidence, data, and facts. But what happens when the information that is collected is biased? For example, I once knew of a security organization that could buy any technology product it wanted, as long as it came from one particular vendor. Obviously, product evaluations and the procurement process there suffered from a biased search for information. But what about when the bias is less obvious? That is something organizations need to be very aware of. For example, if an organization is looking to prevent data exfiltration, it may immediately run to evaluate vendors in a specific, narrow market within information security. But what if another potential solution would be a far better fit for the organization’s goals and its specific environment? If the search for information is biased, the organization will never get to a place where it can opt for the better solution.

● Biased interpretation: Even if information is collected in a non-biased manner, there is no guarantee it will be interpreted properly. In other words, confirmation bias can cause us to read into information something it doesn’t actually show. For example, I once saw an organization evaluate a vendor claiming expertise in Advanced Persistent Threat (APT) detection. The organization was convinced that this vendor would bring results before the vendor ever set foot inside the premises. There was much excitement (and perhaps just as much trepidation) when the vendor detected what appeared to be three compromised systems. What was the problem with this interpretation? Those detections were made on data from the previous day, and the organization’s own internal security operations team had already detected and remediated the systems before the vendor even set up shop.

● Biased memory: In some cases, we may remember the pieces of past events that suit our view of what happened. For example, a security leader may recall that a solution they chose and that was implemented on their watch worked out quite well and produced the desired results. It would be painful to recall the project otherwise. However, the members of the security organization who designed, implemented, operated, and maintained the solution may beg to differ. Perhaps in reality the solution wasn’t much of a solution at all.

Effects of confirmation bias

● Persistence of discredited beliefs: When confirmation bias is in play, discredited beliefs can persist. Obviously, when information is collected, interpreted, or remembered in a biased way, it can be tailored to suit a desired belief structure. For example, most organizations insist on wildly complex passwords that almost no one is capable of remembering. As a result, many people either write them down, save them in their browsers, or worse. So, while ridiculously complex passwords may look good on paper, in reality they aren’t particularly effective. Of course, passwords shouldn’t be so simple that they can be guessed or brute forced easily. There is clearly a balance point, beyond which returns diminish or perhaps vanish entirely (the rough sketch after this list puts some illustrative numbers behind that). Yet confirmation bias allows organizations to believe that insanely complex password policies are helpful and make the organization more secure.

● Preference for early information: I find this point fascinating. Humans clearly display a preference for information they receive first over information they receive later. It’s easy to see how confirmation bias can play a role here. If an organization is scouting for vendor solutions to a given problem, it may let confirmation bias limit its search to a subset of the relevant solutions. Even if the organization brings in solutions that may be a better fit later on, it will likely still prefer the earlier information it received. That may cause the organization to mistakenly choose a solution that is not the right fit.

● Illusory association between events: People have a tendency to connect dots between items and events that may not actually be connected. I have seen incident investigation and response go awry because of this. In fact, I have been guilty of it myself more than once. It’s a real risk that organizations face, and one they need to be aware of and guard against.
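
To put rough numbers behind the point about diminishing returns above, here is a minimal, illustrative sketch. The guessing rate and the policies it compares are assumptions chosen for illustration, not figures from this column: once a policy pushes the keyspace well beyond what an attacker can realistically search, piling on additional complexity adds little practical security while the usability cost keeps growing.

```python
import math

# Illustrative sketch only: compares the brute-force resistance of a few
# hypothetical password policies against an assumed offline guessing rate.
# Both the guess rate and the policies are assumptions, not measured figures.

GUESSES_PER_SECOND = 1e12               # assumed offline attacker capability
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

policies = {
    "8 chars, lowercase only":        (26, 8),
    "10 chars, mixed case + digits":  (62, 10),
    "14 chars, mixed case + digits":  (62, 14),
    "20 chars, full printable ASCII": (95, 20),
}

for name, (alphabet_size, length) in policies.items():
    entropy_bits = length * math.log2(alphabet_size)
    years_to_exhaust = (alphabet_size ** length) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{name:32s} ~{entropy_bits:5.1f} bits, "
          f"~{years_to_exhaust:.1e} years to exhaust the keyspace")
```

The point isn’t the exact numbers; it’s that somewhere between the second and third policy the attack becomes impractical, and everything beyond that mostly trades user memorability for security that exists only on paper.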

Suggested approaches to reduce or eliminate confirmation bias

● Sound methodology: Regardless of what security function we’re looking at, the more objectivity (i.e., the less bias) and the more rigor we can build into our methodology, the better. Whether we’re looking at procurement, security operations, incident response, risk management, compliance, or any other function, the goal should be to replace “I think” and “I feel” with “I know” and “I understand”.

● Fresh set of eyes: We all get tunnel vision from time to time.  It can be hard to see the big picture when we’re in the thick of something.  Sometimes, bringing a fresh set of eyes to help us see beyond the weeds can go a long way towards helping us eliminate our confirmation bias.

● Second opinion: When it comes to medicine, I’ve often heard the advice, “get a second opinion.”  The same is true in security.  Before implementing a new process, making a big decision, or changing policies and procedures, seek a second opinion.  Perhaps confirmation bias has set in, and a second opinion will bring the bias to light.


Joshua Goldfarb (Twitter: @ananalytical) is an experienced information security leader with broad experience building and running Security Operations Centers (SOCs). Josh is currently Co-Founder and Chief Product Officer at IDRRA and also serves as Security Advisor to ExtraHop. Prior to joining IDRRA, Josh served as VP, CTO – Emerging Technologies at FireEye and as Chief Security Officer for nPulse Technologies until its acquisition by FireEye. Prior to joining nPulse, Josh worked as an independent consultant, applying his analytical methodology to help enterprises build and enhance their network traffic analysis, security operations, and incident response capabilities to improve their information security postures. He has consulted and advised numerous clients in both the public and private sectors at strategic and tactical levels. Earlier in his career, Josh served as the Chief of Analysis for the United States Computer Emergency Readiness Team (US-CERT) where he built from the ground up and subsequently ran the network, endpoint, and malware analysis/forensics capabilities for US-CERT.
