Consumers are receiving more healthcare information than ever before. Fitbits, apps and electronic health records provide loads of personal data. Rising patient autonomy has led to greater demand for and access to registry, clinical trial and post-marketing surveillance data. And forums and feeds serve up endless opinions and resources.
As communicators, we often create content designed for patients’ real and metaphorical inboxes. While information is power, helping patients understand their own tendencies and how best to assess information is also powerful, especially as the healthcare system shifts away from a rigid “doctor knows best” framework and patients play a greater role in treatment decision making.
As human beings, our ability to process information is far more limited than the size of our inboxes. Cognitive biases, or mental shortcuts, allow us to reduce the cognitive burden of information overload. This is why the cognitive biases we use when evaluating information may be just as important as the type of information we receive. Yet while a good deal of attention has recently been given to how healthcare provider bias can influence patient care, most notably in the areas of gender, race and weight, less attention has been paid to how cognitive bias influences patient choice and behavior.
There are hundreds of cognitive biases that have captured the attention of psychologists, economists and even gamblers. And while the IKEA effect (the tendency to disproportionately value objects we assemble ourselves) may not matter much in healthcare, some biases are particularly illuminating.
In their book Your Medical Mind: How to Decide What Is Right for You, Jerome Groopman, MD, and Pamela Hartzband, MD, underscore how much mindset matters in healthcare. Among the numerous cognitive preferences they explore, Groopman and Hartzband spend a good deal of time digging into six common patient orientations that come into play during healthcare decision making. The idea isn’t that we fit squarely into one type or another, but that we each fall along three continuums: maximalist-minimalist, believer-doubter and technologist-naturalist. Naturalists, for example, tend to believe that the body can typically heal itself, whereas technologists tend to latch onto the latest scientific advancement. Another interesting concept is “availability bias”: the tendency to overestimate the likelihood of something happening after hearing a vivid or emotionally charged example of it happening to someone else. Availability bias can drive the type of thinking behind fear of vaccinations, or lead someone to put greater stock in the N-of-1 example they heard from their grandmother or online than in expert perspectives or clinical trial results.
Just like stress and fear, cognitive biases interfere with rational judgment and place a highly personalized lens between the people we develop content for and the content itself. By being aware of the range of cognitive biases people bring to the table, we can better understand our target audiences. And by providing patients with tools for understanding their personal default settings, we just may be able to enhance their ability to navigate complex and often conflicting healthcare information.