
Use Behavioral Science for Better Launch Success


Understanding human behavior can help you avoid pitfalls during launch.

Jacob Braude

Much of what goes into a successful launch—organizational structure, regulatory pathways, cross-functional alignment, customer and stakeholder insights, etc.—has one key thing in common: dependency on human behavior. Take a minute to think about a few launches you have firsthand or secondhand knowledge of. I’ll bet you can think of a behavioral barrier. Maybe it was a customer segment that said one thing in research but did something else in the market. Or maybe it was a team member who made decisions that proved costly, even though they had data that told them otherwise. Perhaps a team structure put people at cross-purposes and hampered your ability to make agile decisions.

All of these detrimental behaviors have a logic, but all of them are avoidable. Behavioral science can tell us how to avoid these pitfalls and how we can build and navigate our launch projects more successfully.

Where do these negative behaviors come from?

It starts, as many things do, with resource constraints. Rational decision-making takes a great deal of time and resources. Our brains simply can’t handle making all of our everyday decisions solely using this system. We’d bog down and grow exhausted. To compensate, we use another system in the background to aid our decision-making, a system designed for efficiency and speed. This second system uses automation and simple rules to speed things up. The problem arises in the bridge between systems. While there is logic behind your automated system, the logic doesn’t come across to your rational brain. Instead, you get an impulse or an emotion. And when these mental shortcuts are used in the wrong situations, they create biased thinking—otherwise known as “cognitive bias.”

Reciprocity bias is one of these mental shortcuts. Reciprocity bias is when you feel an urge to reciprocate if someone does something for you, even if you didn’t ask them to or didn’t want them to. For example, in one test, researchers more than doubled the survey response rate by adding a handwritten note asking the recipient to fill out the form. Many recipients felt they needed to reciprocate because someone had taken the time to write a request.

Another of these mental shortcuts is called the affect heuristic: The way you feel when making a decision can impact the decision you make. For example, researchers offered people the choice of a computer program or a music album as a reward. They found that people were about half-and-half in their choices. However, when they “accidentally” gave people both rewards and then told them they would have to give one back (introducing a negative affect) people overwhelmingly chose to keep the music album. People went for the feel-good choice when they were in a bad mood.

Now think back: How many times were people tired, grumpy or frustrated during a launch? How many of those feelings subtly tilted decisions to the easy or comfortable choice? Or, how many customers told you they saw strong demand for your new product, but then defaulted to writing the status quo? Some of that could be driven by reciprocity they felt—even subconsciously—toward the established brands.

How these behaviors affect healthcare

In a test we ran last year for a pharma client, we asked pulmonologists and cardiologists to imagine that they had diagnosed a patient with a progressive, terminal disease. We told them that, in this situation, they believed the patient should start treatment as soon as possible to slow the disease, but the patient was hesitant. Then we asked: How likely are you to push the patient to follow your clinical opinion and start treatment? Here’s the catch: we also told half of the physicians that, at the end of the exam, the patient shook their hand and gave them a heartfelt “thank you” for their care. The physicians exposed to this gratitude trigger were significantly less likely to say they would push the patient to treat. The need to reciprocate the warm feeling from the patient’s gratitude led them to be less forceful in advocating for their clinical opinion.
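
As a rough illustration of how a split test like this could be checked for a significant effect, the sketch below runs a chi-square test of independence on hypothetical counts. The numbers are invented for illustration only, and this is not the actual analysis used in the study described above.

```python
# Minimal sketch, not the actual study analysis: the counts below are hypothetical.
# It checks whether physicians exposed to a gratitude trigger answered the
# "would you push the patient to treat?" question differently from controls.
from scipy.stats import chi2_contingency

# Rows: stated answer (would push vs. would not push)
# Columns: study arm (control vs. gratitude trigger) -- hypothetical counts
observed = [
    [72, 48],  # would push the patient to start treatment
    [28, 52],  # would not push
]

chi2, p_value, dof, _expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")

if p_value < 0.05:
    print("The gratitude trigger is associated with a shift in stated behavior.")
else:
    print("No evidence the trigger changed the stated behavior.")
```

The same kind of check applies to any bias trigger you test with customers or teams: compare a triggered group against a control group and look for a real difference before acting on it.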

3 steps toward recognizing cognitive barriers

Launching a product successfully is hard enough without having to navigate these rules blindly—both with your teams and with your customers. Here are some steps you can start taking today to shed some light on these invisible barriers:

  • Know thine enemy: The first step is to know what you’re looking for. There are several great resources freely available to help you and your teams with the rather long list of mental shortcuts that have been well defined in the academic literature. The Cognitive Bias Codex is a great visualization of how the biases cluster when you think about what causes each bias to appear (advanced hint: biases cluster differently when you think about them in other ways, such as how they’re used strategically in communications). The next resource to share with your teams is the list of cognitive biases on Wikipedia. It will help you understand the basic mechanics behind each bias and give you a sense of just how much we rely on automation to aid our thinking.
  • Don’t be biased about biases: The second step isn’t obvious—but it is critical. Many biases will seem applicable to your situation but are not. We’ve run thousands of bias tests on patients, healthcare professionals and employees across all kinds of behaviors; typically, between half and two-thirds of the tests fail to show that the bias actively influences the behavior. In those tests, the bias triggers didn’t prompt a change in decision or behavior. Worse, sometimes the bias trigger backfired, driving people away from the desired behavior. The moral of this story is clear: Be scientific. Be prepared to test and learn. Look for evidence that the bias is influencing decision-making and try different interventions.
  • Judge not: The last step I’ll leave you with is important. People hear the term “cognitive bias” and think “bad,” especially if they’re hearing it in reference to their work or their clinical decisions. Cognitive biases are not bad; they just are. Every human on this planet relies on automation to aid their decision-making. Learning about cognitive bias is an opportunity to improve decision-making and, ultimately, outcomes for your teams and for patients. As you introduce behavioral science to your launch team, take the time to frame it properly, and you’ll see an engaged and eager team rising to the occasion.

Companies are beginning to understand the ways in which cognitive bias influences their outcomes. Identifying and mitigating the biases your teams have developed—on their own or as a group—is critical to making efficient and effective decisions. Likewise, understanding the biases inherent in a new customer base before bringing a product to market, and the role each bias plays in how people react to that product, is critical to rapid uptake. Teams that seek to understand behavioral science will outperform those that do not. The sooner you get started, the sooner your launch teams will start to reap the benefits.

Jacob Braude is the leader of ZS’s applied behavioral science team.
