
The invisible barriers: Cognitive bias in the world of innovation


The Importance of Innovation Management

Image: No, I don't have any bias, the others do


In the dazzling world of innovation and creativity, the opportunities for progress seem almost infinite. Yet as we strive to constantly expand the boundaries of what is possible, we often remain blind to the invisible barriers that shape our thinking processes. These invisible barriers are cognitive biases: deeply ingrained thought patterns that unconsciously influence our decisions and judgments.


Innovation processes, whether in business, technology, or society, are not immune to these biases. On the contrary, they can significantly impact creative development by blocking the path to new ideas and solutions. From confirmation bias and anchoring bias to cultural and gender bias, these cognitive distortions play a significant role without us consciously realizing it.


A non-exhaustive overview of types of cognitive biases:


1. Confirmation bias:

Example: A researcher who only looks for studies supporting their hypothesis and ignores those contradicting it.

Reference: Nickerson, R.S. (1998). Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology, 2(2), 175–220.


2. Groupthink:

Example: A team of engineers that, due to pressure from management or their own technological enthusiasm, neglects to discuss an innovative or less popular solution.

Reference: Janis, I. L. (1972). Victims of groupthink. Boston: Houghton Mifflin.


3. Anchoring bias:

Example: A team member mentions the high price of a premium smartphone as a reference point, anchoring the team's perception of an appropriate selling price.

Reference: Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131.


4. Status quo bias:

Example: An organization resists adopting new technologies because it prefers the current way of working.

Reference: Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7–59.


5. Availability heuristic:

Example: In product development, teams tend to favor proven ideas from past projects due to the availability heuristic, even if new innovative approaches might be more effective.

Reference: Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.


6. Framing effect:

Example: A new product launch benefits from the framing effect when the product is presented as "revolutionary" rather than "evolutionary," which positively shapes the perception of its innovativeness.

Reference: Tversky, A., & Kahneman, D. (1981). The Framing of Decisions and the Psychology of Choice. Science, 211(4481), 453–458.


7. Overconfidence bias:

Example: A technical product developer is convinced that their innovative idea will overcome all market challenges, without talking to users or considering potential obstacles.

Reference: Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases (pp. 306–334). Cambridge University Press.


8. Technology bias:

Example: Assuming that a newer software version is automatically better without checking its actual value for users.

Reference: Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192–222.


9. Cultural bias:

Example: Evaluating product features based on Western cultural preferences without considering the diversity of global markets.

Reference: Hofstede, G. (1980). Culture's Consequences: International Differences in Work-Related Values. Sage.


10. Gender bias:

Example: Underestimating the technical abilities of a female engineer due to gender stereotypes.

Reference: Eagly, A. H., & Karau, S. J. (2002). Role Congruity Theory of Prejudice Toward Female Leaders. Psychological Review, 109(3), 573–598.


11. Attribution error:

Example: Attributing poor performance of a team member to their lack of skills instead of the challenging task.

Reference: Ross, L. (1977). The Intuitive Psychologist and His Shortcomings: Distortions in the Attribution Process. Advances in Experimental Social Psychology, 10, 173–220.


12. Self-serving bias:

Example: A team attributing project success to their own skills, while blaming external factors for failures.

Reference: Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82(2), 213–225.


13. Hindsight bias:

Example: Team members believing, after completing a project, that its success was predictable, despite not considering uncertainty during planning.

Reference: Fischhoff, B. (1975). Hindsight ≠ Foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288–299.


14. Sunk cost problem:

Example: A company continues a failing project to justify the resources already invested, even though abandoning it would be the better decision.

Reference: Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.


15. Timing or synchronization bias:

Example: In the development of Artificial Intelligence (AI), timing or synchronization bias may occur when an AI product enters the market amid widespread hype around AI technologies. Public attention and enthusiasm can make the product appear more groundbreaking and successful than it actually is, simply because its launch coincides with the general excitement. This bias can lead to overrating the product's actual performance compared to innovations that do not benefit from the same attention.

Reference: Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232; Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292.


It is important to note that the references above are foundational works on the concepts behind each bias. Many more research papers and articles delve into these phenomena, but covering them would exceed the scope of this article.


Yetvart Artinyan

P.S.: Do you want to know more about how to make your innovation project successful and avoid typical pitfalls?

  1. Extend your team and knowledge on a temporary or permanent basis: Contact me for a conversation.  

  2. Transfer the knowledge: Book one of the innovation bootcamps 

  3. Get a keynote on this topic for your organization: Book a keynote now
