
Clarity in Uncertainty, Ambiguity Through Behavioural Economics

Updated: Aug 12

What is the difference between uncertainty and ambiguity to you?


Under the framework of behavioural economics, there is a concrete way to define each. The distinction also parallels the kind of structured thinking that I aim to cultivate in my conversations with students: thinking that goes beyond “I know this” or “I do not know this” and instead pinpoints exactly where understanding needs improvement.


When students face a new problem, “I know this” or “I do not know how to do this” is not enough for us. I want them to identify the exact point where their reasoning starts to break down. One tool I use is a 0 to 10 scale. Zero means “I know how to do this, just give me two minutes to work through it.” Ten means “I do not recognize a single term here, where do I begin?” That is my cue to start introducing new concepts. Somewhere in between, five means “I recognize some terms, or maybe all of them, but I cannot connect the dots yet. Let me try.” This spectrum mirrors the distinction between uncertainty and ambiguity.


Ellsberg Paradox

Uncertainty is like drawing a ball from an urn, knowing there are exactly 50 red and 50 black balls inside. We do not know which colour we will draw, but we know the odds, much like flipping a fair coin. Ambiguity, on the other hand, is when we know there are 100 balls in the urn, all of them either red or black, but we do not know the ratio. It could be evenly split like 50:50, or it could be 99 red and 1 black, or anything in between. We simply do not know.


A level 10 problem often feels like ambiguity. It is just a block of text with no foothold and no clear way to begin. A level 5 problem feels more like uncertainty. We know that we do not know the answer, but we have enough information to make educated guesses and eliminate options in a multiple choice setting. With ambiguity, the first step is to define the terrain, to start by answering what we do not know, for example, “what is an exponential function?” With uncertainty, we can move directly into strategy and start eliminating answers that do not feel right in an exam environment.


This distinction is central to one of the most famous experiments in behavioural economics, the Ellsberg Paradox. Daniel Ellsberg, an American economist and former military analyst, designed a thought experiment showing that people tend to prefer known probabilities over unknown ones, even when logic suggests they should be indifferent: faced with the two urns above, most people bet on the urn with the known 50:50 split rather than the one whose red to black ratio is unknown. His 1961 paper on ambiguity aversion has been replicated in many academic settings, and the pattern shows up everywhere, from how investors choose between asset classes, to how managers evaluate incomplete market data, to how we decide whether to try a new restaurant without reviews.
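We can make the indifference argument concrete with a small simulation. This is a rough sketch, not Ellsberg’s experimental protocol, and it assumes one extra thing the thought experiment leaves open: that every possible composition of the ambiguous urn is equally likely.

```python
import random

def draw_known_urn():
    """Uncertainty: 100 balls, and we know the split is 50 red, 50 black."""
    urn = ["red"] * 50 + ["black"] * 50
    return random.choice(urn)

def draw_ambiguous_urn():
    """Ambiguity: 100 balls, but the red/black split is unknown.
    Assumption (mine, not the paradox's): every split from 0 to 100
    red balls is equally likely, drawn fresh each time."""
    n_red = random.randint(0, 100)
    urn = ["red"] * n_red + ["black"] * (100 - n_red)
    return random.choice(urn)

def win_rate(draw, bet="red", trials=20_000):
    """Fraction of draws that match our bet, over many independent trials."""
    return sum(draw() == bet for _ in range(trials)) / trials
```

Under that assumption, both win rates hover around 0.5, so a bettor who cares only about the odds should be indifferent between the urns. The paradox is that, experimentally, most people still pay a premium to avoid the ambiguous one.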


Perhaps your gut tells you that trying a new restaurant without reviews is risky, but the formal structure suggests there is no statistical difference between the two bets. Yet those with a curious mindset who are eager to try new things are often rewarded with fresh experiences, at least in learning.


Tolerance of the Unknown

In my conversations with students, I encourage them to test the limits of their thinking and stretch their imagination. In a warm, encouraging environment, they learn that small mistakes are not only acceptable but also essential to growth. This stretches their mental resilience over time.


In earlier posts on mental resilience and thinking out loud, I described how creating a safe space for exploration changes how students approach problems. They begin to see the upside of mistakes as part of the learning process. This mindset also applies in high stakes, time limited tests. Sometimes, taking a calculated risk is the rational move, and a rational move, in this context, is one where the potential upside outweighs the potential downside, based on the odds and the scoring rules. For example, if we can eliminate one option in a four option multiple choice question, and the penalty for guessing wrong is outweighed by the potential score from guessing right, then the math suggests we should take the gamble. The scoring rule can change, but the fundamental idea is that we know the expected values and use them to guide our decisions.
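The guessing arithmetic above can be sketched in a few lines. The rubric here is an illustrative assumption, since the post notes that scoring rules vary: +1 for a correct answer and −1/3 for a wrong one, a classic penalty-for-guessing scheme.

```python
def guess_expected_value(options_left, reward=1.0, penalty=-1/3):
    """Expected value of guessing uniformly among the remaining options,
    under an assumed rubric of +1 right, -1/3 wrong."""
    p_correct = 1 / options_left
    return p_correct * reward + (1 - p_correct) * penalty

# Four options, nothing eliminated: EV = 1/4 - (3/4)(1/3) = 0, a coin toss.
# Eliminate one option, guess among three: EV = 1/3 - (2/3)(1/3) = 1/9 > 0.
```

Under this particular rubric, eliminating even a single option tips the expected value positive, which is exactly when the calculated risk becomes the rational move.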


Decision Making Under Uncertainty

If you are curious about decision making under uncertainty, behavioural economics has uncovered many other biases. One example is optimism bias, the tendency to lean towards the upside, focusing more on what could go right than on what could go wrong. This is where it helps to recall a few formal terms. A risk neutral decision maker evaluates choices purely on their expected value, the probability weighted average of the outcomes you would get if you repeated the decision many times. A risk averse person prefers a certain outcome over a gamble with the same expected value, while a risk loving person prefers the gamble. Notice that risk preferences are defined by holding the expected value constant. In fact, behavioural economics can measure the degree to which people prefer the gamble over something fixed. Biases like optimism bias describe ways we deviate from this neat framework, and behavioural economics at large catalogues the kinds of irrational behaviours that deviate from the mathematically correct answer.
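These three preference types can be sketched with utility functions. The specific functions below are illustrative choices of mine, not from the post: a concave utility (square root) models risk aversion, a convex one (squaring) models risk loving, and the identity models risk neutrality.

```python
import math

def expected_utility(utility, outcomes):
    """Probability-weighted utility of a gamble given as [(probability, payoff), ...]."""
    return sum(p * utility(x) for p, x in outcomes)

# A 50:50 gamble between 0 and 100, versus its expected value of 50 for sure.
gamble = [(0.5, 0.0), (0.5, 100.0)]

risk_neutral = lambda x: x             # values the gamble at exactly its EV of 50
risk_averse  = lambda x: math.sqrt(x)  # concave: the sure 50 beats the gamble
risk_loving  = lambda x: x ** 2        # convex: the gamble beats the sure 50
```

Holding the expected value at 50 in every case, only the shape of the utility function changes the choice, which is precisely how the framework isolates risk preference from the odds themselves.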


While that might sound irrational in textbook terms, in life, it often drives collaboration, creativity, and the kind of positive “what-if”s that keep people trying new things and learning from that process. Just like in our ambiguity and uncertainty spectrum, the way we face the unknown shapes not only our decisions but also our growth.


Whether we begin by defining the terrain or move straight into strategy, each choice is a chance to sharpen judgment, deepen understanding, and step into a new world to be explored.



References


  1. Ellsberg, Daniel. Encyclopaedia Britannica. Retrieved from https://www.britannica.com/biography/Daniel-Ellsberg

  2. Ellsberg, D. (1961). Risk, Ambiguity, and the Savage Axioms. Quarterly Journal of Economics, 75(4), 643–669. https://doi.org/10.2307/1884324

  3. Sharot, T. (2011). The Optimism Bias: A Tour of the Irrationally Positive Brain. New York: Pantheon Books. Summary at https://en.wikipedia.org/wiki/Optimism_bias


 
 
 
