
False Dilemma in Leadership: Why Either/Or Thinking Shrinks Good Judgment

  • Writer: Joseph Conway
  • 6 min read
Image: A realistic executive boardroom scene showing two opposing options and a third evidence-based path in the center, symbolizing false dilemma bias and cognitive flexibility.
The most dangerous trap in decision-making is not always the answer. Sometimes it is the question that makes only two choices visible.

False Dilemma in Leadership: The Trap Is the Question


Someone walks into the meeting and says it like it is obvious.


Either we move fast, or we protect quality. Either we prioritize equity, or we reward merit. Either we lead with strength or with compassion. Either we trust faith, or we trust science.


The room tightens.


People start picking sides before they start thinking.


That is the trap.


Not reality. The shape of the question.


A false dilemma in leadership, also called a false dichotomy or either/or fallacy, presents two options as if they are the only ones available when other real options exist. In the research literature, it is tied to reasoning from incompatibility, in which people are pushed to treat a narrowed frame as the whole field.


Why this fallacy lands so hard


Because the brain likes speed.


Adele Diamond’s review of executive functions explains that cognitive flexibility is one of the core executive functions. It helps people think outside the box, see things from different perspectives, and adapt to changed circumstances. When that flexibility drops, thinking narrows.


That is not just theory. A 2018 Memory & Cognition study found that false-dilemma-style reasoning showed up in a large share of participants, and people were less likely to fall for it when they could retrieve more alternative options from memory. Put plainly, the more “third options” people could think of, the less grip the fallacy had.


The pull is subtle, then suddenly heavy. It feels like clarity. It is usually compression.


Your brain is not broken. It is being efficient.


Binary thinking is fast. In an actual emergency, fast can save your life.


Threat or not. Run or stay. Move or freeze.


That is useful in danger. It is less useful in leadership.


Research on executive function ties cognitive control to the prefrontal cortex and distinguishes it from more automatic processing. When stress rises, people tend to rely more on fast, habitual patterns and less on flexible, goal-directed thinking.


So when a leader feels pressure, the brain often looks for the choice, not a wiser frame.


That is why false dilemmas thrive in tense meetings, culture fights, church conflict, crisis communications, board decisions, and performance conversations. Stress does not always sharpen thought. Sometimes it shrinks it.


The lie hidden inside “either/or”


The false dilemma fallacy does not always scream.


Sometimes it whispers in polished language.


“We can either be inclusive or be excellent.”

“We can either support employees or hold people accountable.”

“We can either honor tradition or face reality.”

“We can either protect relationships or tell the truth.”


That is not reasoning. That is stage design.


Two options get spotlighted. Everything else gets shoved backstage.


The fallacy works by foregrounding two options and backgrounding the rest until they nearly disappear.


And once that happens, smart people start fighting over a cramped map.


Why smart people fall for it


Not because they are foolish.


Because they are pressured.


The higher your role, the more social pressure there is to look decisive. Sometimes leaders are rewarded for speed so often that they stop noticing when speed becomes intellectual corner-cutting. Status can make it more costly to say, “I think the question itself is wrong.”


That is where metacognition matters. Cognitive bias research consistently shows that noticing your own thinking changes the quality of your decisions. Executive-function research points in the same direction: better inhibition and flexibility help people interrupt automatic responses and adapt more effectively.


In plain English, intelligence is not enough. You need enough self-awareness to catch your own mind reaching for the nearest box.


False dilemmas do special damage in trauma-shaped environments


This part matters.


For some people, either/or thinking is not just a logic problem. It is a survival pattern.


If you grew up in chaos, control, exclusion, racism, abuse, spiritual manipulation, or chronic instability, your nervous system may have learned that choices really were narrow. Speak or stay safe. Obey or get punished. Agree or get pushed out.


SAMHSA defines a trauma-informed approach as one that realizes the widespread impact of trauma, recognizes its signs and symptoms, responds by integrating that knowledge into practice, and actively resists retraumatization. SAMHSA also centers safety, trustworthiness, collaboration, empowerment, voice, and choice.


That means trauma-informed leaders do not use fake binaries like blunt instruments.


They widen the frame. They lower threat. They make room for voice. They restore choice.


That is not soft. That is what responsible power looks like.


The both/and is not weakness


Let’s kill a bad idea while we are here.


Both/and thinking is not indecision. It is not fence-sitting. It is not moral fog.


Sometimes life really is either/or. Some decisions are true trade-offs. Some doors close other doors. Reality can be hard like that.


But not nearly as often as people claim.


Often the third option is not compromise. It is reframing.


Not faith or science.

Try faith-informed values with science-informed methods.


Not equity or merit.

Try fair systems that let real merit become visible.


Not strong or compassionate.

Try clear expectations delivered with dignity.


Not speed or quality.

Try iteration with triage: move fast on what is reversible, slow down on what is not.


That move matters. The third option does not simply split the difference. It changes the question.


Neuroplasticity is the good news in this whole mess


The brain can learn.


A 2024 review in Frontiers in Human Neuroscience describes cognitive flexibility as the ability to adapt thinking and behavior to changing demands and notes its links to resilience, well-being, and better adaptation. The authors also review evidence that flexibility can be shaped through different kinds of assessment and intervention, though the field is still refining exactly how it should be measured.


That means false-dilemma thinking is not a fixed character flaw.


It is a habit. Habits can be trained. Neural pathways that get used get stronger.


The London taxi-driver example makes that point, and the research backs it up. Studies comparing London taxi drivers with other drivers found greater posterior hippocampal volume in taxi drivers, a classic example of experience-linked brain change tied to intensive navigation learning.


Different skill. Same principle.


Use the circuit. Strengthen the circuit.


Practice third-option thinking enough, and your mind gets quicker at spotting when a question is rigged.


How leaders can break the false dilemma in real time


Here is the practical play.


  1. Name the frame

    Say it plainly.

    “We are acting like these are the only two options.”

    That sentence alone can put oxygen back in the room.

  2. Ask, “What is missing?”

    Not which side do I prefer? What got left out?

    What assumptions are hiding underneath this framing? Who benefits if we accept this binary? What possibilities were never named?

  3. Force a third option

    Do not wait for one to appear like lightning from heaven.

    Require it.

    “This is framed as A or B. What’s C?” 

    That question does something powerful. It interrupts compliance.

  4. Regulate the nervous system

    You cannot do nuanced thinking while flooded.

    Sleep loss, chronic stress, pressure, and psychological threat all make narrow thinking more likely. Executive-function research and trauma-informed practice both point the same way: safety and regulation improve access to higher-order thinking.

    Sometimes the smartest move in the meeting is not a hotter argument. It is a slower room.

  5. Build a richer memory bank

    The Brisson study, the 2018 Memory & Cognition research cited earlier, matters here. People who could retrieve more alternatives were less vulnerable to false dilemma reasoning. That means reading widely, listening across difference, and exposing yourself to more than one kind of framework is not just nice. It is cognitive armor.

    Diversity is not only moral. It is mentally useful.

  6. Reward people who widen the frame

    If someone says, “I think there may be another option,” and the room treats them like they just coughed in the communion cup, do not expect innovation later.

    Psychological safety research has long shown that teams learn better when people can raise problems, ask for help, share information, and talk about errors without being punished for it. Google’s team-effectiveness work likewise identified psychological safety as a key condition for strong team performance.


If leaders punish nuance, teams will perform certainty instead.


What this means for leaders


Leaders set the mental weather.


If you frame every hard issue as two opposing camps, do not be shocked when your culture turns tribal. If you reward fast answers over better questions, do not be shocked when people stop thinking out loud. If you corner people with false choices, do not call their shutdown “lack of engagement.”


Sometimes the biggest bias in the room is not in the answer.


It is in the menu.


The question itself might be the trap. 


That is the work.


Not just choosing faster. Framing better.


Call to action for leaders


For the next 30 days, make this a rule in every major decision meeting:


No binary gets accepted until the room generates at least one serious third option.


Not a joke option. Not a token option. A real one.


Then ask:

  • What did the original frame hide?

  • What fear made the binary attractive?

  • What would a safer, wiser, more honest version of this question sound like?


Do that long enough and something starts to change.


Your meetings get less reactive. Your people get less performative. Your thinking gets less brittle. Your culture gets harder to manipulate.


And that is the point.


Because false dilemmas do not just weaken decisions. They shrink human possibility.


Leaders are supposed to do the opposite.
