Philosophy

A 'Crash Course' in Critical Thinking

I have decided to consolidate and repeat my “Critical Thinking” series in consecutive posts over the next couple weeks so that anyone interested can follow the posts and walk away from it with a greater appreciation of the value of critical thinking in our everyday lives.
 
This series is only a high-level overview of what critical thinking is. It’s not a workshop on how to master the skills of critical thinking. In the weeks ahead I will be summarizing the art and science of ‘critical thinking,’ and its role in positing, critiquing, and evaluating truth claims.
 
There are several valuable resources available for going deeper. Where to start? Here is a link to a helpful, low-cost introduction to critical thinking for anyone interested.
 
 
To get the most out of the series, you may follow the posts in the order presented (chronologically), and join in the discussion on my Facebook page. Click the links below to pick up where you left off.

Day:    1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34

1. "Why Critical Thinking Matters"

This is the 1st installment of my “Crash Course in Critical Thinking” series. It identifies the goal of critical thinking, and why it matters.

2. "Perception is Reality"

 
This is the 2nd installment of my “Crash Course in Critical Thinking” series. It deals with the popular myth that “perception is reality.” 
 
From the time of the ancient Greeks, we’ve known that perception alone is a woefully inadequate benchmark for uncovering reality.
 
Perception (the five senses) must always be tested against the available evidence to validate or dismiss the truth claim behind the appearance. 
 
This is the first step in critical thinking.

3. "Reasoning"

This is the 3rd installment of my “Crash Course in Critical Thinking” series. It deals with the role of reason in working through problems and in sorting through truth claims.
 

4. "Problem Solving"

This is the 4th installment of my “Crash Course in Critical Thinking” series. It describes how the disciplined application of critical thinking can both simplify and enable constructive problem solving.


5. "Operating Assumptions"

This is the 5th installment of my “Crash Course in Critical Thinking” series.

We all operate on an unconscious platform of assumptions and biases, many or most of which are unexamined and merely accepted as given. Critical thinking aims to correct this flaw in our approach to reasoning, problem solving, belief formation, and decision-making.

6. "Truth's Best Friend"

This is the 6th installment of my “Crash Course in Critical Thinking” series.

Critical thinking is about getting at the truth. While the truth comes as an enemy to those who refuse to greet it as a friend, it comes to no one who has not developed the skills of following the evidence wherever it leads.

7. "The Anchoring Bias"

This is the 7th installment of my “Crash Course in Critical Thinking” series.

Ever notice how, as a species, we just seem to go nonchalantly along with the latest “news” or idea, the most current fad or podcast, the most recent movement or campaign… with very little question? It’s called “anchoring bias” in critical thinking. It’s one of the biases critical thinking aims to relieve us from.

8. The "Sunk Cost" Fallacy

This is the 8th installment of my “Crash Course in Critical Thinking” series. It is closely related to the “anchoring bias” we covered in the last installment. Awareness of the fallacy and how to take control of it can save us from ultimately feeling overwhelmed and defeated.

We often find ourselves investing time, money, emotional energy, even our reputations and relationships, in causes, dreams, programs, or sure-fire schemes without ever really thinking it through… until we wake up one day to realize we are in over our heads, yet feel we are in too deep to do anything about it. This is called the “sunk cost fallacy” in critical thinking. And critical thinking shows us how it’s rarely too late to turn things around.

9. The "Halo Effect" Bias

This is the 9th installment of my “Crash Course in Critical Thinking” series.

Movie stars, sports heroes – indeed celebrities of all kinds, including popular politicians (popular often for wrong reasons, or popular with certain groups), trendy talking heads and media gurus, and even cult leaders, all share one thing in common: they possess an aura, a charisma their followers find irresistible. And there’s the rub.

No matter how popular or attractive, no matter how many followers they have or how many “Likes” on Facebook, critical thinking teaches us to always question, to always scrutinize and test the claims they make, the assumptions they embrace, and the motivations that animate them. How? By learning the basic skills of critical thinking.


10. The "Assumed Knowledge" Fallacy

This is the 10th installment of my “Crash Course in Critical Thinking” series.

Knowledge is a necessary cornerstone of reason, science, learning, self-improvement, and growth. We acquire knowledge through study, reflection, and applied experience. Once acquired, it becomes our possession.

Ask anyone how difficult it was to acquire their knowledge of, say, history or physics; of geometry or algebra; computer technology or video production; psychology, medicine, or law; of gardening, carpentry, or just about any art, craft, or science we largely take for granted. You are likely to hear that it required time, effort, and disciplined practice, often over a period of years. Yet, once acquired, it became part of their DNA, as it does ours.

Remember this the next time you are engaged in teaching someone a new art or skill, a new form of abstract thought or theory, or engaged in dialogue or heated debate. Just because you now know what you once did not is no justification for assuming the knowledge you now have is universally known or shared. That is the “assumed knowledge” fallacy in critical thinking.

11. The "Integrity Principle"

This is the 11th installment of my “Crash Course in Critical Thinking” series.

Integrity is defined as “moral soundness; honesty; unimpaired by corrupting influences.” This is both the condition and the outcome of authentic critical thinking.

One must possess integrity to pursue the evidence wherever it leads, even when it pits us against the madding crowd. To pursue the truth through critical thinking, we must not only be willing to stand alone against the prejudice of unexamined popular opinion and beliefs; we must be equally willing to accept the results of our pursuit of the truth, even when the results turn out to undermine or entirely discredit our own cherished beliefs. It requires integrity to stand that ground in the face of opposition, uncertainty, and doubt.

12. The "Integrity Corallary"

This is the 12th installment of my “Crash Course in Critical Thinking” series.

In yesterday’s installment of the series we covered what I call “The Integrity Principle” of critical thinking. Integrity, we might say, is the engine of critical thinking; the desire for truth, its fuel.

Today, we carry the Integrity Principle one step further. Not only does integrity demand that we follow the evidence wherever it leads (even if it ends up undermining our own cherished beliefs or requires us to stand against the currents of popular culture), its corollary demands that we defend unpopular opinions or truth claims when those claims have been rigorously investigated, tested, and found to be true.

13. "Skepticism & Objectivity"

This is the 13th installment of my “Crash Course in Critical Thinking” series. It identifies the two overarching principles of critical thinking: skepticism & objectivity.

Skepticism (as used here) is about entertaining a healthy and constructive suspicion of truth claims, data, or narratives that are presented baldly without falsifiable empirical evidence.

Objectivity is about the real world in which we live. A world where the laws of nature bend to no prescribed ideology or outcome. A world where truth claims can be tested without fear of corruption or mischief.

14. The "Self-Deluding" Bias

This is the 14th installment of my “Crash Course in Critical Thinking” series. It identifies one of the common foibles of human nature.

We have such a marvelously exaggerated estimation of ourselves. The reason adopting critical thinking as a way of life is so powerful is that it begins by focusing the searchlight of accountability on ourselves. It enables us to critically evaluate our own contributions to the results (good or bad) we produce.

15. The "Accountability Imperative"

This is the 15th installment of my “Crash Course in Critical Thinking” series. It follows from the last installment on the “self-deluding” biases we harbor, often unconsciously, and how critical thinking enables us to see these biases latent in ourselves.

It doesn’t take a genius or anyone skilled in critical thinking to hold others accountable. That comes naturally. But to hold ourselves accountable?

The first week of February 2021, I posted a series of musings on accountability, the first branch of which I defined as follows:

*  Accepting responsibility for our actions
*  Taking ownership of our mistakes
*  Being accountable for our results

Accountability is no stranger to the art of critical thinking. It is through the disciplined practice of critical thinking that we acquire a seasoned and mature awareness of our all-too-human limitations: our weaknesses as well as our strengths. It is only then that we can begin holding ourselves to account.

16. Distinguishing "Thought & Bias"

This is the 16th installment of my “Crash Course in Critical Thinking” series. It’s a further extension or alternate manifestation of the “self-deluding” bias identified in the 14th installment of the series posted on April 7, 2022.

Not one of us is immune from conscious and unconscious bias. It’s part of our DNA. Very few of us care to admit it, but that doesn’t change the fact we do. William James was right when he observed how we so often delude ourselves into thinking we’re thinking when all we’re really doing is rearranging our prejudices.

Critical thinking gives us the tools to critically examine our assumptions and thereby distinguish between our thought and our biases.

17. "Fast Thinking" vs. "Slow Thinking"

This is the 17th installment of my “Crash Course in Critical Thinking” series. It highlights the relationship between critical thinking and Daniel Kahneman’s “Thinking, Fast and Slow.”

Dr. Kahneman is an accomplished psychologist who also won a Nobel Prize in Economics for his work in “prospect theory” (making decisions in a climate of uncertainty). This makes him a worthy candidate for a 20th Century version of the Renaissance Man 🙂.

In the quote below, Dr. Kahneman identifies science as the search for objective truth: truth or reality that is mind-independent; that lies outside us. This he distinguishes elsewhere from subjective perception.

Decisions based on subjective perception or experience are more closely related to what he describes as “fast thinking,” which is quick, intuitive, and unreflective. Objective truth, on the other hand, is the goal of science and is more closely related to what he describes as “slow thinking,” which involves deliberate, conscious reasoning.

Critical thinking enables us to more easily distinguish between the two modes of thought. Most of the time in our everyday lives, “fast thinking” is all we need or want. But when important decisions must be made or serious action needs to be taken, it’s critical that we shift to “slow thinking.” This is where skill in critical thinking distinguishes itself.

For more about critical thinking and the purpose of this series, see my first post on this “Crash Course in Critical Thinking,” posted on March 26. To get the most out of the series, try following the posts in the order presented (chronologically).

For anyone interested in learning more about “fast thinking” and “slow thinking”, here is a link to Daniel Kahneman’s book “Thinking, Fast and Slow” on Amazon.

18. Asking the "Next" Question

This is the 18th installment of my “Crash Course in Critical Thinking” series. Perhaps more than anything else, getting this lesson, really getting it, will put you on the right path toward becoming a critical thinker.

Too often we simply accept at face value truth claims, nutty hypotheses, conspiracy theories, and a whole litany of misinformation we read or hear from friends, colleagues, family… or worse, our social media groups or “friends”, our echo chambers, and, yes, heaven forbid, even mainstream media, without question.

Next time someone makes a bold claim, in writing or in person, in a social media post or even in an article in the mainstream press, radio or T.V., ask yourself or them, “Where did you get that? What’s the evidence? Did you check the source? And what about that source’s source? How do you know that?” In other words, peel back the onion. Ask the next question, and the next one after that.

Will it make you a nuisance? You bet it will. But an informed, reliable nuisance others can depend on when the truth matters.

19. The "Why Not" Hypothesis

This is the 19th installment of my “Crash Course in Critical Thinking” series.

Yesterday we described knowledge as having the right answer; intelligence as asking the right question; and critical thinking as an instrumental extension of intelligence, i.e. always asking the next question. I also included a number of questions we might consider asking in our pursuit of knowledge, and validation of truth claims.

Today we go one step further. Assuming we have asked all the “How do you know that?” and all the “Why?” questions, what then? To test the hypothesis, we need to go one step further and ask, “Why not?” This last question will force us to examine our own assumptions and cognitive biases; it will encourage us to query our own unconscious biases; and finally, it will enable us to scrutinize the hypothesis or truth claim still more rigorously.

20. "Ockham's Razor"

This is the 20th installment of my “Crash Course in Critical Thinking” series. It offers insightful advice worth its weight in gold for anyone willing to take it seriously.

William of Ockham was one of the Middle Ages’ leading empiricist thinkers (long before Locke, Berkeley & David Hume). He was not a determinist. He believed that the natural order of things does not imply the inevitability of any predetermined outcome. We have to look at how things actually are (not at what we suppose they are or must be).

What he is best known for is a principle of reasoning that has come to be known as “Ockham’s Razor”. It argues that where two or more competing or alternative explanations exist for any given phenomenon or state of affairs, the simplest one (the one with the fewest assumptions) is more likely correct.

21. "Hanlon's Razor"

This is the 21st installment in my “Critical Thinking” series.

Yesterday I posted about one of critical thinking’s secret weapons: Ockham’s Razor. Today we post on another Razor: Hanlon’s Razor.

While no doubt tongue in cheek, Hanlon’s Razor exposes an all too common flaw in the “fast thinking” assumptions we so often adopt. We are far too quick to attribute the peculiar behaviour of others to malice or sinister motives, when that behaviour is often the result of unthinking or unexamined assumptions, or, yes, sometimes, outright stupidity.

It’s always smart to ask ourselves the question. This too is a useful strategy in the critical thinker’s arsenal.

22. The "Just World" Fallacy

This is the 22nd installment of my “Crash Course in Critical Thinking”. Today we tackle a truism many of us have a hard time accepting.

For some reason we often assume the world owes us a living, a “just” share of outcomes regardless of our individual effort, merit, or contribution… that any inequality of outcome is or must be the result of some form of injustice, bias, or cruel joke. It’s easier to accept such a rationalization for unequal outcomes than acknowledging that sometimes it might just have something to do with us. It might not be a cosmic or systemic conspiracy.

Critical thinking enables us to face the world as it is, to acknowledge that there are not always clean, simple answers to complex dilemmas or to troubled personalities. It behooves us to always reserve judgment until we’ve worked out the causes, and have engaged in sober self-reflection.

23. "Arguing from First Principles"

This is the 23rd installment of my “Crash Course in Critical Thinking” series. It identifies an inescapable dilemma in critical thinking: how to critically analyze and validate first principles, and whether that is even feasible or desirable.

Sometimes referred to as self-evident truths, as axiomatic, or as “clear and distinct ideas” (René Descartes), first principles are the foundational baseline of reasoning, behind which we cannot productively go. They are not deducible from prior hypotheses or axioms. They stand on their own.

One example, borrowed from Aristotle, is that good is always to be preferred over evil. This is an axiomatic, almost indisputable principle (defining what we mean by “good” and “evil” is where the controversy resides; not the principle itself). Another is to “Do no harm”. And still another: the “Golden Rule” itself. All of these are common examples of first principles that are not readily deducible from ancillary hypotheses.

So yes, critical thinking makes peace with first principles by recognizing that hypotheses must be drawn from somewhere; that it is always useful and necessary to question sources and test hypotheses. But not indefinitely. One of the virtues of critical thinking lies in its ability to recognize the futility of deduction where the inescapable result of that pursuit is infinite regress.

24. "The Key to Reliable Results"

This is the 24th installment of my “Crash Course in Critical Thinking”. Today we examine why we so often fail to produce the results we are looking for, despite our best efforts.

It often boils down to this: method. Possessing the knowledge or training, even the skills and experience, is often not enough. More often than not, it’s how we apply that knowledge or training, those skills and experience, that marks the difference between success and failure.

Critical thinking is a disciplined method, a reservoir of tools, attitudes, and cognitive techniques (including, for example, Ockham’s Razor) for identifying the root causes, unexpected curve balls, and opposition that all too often seem to conspire to thwart genuine progress and reliable outcomes. All other things being equal, the person who has learned the principles of critical thinking will always be ahead of the one who hasn’t.

25. The "Never Quit" Fallacy

This is the 25th installment of my “Crash Course in Critical Thinking” series. It’s one few people would associate with critical thinking. Yet, if we recall that critical thinking begins with a sober, critical awareness of ourselves, the pieces fall into place nicely.

Critical thinking is not about solving every problem, winning every argument, or acing every project. It’s about questioning every assumption, testing every hypothesis, and falsifying (or seeking to falsify) every claim. This includes the claims we take on board about ourselves, our insights & capabilities.

The “Never Quit” fallacy is a reminder that critical thinking and a sane estimation of ourselves are not mortal enemies. Quite the opposite. Critical thinking, rightly understood, engenders humility. It reminds us that we are human, and that sometimes the problem can’t be solved, the argument can’t be won, the project can’t be aced… or at least not now. It reminds us that sometimes “enough is enough”, and that’s okay.

26. The "Dunning-Kruger" Effect

 

This is the 26th installment of my “Crash Course in Critical Thinking”. Today we focus on one of the cognitive biases that is so subtle even Charlie Munger misses it in his list of 25 cognitive biases: the “Dunning-Kruger” Effect.

But first, what are “cognitive biases”? There are several popular, technical definitions one can look up. My definition, distilled from a long history of learning & applying critical thought, is this: they are the biases (assumptions or inclinations), often unconscious, that govern, drive or influence our decisions or reactions.

The “Dunning-Kruger” Effect is one I find particularly intriguing and relevant today. The most glaring example that comes to mind is the contemporary social media phenomenon where people, without having studied an hour of epidemiology, biochemistry or law, are instantly transformed into “experts”, knowing more than the scientists and other experts who have devoted their lives to studying the root causes and operations of the phenomena or issues under immediate consideration.

And how have these people become overnight “experts” on questions they have never studied formally or in any depth? From friends who “heard” or “read” it somewhere, of course. On Facebook, or Twitter, or YouTube… This is the stuff of the “Dunning-Kruger” Effect. Yet, just a moment of detached reflection will show that facts and science and study, and the rigorous application of critical thinking, still matter when the goal is the honest search for truth and reliable results, and not the garnering of political or ideological points.

27. Recognizing Confirmation Bias

This is the 27th installment in my “Crash Course in Critical Thinking” series. It focuses on perhaps the most common unconscious bias shared by people from all walks of life across our culture today, including politicians, lawyers, journalists, professors and students, consultants, politicians, administrators… even scientists, researchers, and engineers (did I mention politicians?): confirmation bias.
 
Unlike many or most unconscious biases (which are often driven subconsciously by politics, ideology or religion), confirmation bias is more subtle. It is grounded in our very human desire or felt “need” to be right, to be affirmed.
 
This desire, this felt need to be right, drives us (often unconsciously) to seek confirmation of our belief or opinion, and blinds us to evidence or rational arguments that might disturb or challenge our settled assumptions. Whenever we feel uneasy or uncomfortable about some idea or proposition that runs against one of our cherished beliefs or opinions (but don’t know why), and, instead of taking the time to critically assess that cherished belief against the threatening idea, we set out to find evidence to support it, we can know with moral certainty that “confirmation bias” is at work.
 
Critical thinking enables us to recognize confirmation bias, and gives us both the tools and the confidence to follow the evidence wherever it leads.
 
******************************************
 
Karl Popper, one of the 20th Century’s leading philosophers of science, believed in the value and necessity of seeking to falsify hypotheses rather than to confirm them. This put him at odds with the verification principle espoused by Moritz Schlick, Rudolf Carnap, and the logical positivists of the Vienna Circle in the 1920s & 30s. Here is his take on confirmation bias:
 
“The discovery of instances which confirm a theory means very little if we have not tried, and failed, to discover refutations. For if we are uncritical we shall always find what we want: we shall look for, and find, confirmation, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain what appears to be overwhelming evidence in favour of a theory which, if approached critically, would have been refuted.”
Karl Popper, The Poverty of Historicism.
 

28. The Falsification Imperative

This is the 28th installment of my “Crash Course in Critical Thinking”. It follows as a logical extension from yesterday’s post on confirmation bias.
 
We spoke yesterday of the “felt need” to be right, to be liked and respected for our opinions and beliefs, and how the very natural, human tendency is to seek confirmation to vindicate those beliefs and opinions. Yet this is exactly the wrong approach if getting at the truth is our goal. Why? Because our beliefs and opinions (truth claims) can be, and often are, wrong, or at least not entirely correct. And if they are wrong (or not entirely correct), how are we to know if we simply fall into the trap of seeking evidence or arguments that serve only to confirm that belief or opinion?
 
The answer, as Karl Popper, one of the 20th Century’s leading philosophers of science, powerfully argues, is to seek to falsify the belief, opinion or truth claim, rather than to confirm it. One can “confirm” a hypothesis 100 times, or 1,000 times, in science and still fall short of certainty that the hypothesis is correct or true. Falsify it once, and that is all you need to know: it was not true, or at least not entirely correct.
 
Popper believed in the value and necessity of seeking to falsify hypotheses rather than to confirm them. This put him at odds with the verification principle espoused by Moritz Schlick, Rudolf Carnap, and the logical positivists of the Vienna Circle in the 1920s & 30s. Here is his take on falsification from his book “The Logic of Scientific Discovery”:
 
“The point is that, whenever we propose a solution to a problem, we ought to try as hard as we can to overthrow our solution, rather than defend it. Few of us, unfortunately, practice this precept; but other people will, fortunately, supply the criticism for us if we fail to supply it ourselves.”
 
This is the “falsification” imperative, one of the central lessons of applied philosophical reason (along with his doctrine of demarcation) that we owe to Karl Popper, and one of the most effective tools in the critical thinking arsenal.

29. The Data Imperative

This is the 29th installment of my “Crash Course in Critical Thinking” series. It follows logically from the last two posts, and from the influence of Karl Popper on my own approach to thinking.
 
On April 23rd we looked at “confirmation bias”; on April 24th we examined the “falsification imperative”. One inescapable conclusion emerges from these posts: You can’t “confirm” a belief or truth claim without appealing to an outside source; you can’t “falsify” a hypothesis or empirical claim without relying on relevant data. Data, i.e. sound, impartial, rigorously tested data, is the sine qua non of critical thinking.
 
Yet data alone, without context or applied reason, is a dead letter. Data must be sought and researched, its sources examined and questioned; and if found presumptively reliable, critically analyzed… then rejected, modified, or adopted, in whole or in part. Only then can you claim with any authority to have provisionally established your hypothesis.

30. The "Strawman" Gambit

This is the 30th installment of my “Critical Thinking” series. It zeroes in on one of the more disingenuous gambits that a person engaged in debate can default to when they find themselves on the losing end of an argument (rather than facing up to the real possibility that they just might be wrong).
 
It involves the deliberate misrepresentation or exaggeration of an opponent’s argument. It is on a par with pesky “red herrings” (irrelevant information or side issues introduced into a dialogue for the purpose of distracting the parties from examining the real issue).
I’ve run into the “strawman” gambit a few times recently with people who don’t share my views or perspective about certain political or cultural issues, including:
 
  • The necessity of reasonable Covid-19 protocols, vaccinations and mandates;
  • Whether the 2020 U.S. election was “stolen” from Donald Trump;
  • The recent occupation of the Nation’s Capital by “protestors” who had little to no regard for the harm they were causing local residents & business owners by their actions;
  • The dangers of “Critical Theory” (not to be confused with “critical thinking”); and more…
I also run into it from time to time in my law practice. Wherever it rears its head, it is the art of the disingenuous; a refuge of the weak.
 
The key here is “deliberate” misrepresentation.
 
It is not the “Strawman Gambit” when the other party *honestly* mischaracterizes the position he or she is attacking. But if he or she has not taken the time to consider and evaluate the argument they are attacking, and has elected instead to construct and attack a caricature of the other’s argument rather than face the uncomfortable prospect of being found to be wrong, he or she has sought refuge in the “Strawman Gambit”.
 
The opposite of the “Strawman Gambit” is the “Steelman Solution”. I will post on this one next time.
 

31. The "Steelman" Solution

This is the 31st installment of my “Critical Thinking” series.
 
The “Steelman Solution” is the very antithesis of the “Strawman Gambit”. It is a far more constructive model of engagement when the parties are genuinely interested in knowing the truth as opposed to winning the argument or scoring points. Note, I said “constructive”, not easy.
 
One of the great models of “steelmanning” an opponent’s arguments was the 13th Century’s leading scholastic philosopher-theologian, St. Thomas Aquinas (in his “Summa Theologiae”). It involved posing the question generically along the following lines, for example: “Whether the world has always existed?” (It always took the form of a “Whether…” question.) Aquinas then put forward the strongest argument he could muster for his opponent’s position, as fully and fairly as he could. Only then would he attack the argument (often very differently than he might otherwise have done, understanding his opponent’s position far better now than he had before).
 
“Steelmanning” requires the discipline of applying the robust “critical thinking” skills we have been learning in this series since Day 1, starting with the awkward acknowledgment that we are not always right, that we can be wrong, and that, as ‘unlikely’ as it might be, we might actually learn something worthwhile from our opponent if we work at “steelmanning” his or her argument.
 
I said “not easy but constructive” earlier. Why? Because, although I am aware of its value to my own learning, I don’t always take the time to do this myself. And I know better! Often when I do take the time, I do it more for strategic advantage (not to narrow the epistemic divide between myself and my opponent or interlocutor).
 
Occasionally I refuse to steelman the opposing party’s argument because, well, I “know” from my own experience, research or reflection that I am “right” and my opponent is, well, “wrong”. And how do I “know” that? I don’t. Yet even then, I lose something. I lose the opportunity to learn more, to fill in the small, and occasionally large, spaces in my knowledge.
 
But perhaps the greatest loss of all is that I miss the chance of allowing my opponent to be heard. In that, we all lose.

32. The "Knee-Jerk" Effect

This is the 32nd installment of my “Crash Course in Critical Thinking”. It addresses a natural, very common reaction so many of us have when we feel our liberties being arbitrarily restricted.
 
We’re spoiled. With very few exceptions, this is a reaction experienced only in the Western world. To us, freedom or liberty is like the air we breathe. Try imposing arbitrary restraints on us and watch: there’s an almost automatic, “knee-jerk” reaction. When we feel this starting to take place, we would do well to pause and ask ourselves a few simple questions: Who is doing this to us? Why? And is it justified?
 
The “Who” question does not require critical thinking skills. It is normally a straightforward question of fact: A parent? A teacher? A police officer? The government? The “Why” question engages the more probative side of our psyche. While critical thinking skills are no doubt useful to the “Why” question, they are not necessarily required in most cases, as the enquiry doesn’t necessarily require questioning of underlying assumptions. That falls more into the domain of the next question: “Is it Justified?”
 
To probe this question effectively does require the application of critical thinking because it requires a bringing together of the answers to all the preceding questions for critical analysis in context, including the rigorous questioning of assumptions; a recognition that the answer might be larger than ourselves; and willingness to critically examine our reactions in the face of the evidence.
 
Case in point: Covid-19 vaccine mandates and the so-called “Freedom” Convoy. The vaccine mandates and other protocols were introduced by government leaders around the world, based on the advice of scientists, researchers, and medical experts. They were introduced for public health and safety reasons. According to statistics tracked by Johns Hopkins University, as of April 27, 2022, there have been over 512 million infections worldwide, and over 6,240,000 deaths. These are very real numbers; very real lives lost and broken.
Were all the mandates and other protocols justified? No. Many were politically motivated. But differentiating between those that were and those that were not justified requires critical analysis based on the data and the science. Reasonable constraints to protect vulnerable people from serious illness or death in a liberal democracy constitute justification.
 
Anti-vaxxers and others gathered in Ottawa in the name of “Freedom”, claiming Charter rights. The problem is that most of them never read the Charter. They don’t understand the concept of “reasonable limits” on constitutional rights and freedoms. For this they can’t be faulted. Not everyone has had the benefit of a legal education (for which I am grateful). But for the following I do fault them: for not recognizing that freedom is not a license to do whatever we want, wherever we want, to whomever we want, without regard for the consequences, including the rights and freedoms of others.
 
Freedom implies responsibility, just as surely as rights imply duties. This seemed lost on them. No critical thinking anywhere to be found. Just a predictable, unexamined, “knee-jerk” reaction to perceived arbitrary constraints.
 
Critical thinking encourages and disciplines us not to give in automatically to these “knee-jerk” reactions. On the other hand, if, following careful application of critical thinking and a fair review of the evidence, we find the constraints on our liberties to be indeed arbitrary and not merely inconvenient, then the discipline and skills of critical thinking will equip us to challenge those constraints intelligently and responsibly.

33. "It's Not About Feelings"

This is the 33rd installment of my “Crash Course in Critical Thinking”. It is also my second to last.
 
And I have only scratched the surface. Yet I hope that, for the few of you who have followed the series, I have whetted your appetite enough to learn more about the art and science of critical thinking.
This post brings us back full circle to the beginning of our journey, and the gold thread that has animated the series throughout: that we ought never to accept at face value claims presented as true, without evidence or scrutiny, no matter how we feel about the truth claim itself, or how much it validates our beliefs or makes us feel good.
 
One of the pioneers of the 18th Century Western liberal Enlightenment was Denis Diderot (1713-1784), best known as the co-founder and chief editor of the first modern encyclopedia, aptly referred to as the Encyclopédie. He earned his degree in philosophy at the University of Paris, and managed to offend just about everyone in the France of his day (beginning with the Catholic Church and government authorities) for his dogged, uncompromising pursuit of truth through evidence and science.
In his Introduction to the Encyclopédie, Diderot wrote: “All things must be examined, debated, and investigated without exception, and without regard for anyone’s feelings”. Ouch! Not a popular sentiment in this rapidly deteriorating, postmodern culture, where “feelings” are King, and “lived experience” is valued as highly, if not more highly, than empirically discovered, tested, and reasoned knowledge in a growing number of domains, including, most surprisingly, academia, journalism, and law.
 
Yet it’s been said, and with much justification, that “facts don’t care about your feelings”. And they don’t; nor should they. Facts are facts. They are our connection to the real world as it is. This is not to say that people shouldn’t care about feelings. Of course they should. Feelings matter. The feelings of others matter (no less than our own). We ought never to run roughshod over them. At the same time, we ought never to allow anyone’s feelings (including our own) to thwart, sabotage, or determine our search for, and interpretation of, the facts, or to otherwise hijack our search for the truth.
 
Critical thinking enables us to cut through the noise of emotion-driven narratives. It enables us to follow the evidence wherever it leads, even if the upshot of that search produces facts or outcomes that hurt people’s feelings. But this should never be the object. It serves no one to trivialize the reality or depth of another’s pain, fear, heartbreak or brokenness. Empathy, compassion and love are the tools we are given to bring healing and wholeness to the pain, loss and brokenness of family, friends, even strangers and enemies. Yet once the feelings have been acknowledged, and human agency has intervened to bring comfort and support, the facts that produced the feelings must still be reckoned with. Here is where the tools of critical thinking find their place. It is in this space, where reason and empirical analysis are allowed to work, that truth can be found, and healing can begin.
 
 

34. The "Key Attributes and Conditions of Critical Thinking"

This is the 34th and *final* installment of my “Crash Course in Critical Thinking”. In it I distill the key attributes and conditions of critical thinking.
 
Thank you to those who came along for the ride and supported me through the series. I hope you found it useful, and that it at least piqued your curiosity to know more about the fading art and science of critical thinking.
 
Learn critical thinking; and learn to apply it. If you do, no one will ever be able to bamboozle, gaslight, or seduce you again with unexamined political or ideological mantras. It will give you confidence to stand your ground under nearly any condition. It’ll teach you the importance of never accepting anything at face value unless you know the source to be reliable (and even then only with a healthy grain of salt). It will teach you that rapidly disappearing art, one that stands today on the precipice of extinction: Thinking. Thinking rationally, slowly, and critically.
 
This series has been a high-level overview of what critical thinking is. It has not been a workshop on how to master the skills of critical thinking. There are several valuable resources available for going deeper. Where to start?
 
Here is a helpful, low-cost introduction to critical thinking for anyone interested:
 
Wishing you well in your journey.
 
Robaire