Joshua SMITH

Executive Trainer & Edtech Co-founder @ Coursely.eu. Head of Higher Education Partnerships & Adjunct Teacher Recruiting in France.


120. Critical Thinking: How & When To Override The Autonomous Mind

4 important lessons from this lecture:

00:00:39 Information pollution in the internet age: freedom of speech, combined with the ability to instantly publish anything you want without verification, means that information accumulates rapidly in every field of study. How can you survive an environment with so much contaminated information?

[EDITOR’S NOTE: In his extremely informative talk Copy, Cut, Paste: How Everything Is A Copy From A Copy From A Copy…, Andy Baio details how the internet’s instant publishing is rubbing up against copyright infringement and is, sometimes unjustly, ruining the lives and futures of entrepreneurs.]

You overcome cognitive biases, mind gaps, contaminated information and heuristics by applying hypothetical thinking.

High-quality information on a particular subject tends to grow linearly, while the total amount of information on that same subject tends to grow as a cube. As more and more information accumulates, it becomes harder and harder to rummage through the useless, polluted information to locate the high-quality information, because it is unavoidably mixed in with crap. This will only get worse.
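The growth claim above can be sketched numerically: if high-quality information grows linearly while total information grows as a cube, the usable fraction collapses as 1/n². A minimal illustration (the growth constants are arbitrary assumptions, not figures from the lecture):

```python
# Sketch of the lecture's growth claim: high-quality information grows
# linearly while total information grows cubically, so the usable
# fraction shrinks as 1/n^2. Constants k and c are arbitrary assumptions.

def quality_fraction(n: int, k: float = 10.0, c: float = 1.0) -> float:
    """Fraction of all information that is high-quality after n steps."""
    high_quality = k * n      # linear growth
    total = c * n ** 3        # cubic growth
    return high_quality / total

for n in (10, 100, 1000):
    print(f"n={n}: {quality_fraction(n):.6f}")
```

With these toy constants the fraction falls from 0.1 at n = 10 to 0.00001 at n = 1000: the high-quality information is still growing, but the noise swamps it.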

Recall from the lecture Keys To Critical Thinking & Thinking About Dubious Claims that “regardless of how good your critical thinking framework is, garbage in is garbage out.” If we are constantly being inundated with bad information, how can you protect yourself by finding and identifying the good information among the contaminated?

The illusion of truth principle argues that mere exposure to contaminated information and statements such as ‘Global warming is a hoax’ increases your belief that they are true, even if you don’t actually believe them – or at least increases the possibility that the good information may in fact be contaminated information.

[EDITOR’S NOTE: For more information on the origins of global warming, refer to the mini-documentary The Story Of Stuff: How Our Modern Markets Economy Is Destroying Our Planet by Annie Leonard.]

This is because of the way your memory works: memory doesn’t work like a list of facts and dates – it’s a network of information. And things stimulated in a certain network explode and have a ripple effect on other related networks in your brain’s associations. So it turns out that simple repetition is strong enough to cast doubt on even the most certain of facts.

Fox News, for example, spends so much more time demonizing its ‘enemies’ than promoting its own ideals and policies – mud slinging and negative campaigning – that it’s no wonder anyone who spends half of their day listening to them will be persuaded to believe what they say. This is the danger of the illusion of truth and mere repetition.

[EDITOR’S NOTE: For more interesting information on how advertising creatives use the idea of exploding networks and the ripple effect in their brain to create advertising campaigns, browse through my interviews with several notable creative directors such as Rémi Noel, Eric Holden, Steven Brinlee, Rory Sutherland, Andrei Robu and Gregory Ferembach.]

00:06:28 The principle of charity is the idea that when you are attacking a claim or an opponent, you don’t attack your opponent’s argument at its worst; you attack his or her argument at its best. Aim to reformulate your opponent’s claim in the strongest way possible before you address it, attack it, or destroy it: give your opponent the benefit of the doubt by reframing his or her argument in the best, most logical, most correct-possible light before addressing it. Doing this not only increases your reputation for objectivity and fairness, it also takes away from the strength of any rebuttal your opponent may have to your logical argument.

If you’re going to criticize, be fair. Don’t attack a person’s argument at its worst; reformulate it in the strongest way possible and then address it.

[EDITOR’S NOTE: For more information on how to handle opponents and deal with opposition, read the books:

00:16:30 Recall that in the lecture Perceptual & Cognitive Biases – Fast & Slow Thinking, Karl Popper pointed out that one of our biggest weaknesses is that we always focus on what did happen and not on what didn’t happen; sometimes what didn’t happen is the most important part to think about. Just because your prediction came true doesn’t really show you much until you understand what the alternatives would have been.

00:18:51 Karl Popper’s framework for systematic analysis, summed up in 6 questions:

  1. What is the issue or question?
  2. What is the claim? [phrased in conditional form*]
  3. What reasons are offered to support the claim?
  4. How strong is the support?
  5. What would be adequate support?
  6. What reasons might create (false) beliefs in the claim?

The conditional format:

T: If (H & IC & AC), then (P)

Where:

  • T: theory (a description of a hypothetical system)
  • H: hypothesis, the claim that the theory (T) is true
  • IC: initial conditions for evaluating the claim
  • AC: auxiliary conditions that must hold true for the claimed outcome to occur
  • P: Predicted outcome given that the hypothesis is true and the initial and auxiliary conditions are met

Applying the above conditional format to the original example from the lecture Keys To Critical Thinking & Thinking About Dubious Claims:

Can a key be bent without physical force by an unknown psychic power?

  • T: metal can be bent by unknown psychic powers
  • H: This theory (T) is true of some individuals
  • IC: the unbent key
  • AC: the alleged psychic strokes the unbent key and wants the key to bend. The physical stroking is insufficient to bend the key, but the psychic power is enough.
  • P: The key will be bent
  • Proof offered to support the claim for (T): A bent key was displayed to the observers.
  • Ideal, ‘Adequate’ Proof: Clear evidence that the key had not been bent before the demonstration, and could not have been bent by physical force during the demonstration.
  • ‘Inadequate’ Support: Could the ‘psychic’ have bent the key while the marked, unbent key was in his possession? The key had been marked beforehand to preclude switching; however, it had been out of sight and in the demonstrator’s possession before it was apparently bent. Spotting this depends on knowing about the principle of leverage and realizing that the demonstrator had another key in his hand at the same time he had possession of the marked key. Observers were asked to touch the key to confirm it was a valid key, and to touch it in a way that would not physically bend it – which would have disqualified the marked, unbent key from being used as proof during the demonstration.
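The conditional schema and the key-bending example can be rendered as a small executable sketch. The class and field names here are my own invention, mirroring the T: If (H & IC & AC), then (P) format; this is an illustration of the structure, not anything from the lecture:

```python
from dataclasses import dataclass

@dataclass
class ConditionalClaim:
    """T: If (H & IC & AC), then (P)."""
    hypothesis: bool            # H: the theory is asserted to be true
    initial_conditions: bool    # IC: conditions for evaluating the claim
    auxiliary_conditions: bool  # AC: conditions required for the outcome

    def predicts(self) -> bool:
        """P is predicted only when H, IC and AC all hold."""
        return (self.hypothesis
                and self.initial_conditions
                and self.auxiliary_conditions)

# Key-bending example: unbent key (IC), psychic strokes it and wants it
# to bend (AC), under the hypothesis that psychic bending is real (H).
claim = ConditionalClaim(hypothesis=True,
                         initial_conditions=True,
                         auxiliary_conditions=True)
print(claim.predicts())   # schema predicts P: the key bends

# If IC fails (e.g. the key was out of sight, or already bent), then
# observing a bent key no longer tests H -- the 'inadequate support' case.
print(ConditionalClaim(True, False, True).predicts())
```

The point of the sketch is that the prediction P only counts as a test of H when IC and AC are independently verified; if any condition is unchecked, a successful outcome proves nothing about the hypothesis.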

[EDITOR’S NOTE: Three very good books I have personally read and would highly recommend are:

  1. Critical Thinking: A Beginner’s Guide by Sharon M. Kaye
  2. Do They Think You’re Stupid?: 100 Ways of Spotting Spin & Nonsense from the Media, Pundits & Politicians by Julian Baggini
  3. The Full Facts Book of Cold Reading by Ian Rowland]
