9 important takeaways from this lecture:
00:00:40 Why is it that smart people can be so stupid? Being competent in one field of study in no way guarantees being competent in another field of study.
00:02:38 The Geller Effect is the situation in which a person, not knowing beforehand that an event is about to occur, will ascribe many ‘explanations,’ oftentimes completely unrelated, to justify the event after the fact.
00:17:30 When you think about any kind of claim, the most important issue to consider is the quality of the information and supporting data behind that claim. Garbage in; garbage out. It doesn’t matter how advanced your critical thinking skills are: if your original information is wrong, any conclusions you arrive at will be flawed. It is therefore imperative to make sure you’re starting with good data.
Good data is very, very hard to come across.
00:21:12 People tend not to be very good at collecting and describing data, especially if they are asked to do it on the spot and do not know in advance how the data will ultimately be used. This means people will vary widely in which data they focus on and in how they make sense of what they saw. How can you accurately distinguish relevant data from the irrelevant?
The Geller Effect shows the importance of scientific observation.
00:33:09 Scientific observation involves defining hypotheses before conducting studies and applying logic that will guide your research and postulations. Everything you do in science is probabilistic, yet the idea of probability has only been around for about 400 years.
00:34:00 Contaminated mindware is the term for the presence of incorrect or missing information. The framework for critical thinking is to remove all contaminated mindware.
To avoid information pollution and get trustworthy data, you have to plan your observations. Once you’ve planned them correctly, you can focus on what it is you’re looking for. If you don’t plan, you won’t know what to focus on, and so you’ll attribute answers and explanations to things that may have had no bearing on the end result.
00:38:40 “(Humans) have well-developed visual and other perceptual systems, complex motor skills and the ability to learn in many ways to adapt to the environment around us… We have a language system that is complex and sophisticated in its ability both to represent knowledge and to communicate with other humans; we study and attempt to understand a multitude of subjects including our own history and that of the universe; we have devised a system of mathematics and logic; we design and build a huge range of structures and artifacts; we have constructed and mostly live our lives within highly complex economic and social structures. All of these things imply an extraordinary ability to reason, entertain hypotheses and make decisions based upon complex mental simulations of future possibilities…
BUT, PARADOXICALLY, “…it is equally apparent that evidence of human error and fallibility surrounds us. The world is plagued by wars, famines and diseases that in many cases appear preventable. Stock markets collapse under panic selling when each individual acts to bring about the outcome that none of them wants. Doctors sometimes make disastrous misjudgments that result in the disability or death of their patients. Experts often fail to agree with each other and may be shown in hindsight to have made judgments that were both mistaken and overconfident. At the present time governments of the world are well informed about the likely progress of global warming and its consequences but seem to be making minimal progress in doing anything to prevent it. Criminal courts continue to convict the innocent and acquit the guilty, with alarming regularity…” – Excerpt by Jonathan Evans from Hypothetical Thinking: Dual Processes in Reasoning and Judgment (Essays in Cognitive Psychology)
00:45:00 Satisficing is the idea that, because of limited cognitive capacity, rather than trying to maximize the evidence for a decision, we do the best we can with a limited amount of data and make decisions based on the parts of the data currently available to us. We thus become cognitive misers: just as a miser hoards money and spends it grudgingly, we conserve our limited mental effort and spend it as sparingly as possible.
Heuristics and cognitive biases are mental shortcuts humans take because we cannot always cognitively handle all of the information needed to make a decision, nor do we have the time to do so. They aren’t necessarily bad, and in fact are usually very helpful, serving as a platform to automate and aid our reasoning.
00:47:00 Expertise involves automatic thinking: the autonomous mind, where you no longer have to consciously think about what you’re doing; you just do it.
Many experts don’t realize that the very intuitions which make them experts in their particular field are no longer relevant once they step outside their field of expertise.