We can be fooled or manipulated by propaganda and pseudo-science, so Sagan’s kit is a brilliantly concise hit-list of how to test ideas and distinguish truth from lies. He lists twenty thinking traps to help spot the most common and dangerous fallacies. He breaks it down like this:
#1 Ad hominem – to attack the argument-maker and not the argument, e.g. “the Reverend Dr. Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously.”
#2 Argument from authority – to claim accuracy solely on the authority of the person making the claim, e.g. “President Richard Nixon should be re-elected because he has a secret plan to end the war in South-east Asia.” As the plan is secret, the argument cannot be tested, so you must trust him just because he is the President!
#3 Argument from adverse consequences – to judge a claim by the desirability of its results rather than by the evidence, e.g. “the defendant in a murder trial must be found guilty; otherwise it will encourage men to murder their wives.”
#4 Appeal to ignorance – the claim that whatever has not been proved false must be true, and vice versa. This is logically wrong: the absence of evidence is not evidence of absence.
#5 Special pleading – invoking an untestable exception, often to rescue a proposed idea that is becoming weak.
#6 Assuming the answer (begging the question) – e.g. “we must institute the death penalty to discourage violent crime.” But does the violent crime rate in fact fall when the death penalty is imposed?
#7 Observational selection – e.g. a state boasts of the Presidents it has produced, but is silent on its serial killers.
#8 Statistics of small numbers – drawing conclusions from inadequate sample sizes.
#9 Misunderstanding how statistics work – this can lead to some strange interpretations, e.g. some might be alarmed to read that half of all Americans have below average intelligence!
#10 Inconsistency – e.g. military expenditure is often based on worst-case scenarios, but requests for funding in other areas are dismissed because the need is not yet ‘proven’.
#11 Non sequitur – this means that the argument does not logically follow; the person making the argument has failed to see other possibilities or explanations.
#12 Confusing cause and effect – mistaking the fact that something happened after something else for its being a cause, e.g. “before women got the vote, there were no nuclear weapons.”
#13 Meaningless questions – e.g. “what happens when an irresistible force meets an immovable object?”
#14 Excluded middle – considering only the two extremes to make the other side look worse than it really is, e.g. “if you’re not part of the solution, you’re part of the problem.”
#15 Short-term v. long-term – e.g. “why pursue science when we have so huge an education budget deficit?”
#16 Slippery slope – unwarranted extrapolation of the effects, e.g. “give an inch and they will take a mile.”
#17 Confusion of correlation and causation – e.g. “Andean earthquakes are correlated with close approaches of Uranus, therefore the latter causes the former.”
#18 Straw Man – stereotyping to make it easier to attack, e.g. “environmentalists care more for snail darters and spotted owls than they do for people.”
#19 Suppressed evidence or half-truths – e.g. a prophecy of an assassination attempt is shown on television; but was it recorded before or after the event?
#20 Weasel words – for instance when euphemisms for war are employed to make it sound more palatable. Sagan quotes Talleyrand: “an important art of politicians is to find new names for institutions which under old names have become odious to the public.”
Sagan concludes that we must, of course, always apply the same critical thinking to our own arguments:
“The truth may be puzzling. It may take some work to grapple with. It may be counter-intuitive. It may contradict deeply held prejudices. It may not be consonant with what we desperately want to be true. But our preferences do not determine what’s true.”