Logic Tutorial, Part 3.
A List of Logical Fallacies

(c) 2017 by Barton Paul Levenson



One good way to get a feel for correct and incorrect use of the syllogism is to cite the major "logical fallacies"--various types of incorrect argument. I'm indebted to an economics textbook (Bach, 1966) for introducing me to this subject, and to a web site (Downes, 1996) for the classification scheme used here. Roughly, that is--I do list things a bit differently from Downes. For instance, I don't include "fallacy of hasty generalization" as a separate fallacy, since I think that in practice it is the same as "fallacy of generalizing from too small a sample." Downes lists 53 logical fallacies, and his is not the only way to list them. The number of fallacies cited in various texts ranges from a mere 17 in Copi and Cohen (1998) to 112+ in Fischer (1979).

To save space, I use some special terms in the "Translation" given for each example:

M: the major premise of the syllogism
m: the minor premise
C: the conclusion

I list no fewer than 52 types of logical error below, arranged in 13 different groups. This is a lot of logical fallacies to learn all at once. The list covers 53 double-spaced pages in MS Word. Treat this page as a reference book, a catalog. Come here when you run into a term like "ad hominem argument" or "straw man argument." If you like this kind of thing, read through it; if you don't, read the rest of the web site and only refer to this section when needed.

Note: I cite a lot of political arguments here, as they provide a wealth of easy examples. However, you cannot assume I agree or disagree with any of the positions taken. I am merely citing examples of bad arguments, whether on "my side" or "the other side." I try to balance it out and cite both.





Click below to go to any particular logical fallacy.



FALLACIES OF DISTRACTION
False dilemma (illegitimate use of "or")
Argument from ignorance ("argumentum ad ignorantiam")
Slippery slope
Complex question (illegitimate use of "and")

APPEALS TO MOTIVE
Appeal to force ("argumentum ad baculum")
Appeal to pity ("argumentum ad misericordiam")
Appeal to consequences ("argumentum ad consequentiam")
Prejudicial language
Appeal to popularity ("argumentum ad populum")

CHANGING THE SUBJECT
Attacking the person ("argumentum ad hominem")
Fallacious appeal to authority ("argumentum ad verecundiam")
Anonymous authority
Style over substance

INDUCTIVE FALLACIES
Unrepresentative Sample (sampling error, generalizing from too small a sample)
False Analogy
Slothful Induction
Exclusion

STATISTICAL FALLACIES
Accident (Missing an exception, "dicto simpliciter ad dictum secundum quid")
Converse Accident (Using an exception where one is not justified)

CAUSAL FALLACIES
Post Hoc, Ergo Propter Hoc ("after this, therefore because of it")
Joint Effect
Insignificant Effect
Wrong Direction
Complex Cause

MISSING THE POINT
Begging the Question (arguing in a circle, circular argument, tautology, "argumentum in circulando," "petitio principii")
Irrelevant Conclusion ("ignoratio elenchi")
Straw man

FALLACIES OF AMBIGUITY
Equivocation
Amphiboly
Accent

CATEGORY FALLACIES
Composition
Division

NON SEQUITURS
Affirming the Consequent
Denying the Antecedent
Inconsistency

SYLLOGISTIC ERRORS
Four Terms ("quaternio terminorum")
Undistributed Middle
Illicit Major (illicit process of the major term)
Illicit Minor (illicit process of the minor term)
Exclusive Premises
Drawing an Affirmative Conclusion from a Negative Premise
Existential Fallacy

FALLACIES OF EXPLANATION
Subverted Support
Non-Support
Untestability
Limited Scope
Limited Depth

FALLACIES OF DEFINITION
Too Broad
Too Narrow
Failure to Elucidate
Circular Definition
Conflicting Conditions




FALLACIES OF DISTRACTION

These are fallacies which bring up an irrelevant issue to conceal a poor argument. The actual point is never discussed.



1. False dilemma (illegitimate use of "or"). The arguer fails to admit alternatives. When the number of choices given is two, this is called "the fallacy of bifurcation" or "the unnecessary either-or choice."

Example: If you get rid of nuclear power, you're going to wind up with coal-fired power plants stinking up the air and increasing the greenhouse effect.

Translation:
M: Nuclear and coal are our only energy alternatives.
m: If you don't want nuclear,
C: You'll be stuck with coal.

Counter: Show, with an example, that other options also exist. In any logical argument you suspect to be invalid, watch for unstated premises. An argument can lead to a false conclusion if the unstated major premise is false. In real life, if someone made the argument above, you could make things clearer by asking, "Aren't you assuming it can only be one or the other?" In this case, possible alternatives might be fusion, oil, natural gas, biomass, wind, solar, geothermal, ocean thermal, tidal, wave, animal muscle power, human muscle power, or doing without. Many of these might be harder to live with than coal or nuclear, but they are alternatives.

Note: If the options really are limited to those given in the argument, the major premise is true and the argument is not a fallacy. The fact that false dilemmas exist does not mean there are no real dilemmas.



2. Argument from ignorance ("argumentum ad ignorantiam"). Assuming that because something hasn't been proven false, it must be true.

Example: You can't prove UFOs aren't flying saucers. That's what they are!

Translation:
M: If you can't prove UFOs aren't flying saucers, they must be flying saucers.
m: You can't prove UFOs aren't flying saucers.
C: They are!

Counter: Show that the (unstated!) major premise is false. Point out that other alternatives exist.

Note: It is true that something which hasn't been proved not to exist may exist. As Carl Sagan said, "absence of evidence is not evidence of absence."



3. Slippery slope. Saying without proof that one thing alleged to be bad will lead to other bad things. I will give both conservative and liberal examples, because I don't want to alienate readers of either stance. This is not a "liberal" book or a "conservative" book. Never stop reading something solely because you don't share the author's views. You might miss something important.

Example: If we allow abortion, it's only a matter of time before infanticide is allowed as well.

Translation:
M: Legalizing abortion leads to legalizing infanticide.
m: If we allow abortion,
C: it's only a matter of time before infanticide is allowed as well.

Example: If we allow capital punishment for murder, it won't be long before we have it for jaywalking, too.

Translation:
M: Allowing capital punishment for one crime will lead to having it for other crimes.
m: If we allow capital punishment for murder,
C: it won't be long before we have it for jaywalking, too.

Counter: Show that there is no reason to assume the connection the arguer assumes.

The (unstated!) major premises in each of the cases above can be rephrased as the more general claim, "allowing a policy I think is bad will lead to a policy you agree is bad." Of course, if the arguer can show such a connection, this may not be a fallacy.



4. Complex question (illegitimate use of "and"). Linking two or more different questions and forcing the reader to accept one if he accepts the other.

Example: If you support freedom and the right to bear arms, you ought to oppose so-called "gun control."

Translation:
M: If you support freedom and the right to bear arms, you ought to oppose so-called "gun control."
m: You say you support freedom.
C: You ought to be against gun control.

Counter: Show that two (or more) separate questions are involved.

Note the incomplete minor premise here. Someone may define freedom in such a way as to exclude the right to bear arms. Whether the reader "supports freedom" and "supports the right to bear arms" are separate questions. It may be that the right to bear arms is logically included in support for freedom, but this argument does nothing to prove that.

Note: If it can be shown that the two things really should go together, the argument may be valid.



APPEALS TO MOTIVE



5. Appeal to force ("argumentum ad baculum"). The reader is threatened to make him or her agree.

Example: If you oppose Marxism-Leninism, you will be liquidated when the revolution comes.

Translation:
M: Those who oppose Marxism-Leninism will be liquidated when the revolution comes.
m: If you oppose Marxism-Leninism,
C: you will be liquidated when the revolution comes.

Counter: Point out that threatening someone into agreement doesn't make the conclusion true. "A man convinced against his will is of the same opinion still." Notice that in the example, the validity of the Marxist-Leninist analysis is not proved or even mentioned.

Note: Despite its total logical irrelevance, a threat may have great practical importance. There are times when it may be a good tactic to accede to force even though your real opinion of the subject stays the same. If someone points a loaded gun at you and says, "The FBI is controlling my mind with radio signals!" it may not be wise to accuse them of having delusions.



6. Appeal to pity ("argumentum ad misericordiam"). Trying to engage the reader's sympathy instead of trying to prove the point under discussion.

Example: Like Galileo, Velikovsky has been derided, persecuted and vilified by the scientific establishment.

Translation:
M: People who have been persecuted are right.
m: Velikovsky was persecuted.
C: Velikovsky is right.

Counter: Point out that the arguer being pitiable has nothing to do with whether the point is true or false. In this case, whether Velikovsky is right or not must be decided on the basis of how well the evidence supports his theories. Being persecuted is not enough; Galileo was persecuted, but so were people whose theories turned out to be wrong (e.g. Lamarck).

Note: The pitiable state of the arguer is relevant if that's what the argument is about.



7. Appeal to consequences ("argumentum ad consequentiam"). Saying an argument is false because if it were true it would be terrible.

Example: We must be able to travel faster than light, because if we can't, it'll take years to travel between even the closest stars, and the Universe will be a cold and lonely place.

Translation:
M: Natural laws which depress us can't be true.
m: The speed-of-light limit is very depressing for us space-minded people.
C: It's not true!

Counter: Point out that the Universe doesn't have to conform to our wishes. (Note the unstated major premise.)



8. Prejudicial language. Slanting the presentation to engage the reader's sympathy.

Example: The President came out with another fascist tax proposal today.

Translation:
M: We who hate fascism will not support fascist policies.
m: The President proposed a fascist policy today.
C: We shouldn't support it.

Counter: Point out that the statement could have been worded differently, and that the language actually used is prejudicial.

In trying to convince people who disagree on the point in question, these arguments are usually counterproductive. But they are often used to "play to the audience" in formal debates. This can be a real asset in, say, a court of law.



9. Appeal to popularity ("argumentum ad populum"). Assuming something is true because most people think so.

Example: Everybody knows capital punishment is a deterrent.

Translation:
M: Things everybody knows are true really are true.
m: Everybody knows capital punishment is a deterrent.
C: It must really be one.

Counter: Point out that majority vote can't change the truth.

Actually looking at the evidence on controversial questions often gives surprising results. It seems very logical that capital punishment should deter crime--most people would avoid doing something if they knew they'd be killed for doing it! But in order to prove it you'd have to do some kind of test, like a statistical analysis.

Note: There may be other reasons to support capital punishment whether it deters or not.



CHANGING THE SUBJECT



10. Attacking the person ("argumentum ad hominem"). Saying an arguer should be disbelieved because of the sort of person he or she is.

Example: You can't believe anything Sharon says. She's got a long history of stealing things and lying about it.

Translation:
M: Anyone with a long history of stealing things and lying about it can never be right.
m: Sharon has such a history.
C: You can't believe anything she says.

Example: There's no point in listening to Exxon Mobil's ads about how global warming isn't proven. They obviously don't want any laws to pass which could restrict the demand for oil.

Example: There's no point in listening to climatologists when they yammer about global warming. They obviously want their field to be a hot field so they can keep getting fat government grants.

Translation:
M: If someone wants something to be true, it can't actually be true.
m: Exxon Mobil wants global warming not to exist (or, Climatologists want global warming to exist).
C: They can't be right.

One last example gives the special case known as a "tu quoque."

Example: Doctor Greenleaf keeps telling me smoking is going to kill me, but I happened to notice he smokes three packs a day himself. So, excuse me very much, but I think I can ignore that particular bit of advice.

Translation:
M: People who don't practice what they preach can be safely ignored.
m: Doctor Greenleaf is like that.
C: I can safely ignore him.

Counter: Point out that in logical argument it is always the truth and logic of the argument that count and not the person doing the arguing. An awful person or a stupid person might still be right on the point in question.

Sharon's logic might be good and her premises true in the current argument even if she can't be trusted in some situations. Exxon Mobil, or climatologists, might be right even though they want their thesis to be true. A habitual liar who says the French Revolution was in 1789 is still right about the French Revolution.

Note how the last example (the chain-smoking Dr. Greenleaf) is, in a way, almost the opposite of the other two examples. In the first two, the arguer didn't believe the thesis because the people who stated it wanted to believe it. Here, the arguer doesn't believe it because the person stating the thesis acts like he doesn't believe it! But in all three cases, the motives or behavior of the people behind the ideas say nothing about whether the ideas are true or not.

In summary, even if someone has strong reasons for lying in a given situation, that does not, by itself, prove they are lying now.

I suspect that the vast majority of political arguments are ad hominem arguments. Remember this fallacy the next time you watch a campaign ad.



11. Fallacious appeal to authority ("argumentum ad verecundiam"). Quoting someone as an authority when their qualifications don't matter.

Example: Pat Robertson said there are nuclear missiles in Cuba.

Translation:
M: Things Pat Robertson says are true.
m: He says there are nuclear missiles in Cuba.
C: There are nuclear missiles in Cuba.

Counter: Point out that A) the person cited is not an authority on this particular subject, or that B) other experts disagree (the issue isn't settled), or that C) the authority cited didn't mean what he/she said--was being sarcastic, or is being quoted out of context, etc.

Pat Robertson may be an expert on Christian theology, or political science, or, for all I know, raising hothouse roses. But there is no evidence that he works in military intelligence or has a source of top-secret data. So the major premise is not necessarily true, or not always true.

Note: Even if the authority wasn't qualified to make that particular statement, the thing stated may turn out to be true anyway.



12. Anonymous authority. The actual identity of the authority is never given.

Example: Psychologists have proved that all human behavior is based on sex.

Translation:
M: Things psychologists say are true.
m: Psychologists say human behavior is based on sex.
C: Human behavior is based on sex.

Counter: Point out that the arguer hasn't cited a specific authority, and that vague appeals to vaguely defined authorities don't prove anything. For this example, "psychologists" in general don't speak with a unified voice. There was one school, the Freudian, that advanced this particular premise, but nowadays most psychologists don't take strict Freudianism very seriously. One problem was that the way it was phrased made it untestable and therefore unverifiable (see fallacy #45 below). It could be made to fit any evidence, and therefore had no useful logical content. As Piet Hein put it:

"Everything's either concave or 'vex,
So whatever you dreamed, it was something with sex."

Note: Even if the authority cited doesn't really count as an authority, the conclusion might be valid because of other evidence. Countering a fallacy like this only removes a false issue; it doesn't settle the question.



13. Style over substance. The way an argument is phrased is held to affect its validity.

Example: "John was saying Fords are unsafe, but I saw the page about it on the Ford website--they laid out the case pretty persuasively, with pictures and everything. John's just wrong."

Translation:

M: A persuasive argument, laid out with pictures and everything, must be right.
m: Ford so argues that its cars are safe.
C: Its cars must be safe.

Counter: Point out that the style of the argument doesn't affect its validity; it may be false even though well-presented, or true even though amateurishly or abusively presented. Note, though, that even though a persuasive style and good graphics don't prove your case, it's still better to make your argument persuasive and to use graphics if graphics will help. Recall the ancient and medieval distinction between rhetoric and logic: your argument must be logical to convince your reader's intellect, but it must also be persuasive to capture your reader's feelings. A person who knows the right thing to do might still be apathetic about doing it.



INDUCTIVE FALLACIES

These are fallacies which incorrectly go from specifics to general rules.



14. Unrepresentative Sample (sampling error, generalizing from too small a sample). The sample doesn't accurately represent the population under discussion; it is too small to prove what is claimed. This is one of the most common fallacies.

Example: "Oh, don't go in there! I bought a fruit pie on sale there once and it was moldy! All their stuff is moldy!"

Translation:
M: All food from stores which have sold a moldy item is moldy.
m: I bought a fruit pie on sale there one day, and it was moldy!
C: All their stuff is moldy!

Counter: Show that something happening once, or even several times, isn't usually enough to prove a general principle. In this case, note the (unspoken) false major premise. The argument might have been reasonable if the major premise had been, "All stores that allow moldy food to be sold are careless with food quality and therefore potentially dangerous," and the conclusion, "They might be careless about food safety in there." The "All" in the original major premise is what creates the overgeneralization.

There are much more serious examples of sampling error millions of people believe:

"The guy who mugged me was black. It's always some black guy, isn't it?"

"Goddam Jew cheated me. They're all cheats."

"Guy looked me straight in the eye and said, 'We don't rent to your sort of people.' White folks are just out to get black folks, that's all there is to it."



15. False Analogy. The things being compared aren't really similar.

Example: Workers are like nails. You have to hit them in the head, hard, to make them work.

Translation:
M: Things that are like nails have to be hit in the head, hard, to get them to work.
m: Workers are like nails.
C: You have to hit workers in the head, hard, to get them to work.

Counter: Point out that the cases aren't really parallel.

For this example, billions of workers around the world seem to be able to perform their jobs without being smacked over the head, so clearly the analogy here is a poor one. Those who believe this kind of thing don't usually put it so clearly. Tragically, though, millions of employers, pimps and slave-owners around the world believe just this.

Note: Analogies aren't always false analogies. Analogies can be good or bad. Consider Forrest Gump's "Life is like a box of chocolates--you never know what you're going to get." This is a good analogy in that events are just as unpredictable as the centers of assorted chocolates--assuming the box doesn't have the location of each type marked inside the box top. It's a bad analogy because not everything we get from life is sweet, or satisfying.

The way to determine whether an analogy is good or bad is to think it through and see whether the cases are parallel or not. For example--"If you don't let out your anger it will build up until you explode, just like a balloon." This was considered good pop psychology for a long time. But some people--for example, monks and nuns, whether Christian or Buddhist--spend years learning to suppress feelings they consider negative. Are monks and nuns more likely than others to suddenly explode in anger, machine-gunning dozens from the top of an office building? I don't mean the nun who was always whacking your fingers in parochial school; I mean a full-time contemplative.



16. Slothful Induction. A conclusion is denied when the evidence obviously points to it.

Example: "I know I've had ten car crashes in the last three months. I tell you, it's not my fault!"

Translation:
M: A lot of crashes where the only common factor is that I'm driving don't prove that my driving is bad.
m: I've had a lot of crashes.
C: That doesn't prove my driving is bad.

Counter: Point out that the evidence is pretty overwhelming that the person is wrong.

As with the post hoc ergo propter hoc fallacy (#20 below), the arguer's objection is technically true. It might just be that he has had terribly bad luck, or that enemy spies are targeting him to make him lose his license. It's just very unlikely.



17. Exclusion. The argument failed to note evidence which would change the outcome.

Example: The Steelers won five out of the last six games. They're likely to win this one.

Translation:
M: If a team has won most of its last few games, it will probably win the next one.
m: The Steelers did just that.
C: They're likely to win this game.

Counter: Point out that the arguer is leaving out a crucial fact. Here the crucial fact left out might be that the last five games were against crummy, chronically losing teams, on Steeler home turf, and that the upcoming game is a game away from home against a team which won 20 out of the last 21 games.

Note: If the cases really are parallel, this fallacy doesn't apply. Make sure the objection you're raising is really relevant.



STATISTICAL FALLACIES

These are fallacies which mishandle the relationship between general rules and particular cases.



18. Accident (Missing an exception, "dicto simpliciter ad dictum secundum quid"). A generalization is used when evidence implies there should be an exception.

Example: "Every bird I've ever seen could fly, so this one should darn well be able to."

Translation:
M: All the birds I've ever seen could fly.
m: This is a bird.
C: It should be able to fly.

Counter: Point out that there are exceptions. The major premise isn't really relevant to what the arguer wants to prove, so his conclusion is a non sequitur. Ostriches and emus are birds, but they can't fly.

Note: If the arguer really has seen every relevant case, his argument may not be fallacious.



19. Converse Accident (Using an exception where one is not justified). An exception is claimed without justification.

Example: You let Bob take a make-up test, just because he missed the final the day his roof caved in. You should let everyone take a make-up test.

Translation:
M: If you allow something in one case, you should allow it in all cases.
m: You let Bob take a make-up test when he missed the final, just because he was buried under tons of plaster and wood that day.
C: You should let everyone take a make-up test.

Counter: Show that this case really is an exception, and can't be extended to all the cases.

This fallacy is similar to #14 (generalizing from too small a sample), but it is a specific enough variety of that error that I thought it rated its own entry.



CAUSAL FALLACIES

These are fallacies which mix up cause and effect.



20. Post Hoc, Ergo Propter Hoc ("after this, therefore because of it"). Because something happened after something else, the something else is assumed to have caused it.

Example: "I saw Jimmy go into Susan's office; and when she came back later, her purse was gone. He must have stolen it."

Translation:
M: An event, A, which happens just before another event, B, must have caused event B.
m: I saw Jimmy go into Susan's office; and when she came back later, her purse was gone.
C: Jimmy stole Susan's purse.

Counter: Point out that just because one event followed another, it doesn't mean the first event caused it.

Obviously Jimmy might or might not have stolen Susan's purse. His going into her office before she found that her purse was missing only means that he could have stolen it, not that he definitely did. Someone else might have stolen it before Jimmy was there, or Susan might have left it in the women's rest room.

Mark Twain gave a more obvious example of this fallacy. "I fought with the Confederate Army for two weeks. Then I deserted. The Confederacy fell."

Note: One coincidence of this sort proves nothing; but if the same events keep happening in the same order, it strengthens the impression that there could be a causal connection. Economic forecasting is pretty much based on this sort of analysis--if you increase the size of the money supply on forty separate occasions and inflation follows each occasion in proportion, the increase in money could be causing the rising prices.
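
As a sketch of what such a test might look like, here is a short Python example. The forty money-supply increases and the proportionality factor are made up for illustration; a correlation near 1 supports, but still does not by itself prove, a causal connection.

    import random

    random.seed(1)
    # Forty hypothetical money-supply increases (percent) and the inflation that followed,
    # generated to be roughly proportional plus a little noise.
    money_growth = [random.uniform(1.0, 10.0) for _ in range(40)]
    inflation = [0.8 * m + random.gauss(0.0, 0.5) for m in money_growth]

    def pearson(xs, ys):
        # Pearson correlation coefficient: +1 means a perfectly proportional relationship.
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sx = sum((x - mean_x) ** 2 for x in xs) ** 0.5
        sy = sum((y - mean_y) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    print(round(pearson(money_growth, inflation), 2))  # near 1.0 for these invented numbers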



21. Joint Effect. One thing is held to cause another when something else really causes both.

Example: "We're experiencing this high inflation because exchange rates for the dollar have been rising so fast."

Translation:
M: Rising exchange rates cause inflation.
m: Exchange rates are rising.
C: That's why we're having inflation.

Counter: Point out that the thing held to be a cause is actually an effect.

This is a specific type of fallacy #20, post hoc ergo propter hoc, defined not only by a certain time sequence but by a specific set of circumstances: two things caused by a third thing, where one of the two effects is mistakenly held to be the cause. In this particular case, the government inflating the money supply may cause both the rising exchange rates and the price inflation. That's the typical order of these three phenomena--first the money supply changes, then exchange rates, then price levels.



22. Insignificant Effect. Something really does cause the thing under discussion, but its effect is trivial compared to the effect of something else.

Example: "Last night the house next door was on fire. I fired my water pistol at it six times, and a couple of hours later it was out."

Translation:
M: Water puts out fires.
m: I fired my water pistol into the fire next door.
C: That was what extinguished the fire.

Counter: Point out that most of the effect is due to something else. The kid's water pistol may well have put out some tiny spot of the fire, but most of it was probably due to the major dousing from fire hoses when the firefighters arrived.



23. Wrong Direction. Cause and effect are reversed.

Example: "I have looked at three such cases with my new microscope, and each sick person's saliva contained germs--plainly germs are caused by disease."

Translation:
M: A thing seen in cases with a similar factor must be caused by that factor.
m: Each examination I've made of sick people found germs.
C: Sickness causes germs.

Counter: Show that the person has confused the cause and the effect.

Note: There is a phenomenon called "secondary infection," in which a person's weakened state allows germs to flourish that the body's defenses might otherwise have killed. But in the vast majority of cases, it's the germs that cause the illness, not the other way around.

This very fallacy recently cost many lives. Some people argued that HIV infection was an effect, not a cause, of AIDS. The government of South Africa went easy on efforts to control HIV for a while because of this reasoning, though they have since changed their policy.



24. Complex Cause. Something is held to be the sole cause of an effect when in reality it is only part of the cause.

Example: "Look, I crashed the car because the road was wet."

Translation:
M: A wet road will inevitably cause a car crash.
m: The road was wet.
C: That's why the car crashed.

Counter: Show that other causes were also needed. Here, of course, the major premise is false. Maybe the road was wet, but 3,000 other cars traversed it safely, while in the case of the speaker, he was going 85 miles per hour, dead drunk. The driver's being bombed and driving too fast were also necessary for the crash to take place.



MISSING THE POINT



25. Begging the Question (arguing in a circle, circular argument, tautology, "argumentum in circulando," "petitio principii"). The desired conclusion is used as a premise of the argument.

Example: Of course Joe Bupkes is a famous scientist! A famous scientist wouldn't lie about something like that.

Translation:
M: Famous scientists never lie about who they are.
m: Joe says he's a famous scientist.
C: He can't be lying.

Counter: Show that the arguer is assuming what he's trying to prove. This example is presumably about whether or not Joe is a famous scientist. To use the point at issue ("Famous scientists don't lie") as one of the premises is to beg the question. Even if the statement is true and Joe really is a famous scientist, you can't prove it just because he says he is.



26. Irrelevant Conclusion ("ignoratio elenchi"). An argument said to prove one thing actually proves another.

Example: You have to support the new housing bill! You can't have people living in the street!

Translation:
M: Homelessness ought to be stopped.
m: The only way to stop it is with the new housing bill.
C: You should support the new housing bill.

Counter: Show that the arguer hasn't proved what he wanted to prove. Whether the major premise here is right or not--I think it is--the minor premise is clearly debatable.

Note: This example could have been phrased so as to illustrate "non sequitur" (see below) or "fallacy of bifurcation," but it also fits "irrelevant conclusion," so what the heck.



27. Straw man. The arguer is attacking something other than what you said, because that version is easier to defeat.

Example: "People don't like the draft because it inconveniences them. Well, excuse me, but there are more important things in life than convenience."

Translation:
M: We should not oppose national policies for trivial reasons.
m: People oppose the draft for a trivial reason.
C: We should not oppose the draft.

Counter: Point out that the arguer is not addressing the other side's actual argument. For this example, people might oppose the draft because they see it as a form of legalized slavery, an intolerable infringement of basic human rights. Or they might oppose it because they are philosophical pacifists and don't think there should be armed forces at all. But these stances can only be argued against honestly if they are addressed directly, rather than by distracting the audience with a straw man argument.

Note: There might be good arguments for a draft, but of course the example above isn't one of them.



FALLACIES OF AMBIGUITY

These are fallacies which turn on the misuse of words.



28. Equivocation. Changing the meaning of a word or phrase in mid-argument to prove the arguer right.

Example: "You're supposed to fight for what you believe in. That's why I punched that guy when he said he was a Republican."

Translation:
M: We should fight for what we believe in.
m: The only way to fight is with sudden, irrational violence.
C: I was right to punch someone from a political party I dislike.

Counter: Point out that one term is being used to mean two different things. In this case, to "fight for" something is being narrowed down to mean a physical fight, whereas in popular discourse it means any determined course of action. A wounded man might be fighting hard for life. Dr. M.L. King, Jr. fought hard for civil rights for black people, but his methods were entirely pacifist.

Note: Be sure to agree on terms before the argument begins. This is why high school and college debaters so often demand that their opponents "define their terms." If both sides don't agree on what a key word means, they can argue in the dark for hours and never understand why they aren't communicating.



29. Amphiboly. A sentence is used whose grammar allows more than one meaning.

Example: "Last night I shot an elephant in my pajamas. What it was doing in my pajamas I'll never know." -Groucho Marx.

Translation:
M: An adjectival phrase can apply to either the subject or the object of a sentence.
m: Last night I shot an elephant in my pajamas.
C: The elephant was wearing my pajamas.

Counter: Point out that the sentence is ambiguous and ask for clarification.

According to Herodotus, the Delphic Oracle told King Croesus that attacking a neighboring country would cause the fall of a mighty kingdom. Croesus started the war, but lost badly. The Oracle then said, in effect, "I didn't say which kingdom."



30. Accent. The meaning of a statement changes, sometimes to the opposite of what was intended, depending on which words are stressed or how the sentence is punctuated. This fallacy, originally described by Aristotle, applies much more often in Greek than in English, but it can show up in English, too.

Example. "King Charles walked and talked... half an hour after his head was cut off."

M: "After" implies the second clause happened later.
m: King Charles walked and talked... half an hour after his head was cut off.
C: The king had amazing stamina.

Discussion: "After" to mean "later" is an older English usage--British English as opposed to American English. The meaning of the sentence depends on how it is pronounced. The correct meaning can be seen if you add a single comma: "King Charles walked and talked... Half an hour after, his head was cut off."

Note: It is really, really hard to find examples of this fallacy in English. My wife, an English Lit major, remembered this one.



CATEGORY FALLACIES

These are fallacies which involve taking some bit of evidence to apply where it doesn't really apply.



31. Composition. A property of a part of something, or one element in a collection, is wrongly assumed to apply to the whole thing.

Example: I've seen photos of her apartment. The rooms are pretty big. They must be in a big building.

Translation:
M: Big apartments are only found in big apartment buildings.
m: The rooms in her apartment are big.
C: She must live in a big building.

Counter: Show that the whole thing doesn't have to have a property just because one of its parts has it.

Like many logical fallacies, this one involves jumping to a conclusion. In the example, the unstated major premise is false, so the argument establishes nothing about the building. Perhaps the apartments in the building are big, but perhaps the building only contains ten apartments, or four, or two. The building might be dwarfed by conventional high-rise apartment buildings on either side of it.

Notice that leaving out a premise can make a bad argument sound better--and watch for missing premises when you hear or read arguments.

Note: If the apartment is sufficiently huge, the building it is in would have to be fairly big just to accommodate that one apartment, even if there were only, say, four apartments in the building.



32. Division. A property of a collection or compound object is wrongly assumed to apply to each of its parts.

Example: What a tiny little restaurant! I'll bet they serve everything on tiny, cute little platters.

Translation:
M: Restaurants housed in small buildings serve things on small dishes.
m: This is a tiny little restaurant.
C: They serve everything on tiny, cute little platters.

Counter: Show that the properties of the large object don't necessarily show up in its parts. Here, the Texan owner of the little steakhouse might only have been able to afford this relatively small building--but he might serve huge cuts of steak on big serving platters. The building being small doesn't prove that the portions will be small.

Note: If the restaurant is a doll's house, then you can probably safely assume the dishes will be tiny to fit into it.



NON SEQUITURS

"Non sequitur" means "it does not follow" in Latin. Errors in this class all involve cases where the conclusion "doesn't follow" from the premises, meaning it has no relation to them.



33. Affirming the Consequent. "If A, then B. B, therefore A." This is the fallacy that says that if one thing causes an effect, the existence of that effect means the thing cited must have caused it.

Example: "The President said he'd lower unemployment. Unemployment is lower, so the President's plan obviously worked."

Translation:
M: Only the President's plan can lower unemployment.
m: Unemployment is lower.
C: The President's plan can be thanked for it.

Counter: Point out that other things might also cause the effect under discussion.

The fallacy in the example lies in the word "Only," which makes the major premise false. Perhaps the President has been in office only two months, and nothing in his program has passed Congress, but real economic growth picked up sharply because of something the last administration did two years ago.

Note: In those rare cases where the major premise is true, the fallacy disappears and the argument is valid.
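
Because this form involves only two propositions, you can check it exhaustively. Here is a minimal Python sketch that brute-forces the truth table for "If A, then B; B; therefore A" and prints the row that breaks it.

    from itertools import product

    # A valid form would have no row where both premises are true but the conclusion is false.
    for A, B in product([True, False], repeat=2):
        premise_1 = (not A) or B   # "If A, then B"
        premise_2 = B              # "B"
        conclusion = A             # "therefore A"
        if premise_1 and premise_2 and not conclusion:
            print("Counterexample: A =", A, "B =", B)
    # Prints the row A=False, B=True: the premises hold but the conclusion fails.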



34. Denying the Antecedent. If A, then B. Not A, therefore not B. This is the opposite error from #33 above.

Example: "If you are smothered with a pillow as a baby you'll die young. But you weren't smothered with a pillow when you were a baby, so you won't die young."

Translation:
M: Only smothering with a pillow as a baby can cause someone to die young.
m: You weren't.
C: You won't die young.

Counter: Show that the effect under discussion might still arise from another cause.

In the example, the fallacy is again the "Only" which makes the major premise false. In cases where the major premise is true the syllogism is valid. For instance, "Only if you are born to a Jewish mother, adopted by a Jewish family and raised Jewish, or convert later in life are you Jewish. You had none of those things happen, so you aren't Jewish." (Some Orthodox Jewish theorists would reject the adoption possibility, insisting on a definite conversion process including circumcision and so on.)
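
The same brute-force check works here; only the premise and conclusion lines change. This Python sketch finds the row that sinks "If A, then B; not A; therefore not B."

    from itertools import product

    for A, B in product([True, False], repeat=2):
        premise_1 = (not A) or B   # "If A, then B"
        premise_2 = not A          # "not A"
        conclusion = not B         # "therefore not B"
        if premise_1 and premise_2 and not conclusion:
            print("Counterexample: A =", A, "B =", B)
    # Again the row A=False, B=True slips through: both premises true, conclusion false.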



35. Inconsistency. One or more premises are mutually contradictory.

Example: Jack is taller than Bob, and Bob is taller than Jack.

Translation:
M: Jack is taller than Bob.
m: Bob is taller than Jack.
C: Each is taller than the other.

Counter: Show that both propositions can't be true. For this specific example, one could counter the arguer by explaining clearly what "taller" means. More generally, one must present a clear argument that the propositions contradict one another.



SYLLOGISTIC ERRORS



36. Four Terms ("quaternio terminorum"). The syllogism uses four different terms, when a valid syllogism may use only three.

Example: "All dogs are animals, and all cats are mammals, so all dogs are mammals."

Translation:
M: All dogs are animals.
m: All cats are mammals.
C: All dogs are mammals.

Counter: Point out that an unstated additional premise ("all dogs are cats") is needed for the argument to be valid. There are four terms under discussion here, dogs, cats, animals and mammals. A valid syllogism should have only three.

Downes notes that a special case of this fallacy happens when two of the terms under discussion are the same word. He gives the example, "Only men are born free, no women are men, so no women are born free." The four terms here are "men" (in the old sense in which it was used to stand for "humanity"), "women," "men" (in the modern sense of a particular sex) and "free."



37. Undistributed Middle. Two separate categories are said to be connected because they share a common property. The name comes from the fact that when an argument of this sort is written out as a syllogism, the middle term--the one shared by both premises--is never "distributed": neither premise says something about every member of that category.

Example: Men always like cheese! Fievel, my tiny friend, you like cheese. You must be a man!

Translation:
M: All men like cheese.
m: Fievel the mouse likes cheese.
C: Fievel the mouse is a man.

Counter: Show that restricting the quality in question to only the group cited is incorrect; other groups may have the same trait.

The logic in this example is invalid because it ignores the possibility that other creatures besides men might like cheese. To make the logic airtight, the major premise should have read "Only men like cheese," or "Anything that likes cheese is a man." Then the untruth of the major premise would be obvious.

Note: My example for this one was a bit silly, but here's an example many Americans took seriously during the 1950s:

M: Communists want us to reduce the defense budget.
m: President Eisenhower wants to reduce the defense budget.
C: President Eisenhower is a Communist.

The middle is "undistributed" because there might be other people than Communists who want defense appropriations reduced.

A less scary example: "Necessities of life shouldn't be taxed, so I don't think they should tax Twinkies." The unstated minor premise is that Twinkies are a necessity of life. It might be true that Twinkies should be exempt from sales tax; it's just that the argument above is a crummy way to prove it. No pun intended, of course.



38. Illicit Major (illicit process of the major term). The predicate of the conclusion talks about all of something, but the premises only mention some cases of the term in the predicate.

Example: "All Ontarians are Canadians, and no Québécois is Ontarian, so no Québécois is Canadian."

Translation:
M: All Ontarians are Canadians.
m: No Québécois is Ontarian.
C: No Québécois is Canadian.

Counter: Show that the predicate term covers all of its class in the conclusion but only part of that class in the premises. Here, "Canadian" in the conclusion refers to all Canadians, the conclusion being that none of them are from Quebec. But in the major premise, "Canadians" refers only to some Canadians, those living in Ontario.

The new term "predicate" may be obscure; it's the verb and object of a sentence in English. The other part is the "subject." In the sentence, "No Québécois is Canadian," "Québécois" is the subject and "is Canadian" is the predicate.



39. Illicit Minor (illicit process of the minor term). The subject of the conclusion talks about all of something, but the premises only mention some members of that group.

Example: All best-selling authors are financially successful, and all best-selling authors are over thirty, so everyone who is over thirty is financially successful.

Translation:
M: All best-selling authors are financially successful.
m: All best-selling authors are over thirty.
C: All people older than thirty are financially successful.

Counter: Show that the conclusion refers to all of a group while one or more premises refers to only part of that group. The subject of the conclusion is "All people older than thirty." But the subject of the minor premise--All best-selling authors--only refers to some people in that age range. Of course there are many people in that age range who are not financial successes; thus the fallacy.



40. Exclusive Premises. A syllogism has two negative premises. A "negative premise" is a statement of the form "No X are Y" or "Some X is not Y."

Example: No Parisians are Americans, and no Americans are French, so no Parisians are French.

Translation:
M: No Parisians are Americans.
m: No Americans are French.
C: No Parisians are French.

Counter: Find a case, like the one above, where the premises are true but lead to a false conclusion.
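
Here is that counterexample worked out in Python with a tiny invented roster: both negative premises hold, yet the conclusion "No Parisians are French" comes out false.

    parisians = {"Amelie", "Jean"}
    americans = {"Bob"}
    french = {"Amelie", "Jean", "Pierre"}

    premise_major = parisians.isdisjoint(americans)   # No Parisians are Americans
    premise_minor = americans.isdisjoint(french)      # No Americans are French
    conclusion = parisians.isdisjoint(french)         # "Therefore no Parisians are French"

    print(premise_major, premise_minor, conclusion)   # True True False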



41. Drawing an Affirmative Conclusion from a Negative Premise. Here at least one of the premises is negative, but a positive conclusion is illegitimately drawn from it.

Example: All kittens are animals, and some animals don't kill and eat humans, so some kittens must kill and eat humans.

Translation:
M: All kittens are animals.
m: Not all animals kill and eat humans.
C: Some kittens kill and eat humans.

Counter: Cite a case, like this one, where the true premises are combined to produce an absurd conclusion.



42. Existential Fallacy. A particular conclusion is drawn from universal premises.

Example: All purple people are humans, and all humans have rights, so some purple people have rights.

Translation:
M: All purple people are humans.
m: All humans have rights.
C: Some purple people have rights.

Counter: Point out that there aren't any purple people. To eliminate the technical fallacy you could rephrase the argument as, "All humans have rights--If purple people existed, they would be humans--Therefore, if any purple people existed, they would have rights."
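
A quick set sketch in Python shows why the emptiness matters: with no purple people at all, both universal premises are (vacuously) true, but the particular conclusion is false. The two names are invented placeholders.

    purple_people = set()                 # there aren't any
    humans = {"Ann", "Bart"}
    things_with_rights = {"Ann", "Bart"}

    premise_major = purple_people <= humans                    # All purple people are humans (vacuously true)
    premise_minor = humans <= things_with_rights               # All humans have rights
    conclusion = len(purple_people & things_with_rights) > 0   # "Some purple people have rights"

    print(premise_major, premise_minor, conclusion)            # True True False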



FALLACIES OF EXPLANATION

These are fallacies tied to how empirical evidence is interpreted.



43. Subverted Support. The phenomenon being explained doesn't exist.

Example: The reason Randy molests children is that he was molested himself as a child.

Translation:
M: All people who molest children were molested themselves when they were children.
m: Randy molests children.
C: He does so for that reason.

Counter: Show that the minor premise is false. If, in fact, Randy has never molested any children, there is no need to explain why he does so.



44. Non-Support. Evidence for the phenomenon being explained is biased.

Example: "The reason everybody who matters buy my books is because I am universally loved by people who matter."

Translation:
M: Everyone who matters buys books written by the people they universally love.
m: Everyone who matters buys my books.
C: I am loved by everyone who matters.

Counter: Clearly this hinges on whether "everyone who matters" was properly counted. If the arguer defines "people who matter" as "people who buy my books," the argument is circular. If only members of the arguer's family buy his books, one can question why only his family constitutes "people who matter." Etc.



45. Untestability. Evidence for the phenomenon being explained cannot be tested.

Example: "Mars was originally a cube rather than a sphere, but aliens carved it into a sphere. They removed all traces of the carving, so the fact that no planet-scale corners remain in Mars's orbit only shows how efficient the aliens were."

Translation:
M: Aliens carve planets in a way that leaves no traces.
m: Mars is a planet, and has no traces of carving.
C: Therefore aliens carved Mars.

Counter: Show that the argument, while technically valid, can never be proved or disproved. If there is no way to test the theory, it's a useless theory.



46. Limited Scope. The theory which explains can only explain one thing.

Example: "Judge Crater disappeared because a dimensional gate, never seen before and never to reappear since, opened up and snatched him."

Translation:
M: One-time miraculous events can make people disappear.
m: Judge Crater disappeared.
C: A one-time miraculous event did it.

Counter: Note that a theory which only explains one thing is not really a useful theory. Technically, this theory can be empirically tested; discovering the Judge's body in circumstances proving the Mafia offed him would disprove the dimensional-gate theory. But it might be that no evidence will ever, in fact, come to light, in which case this theory isn't much help. If it were universalized--"all people who disappear do so because one-time dimensional gates snatched them"--one could then test it more easily by showing at least one case of a person who disappeared for a different reason. Even "some people disappear, etc." could be tested in principle.



47. Limited Depth. The theory which explains does not appeal to underlying causes.

Example: "My dog likes Alpo because he's a dog."

Translation:
M: Dogs like Alpo.
m: My dog is a dog.
C: He likes Alpo.

Counter: Show that, while technically valid, the theory doesn't really explain anything. Why do dogs like Alpo? (Assuming they do--I have no information on that.) What does their being dogs have to do with it? Why does my dog in particular like that brand of dog food? "Because he's a dog" doesn't really tell me anything.



FALLACIES OF DEFINITION

These are fallacies which involve poor definitions.



48. Too Broad. The definition includes items which should not be included.

Example: "Apples are red fruits."

Translation:
M: Any red fruit is an apple.
m: This thing here is a red fruit.
C: It must be an apple.

Counter: Show that the definition also fits other things which are not the subject of discussion. In this case, "red fruits" could include raspberries, strawberries and cherries as well as apples.



49. Too Narrow. The definition does not include all the items which should be included.

Example: "A book is something with words, pictures, and conversation. There are no pictures in Jane Austen's Pride and Prejudice, so it's not really a book."

Translation:
M: All books contain words, pictures, and conversation.
m: Pride and Prejudice includes no pictures.
C: Pride and Prejudice isn't a book.

Counter: Show that there are things which everyone agrees should be called by the word, but which lack things from the given definition. In the example above, Pride and Prejudice is obviously a book, yet it has no pictures.



50. Failure to Elucidate. The definition is more difficult to understand than the word or concept being defined.

Example: "Heliumites are people from Barsoom."

Translation:
M: All Heliumites come from Barsoom.
m: These folks are Heliumites.
C: They must come from Barsoom.

Counter: Show that the explanation is just as obscure as the thing people want explained. In this case, people who don't know what "Heliumites" are usually won't know what "Barsoom" is either. Okay, ya dragged it out of me--Barsoom is the Martian name for Mars in Edgar Rice Burroughs's Mars books, and Helium is its most powerful nation-state.



51. Circular Definition. The definition includes the term being defined as part of the definition.

Example: "Insects are grasshoppers if and only if they are made of grasshopper parts."

Translation:
M: Things made of grasshopper parts are grasshoppers.
m: These little beasties are made of grasshopper parts.
C: They must be grasshoppers.

Counter: Show that the definition includes the term being defined. Someone who can't identify a grasshopper won't be able to recognize grasshopper parts, either.
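
One way to feel the problem is to turn the definition into code. In this Python sketch (the insect data is invented), deciding whether something is a grasshopper requires already knowing whether it is a grasshopper, so the check chases its own tail until Python gives up.

    def is_grasshopper(insect):
        # "Insects are grasshoppers if and only if they are made of grasshopper parts."
        return all(is_grasshopper_part(part) for part in insect["parts"])

    def is_grasshopper_part(part):
        # ...but a grasshopper part can only be recognized as a part of a grasshopper.
        return is_grasshopper(part["belongs_to"])

    insect = {"parts": []}
    insect["parts"].append({"belongs_to": insect})

    try:
        print(is_grasshopper(insect))
    except RecursionError:
        print("The definition just sends us around in circles.")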



52. Conflicting Conditions. The definition is self-contradictory.

Example: "You can't be hired as an astronomer unless you have experience as an astronomer."

Translation:
M: You can only be hired as an astronomer if you have astronomical experience.
m: You don't have such experience.
C: You don't get the job.

Counter: Show that the conditions in the definition conflict. If you can't be hired without experience, it's equally true that you can't get experience without at some time having been hired. If this definition were literally true there would be no astronomers; the field would be empty.





Page created: 04/01/2017
Last modified: 04/04/2017
Author: BPL