While it is true that the practice of the arts is the highest of human pursuits, this does not exempt those practices, nor their practitioners, from some of the Lowest Common Denominator (LCD) habits that non-artists indulge in. By this I refer to the trap of falling into a logical fallacy.
Logical fallacies are those things that we all sort of understand, until our feet are held to the fire. So, before I explore logical fallacies, and present one of the most daunting to the field of art, let me first expound upon what they are, how they are used, and give a good example of one in the arts, as shown in a recent popular post on this website.
Perhaps the best place to start is the top website for logical fallacies. Its definition of a logical fallacy is the simplest and most unencumbered one you will find:
A logical fallacy is, roughly speaking, an error of reasoning. When someone adopts a position, or tries to persuade someone else to adopt a position, based on a bad piece of reasoning, they commit a fallacy.
Having established what a logical fallacy is, it might be interesting to note the most popular general logical fallacies. This webpage has a fine list of what it considers to be the Top 20 logical fallacies. These include many that you will find in the comments sections of blogs, like the Huffington Post, and include ad hominem — which is not attacking a claim, but the claimant, i.e., ‘Well, look at the person making that argument, he’s not that smart, etc.’ Another is the appeal to authority fallacy, which is akin to: ‘Well, if Steven Pinker/Richard Dawkins/Roger Ebert says that theory/religion/film is bad then it must be.’ Two others that are especially popular in online dialectic are strawmanning and moving the goalposts. The former is basically when someone has stated position A, and their foe, unable to argue against it logically, pretends that the opponent has stated not A but B, which he can logically defeat, even though B was never the basis for his opponent’s claim. The latter is when someone has proven their claims to a reasonable standard, and, dissatisfied with having lost the argument, the opponent demands that even more proof, or more detailed proof, is needed.
As for the provenance of such methods, there are two ways to approach this. The first is the personal, and it is germane to note that human beings are scared, willful little creatures, and often they cannot accept that they are wrong in anything. Therefore, if they lose an argument, they cannot accept that they were simply wrong. The other side may have won, but, damn it all, the other guy had to cheat. Thus the need for a scapegoat is created, and that cheater also had to have had a weapon in his dirty arsenal. Hence comes the rationale for the creation of the logical fallacy, as a weapon to protect the ego from having to acknowledge that someone else knew a bit more than one did in a given instance.
The second is the cultural and/or historical—and here things get a bit more interesting. The first of the great philosophers to tackle not only big ideas, but the way people argued those ideas, was Aristotle. He compiled them in a work called On Sophistical Refutations. The title for the book actually came from the term for the wandering teachers of ancient Greece, the sophists, for sophist derives from the Greek sophia, meaning wisdom, later personified as a goddess. It’s no small irony that the word for wisdom came to represent its opposite—bad reasoning, or sophistry.
Now, on to the example. A recent article on the Huffington Post, called The 15 Most Overrated Contemporary American Writers, by Anis Shivani, has generated much traffic and many comments. As with most such lists, commentary has been furious and negative, usually with comments claiming the writer neglected this hack or had the impudence to include that hack. By and large, the list is pretty good, although one could tally a list a hundred or a thousand times the length, due to the MFA writing mill scams, and still not get the names of all the bad writers clogging up bandwidth and contributing to deforestation.
The article and its comments are exemplary models of logical fallacies, online and in action. First, let me tackle a comment that exhibited a typical logical fallacy. The commenter chided Shivani for using a dangling modifier in his mention of nature poet Mary Oliver, who actually was once a good poet, but has not written anything of worth since Ronald Reagan left office. Shivani’s ‘solecism’?
‘America’s best-selling poet along with Billy Collins, and that tells you all you need to know about how the public views American literature.’
Now, you may ask, what exactly is wrong with Shivani’s sentence? Well, what is a ‘dangling modifier’?
Incorrect: While driving on Greenwood Avenue yesterday afternoon, a tree began to fall toward Wendy H’s car.
(It sounds like the tree was driving! This actually appeared in a newspaper article. An alert reader wrote, “Is the Department of Motor Vehicles branching out and issuing licenses to hardwoods? Have they taken leaf of their senses?”)
Adding a word or two makes the sentence clear.
Correct: While Wendy H was driving on Greenwood Avenue yesterday afternoon, a tree began to fall toward her car.
When a modifier “dangles” so that the sentence is meaningless (or means something other than your intent), restate it and add the words it needs in order to make sense.
Seems like a good thing, no? Well, in fact, no, it is not, because it assumes that the reader has no ability to figure out what is being stated. As an example, in the sentence by Shivani, a grammar Nazi would state that the modifier, ‘America’s best-selling poet along with Billy Collins,’ is incorrect because its meaning is muddled, since we do not know who is being referred to. The ‘technically’ correct rephrase would be something like, ‘Mary Oliver is America’s best-selling poet, along with Billy Collins.’ But, logically, this is nonsense, since both the page with Shivani’s piece and the box with the writing have Oliver’s name and photo. And, since such constructions usually follow a sentence or paragraph in which the subject or noun is mentioned, the claim of a dangling modifier as an error is usually inane, tendentious, and picayune, at best, and flat out logically wrong, at worst. The reason that many people use such techniques is as a shield, which is another form of fallacy, although not a logical one, but an egoistic one. They cannot pick on anything wrong with the writing’s style or its ability to leave a deep impression, create a character, or simply elucidate a beautiful and/or non-trite scene, so they get picayune about something they are usually wrong about anyway. This makes the claimant not only a grammar Nazi, but, technically, a prick.
Similar so-called ‘errors’ include the dread dangling participle and the sinister split infinitive. But, like the dangling modifier, neither is a serious logical problem. A typical example of a dangling participle is something like:
After being whipped fiercely, the cook boiled the egg.
To the grammar Nazi, alarums of confusion abound as to who is being whipped—the cook or the egg. But even a toddler awaiting breakfast, if told such a thing, would have no logical problem deducing that it was, indeed, the egg that was the whipped thing. As for split infinitives? Well, look no further than Star Trek, the popular television and film sci-fi franchise, whose motto was ‘To boldly go where no man has gone before!’ OK, so what is wrong with that sentence, logically or not? Turns out nothing, for the way a grammar Nazi would rewrite it—‘To go boldly where no man has gone before!’—is no less logical, for the adverb ‘boldly’ is still modifying the verb go, whether it precedes or follows it. In short, most grammar Nazis, like the commenter who chided Shivani, note supposed ‘facts,’ but lack the wisdom to correctly apply them; which, naturally, rends the whole project. It’s akin to the many bad editors, online or offline, who will follow grammar Nazi tactics, all the while letting the most egregious forms of bad writing—stereotyping, banalities, poor grammar, poor punctuation, clichés, poor dialogue, needless repetition, needless description, etc.—go uncommented upon. This is why MFA programs often churn out MLS writers who can construct Chicago Manual Of Style business letters, and nothing else (see the prosists Shivani named); that’s when they are not churning out people who indulge the very same ills I described in the last sentence (see James Frey, Dave Eggers, David Foster Wallace, T.C. Boyle, and Joyce Carol Oates).
Of course, just because Shivani suffered at the hands of a logical fallacy does not mean he was not capable of committing one, himself:
If we don’t understand bad writing, we can’t understand good writing. Bad writing is characterized by obfuscation, showboating, narcissism, lack of a moral core, and style over substance. Good writing is exactly the opposite. Bad writing draws attention to the writer himself. These writers have betrayed the legacy of modernism, not to mention postmodernism. They are uneasy with mortality. On the great issues of the day they are silent (especially when they seem to address them, like William T. Vollmann). They desire to be politically irrelevant, and they have succeeded. They are the unreadable Booth Tarkingtons, Joseph Hergesheimers, and John Herseys of our time, earnestly bringing up the rear.
So, having laid out a good case, in general, against the writers he named, when it comes time to reveal the rationale behind his choices, we see that Shivani is basically chucking darts, for his definition of good writing consists of things that have little technical merit, and far more of his own personal likes and dislikes (not unlike the mistakes made by B.R. Myers in his famed, but flawed and often silly, polemic, A Reader’s Manifesto, published in The Atlantic Monthly in 2001). In short, in toto, Shivani has committed the overall fallacy of the non sequitur, for his claims about good writing do not follow from his examples.
I could go on for many pages with examples, but, whereas I give technical and concrete examples of the things that constitute poor writing, Shivani mostly lists techniques, not definitive flaws, and techniques can be used for the good or the ill. By contrast, a cliché can only be used for the ill; if it is subverted, as example, it is no longer a cliché. In a show of good will and humanity, I’ll avoid didacticism and just take the two that I find most annoying: obfuscation and a lack of a moral core. On the former, one need only look to the greatest of published poets, such as Hart Crane or Wallace Stevens, to see that they often effectively used obfuscation to force a reader to examine the thing obfuscated and/or the obfuscatory mode. What Shivani really meant to say was ‘confusion,’ not obfuscation. It is a sign of bad writing, or art, when an artist does not know what they are doing, such as the director who admitted as much about this film. But obfuscation itself can be used poorly or well.
But an even more grotesque and outrageous logical error is made when Shivani tries to equate morality with artistic quality. Quite simply, the artistic quality of a work has no bearing on its content, moral (a dubious term in itself) or otherwise. Mein Kampf, as example, is a horrid piece of ethical bilge, but it’s a bad book not because it is racist, stupid, and the work of a psychotic mind. It is bad because it is ill wrought, stylistically, suffering from many of the ills I described above. To go to the other extreme, there is writing that is good, but, at its ethos, is dishonest to the max. The poetry of Rainer Maria Rilke, as example, is filled with great writing, technically, as well as many ethical claims and pieties. But, as the poet—who was an adulterer, liar, and absentee father, and had many other personal flaws—manifestly failed to live up to his art’s claims, his writing was, to its core, hypocritical and unethical. Yet that makes none of his great poems any the less great for the artist’s and/or art work’s ethical ambiguity. Ethics and art simply exist in wholly discrete domains, what paleontologist Stephen Jay Gould called Non-Overlapping Magisteria (NOMA), although he was specifically referring to science and religion. Thus, Shivani commits another logical fallacy, that of conflating the artist with the art work, as if they were indissoluble; a form of the Straw Man fallacy. If a sonnet is great, by definition, it matters not whether its poet was Rilke, Elizabeth Barrett Browning, or Dr. Mengele.
This leads me to one of my own pet peeves, the fallacy of self limits; especially as applied to the arts. I have seen this exhibited many times by fellow artists and critics, who claim that because they are limited, and cannot see or understand something, it therefore is not so. The most common example of this comes when someone takes up the Postmodern claim that everything is subjective, and therefore all claims of quality are inherently biased by the claimant’s past, psyche, biases, sex, sexual preference, race, religion, height, childhood trauma, etc. This is patent nonsense. Sure, everyone has biases, but recognizing those biases is the key to objectivity. One may be in a subjective place when arguing who was the greater poet: Hart Crane, Rainer Maria Rilke, Wallace Stevens, etc. But one is perfectly objective in arguing that any of those three poets is a better/greater poet than any of the poetasters Shivani mentioned in his article. There simply is no comparison. Subjectivity exists, but in the small gaps between easily identifiable (i.e. objective) quanta.
A good example of this fallacy writ large, online, comes from a post that appeared late last year on the blog of film critic Roger Ebert. In it, Ebert settled a bet between two friends who were arguing over whether I was a good critic, or a critical provocateur along the lines of New York based film critic Armond White. Numerous examples of my criticisms of films reviewed by Ebert were given, and Ebert ended up praising me, my website, and my writing, despite my dissents from his opinions. Yet, naturally, for the first few months, almost none of the hundreds of commenters on the blog thread could get over their icon’s being criticized, so they engaged in many forms of logical fallacies against me and my writing. The biggest and most repeated fallacy, though, was the claim that I did not believe in subjectivity. This is manifestly false. I do believe in subjectivity, just not in absolute subjectivity, as the lazy Postmodernists do.
And, of course, I have never claimed that one can be objective 100% of the time. There are a few reviews where I put a disclaimer about some personal connection I have with a film, but those are rare enough that there is no real conflict of interest that could bias me against a film or filmmaker. Perfection is not attainable, but one, as a critic especially, must strive to be objective. A critic owes it to his readers to serve their needs, not his own. If I can claim 99 out of 100 of my reviews are objective then that’s good; the one time human frailty damns such perfection does not negate the rest, so I can say I am objective without the unneeded and condescending caveat that there will be rare occasions when I am not. It’s as if one were to claim that Halle Berry were not a gorgeous woman because, every few days, after a bad night’s sleep, she wakes up looking only so-so. It’s silly. So, let me state this, which I’ve elucidated before, and this is the fallacy of self limits:
The critic who claims there are no objective criteria on which to base an opinion of something is really stating their own inability to do so; even as they try to make it seem that it is I who am saying there’s no such thing as subjectivity. I’ve never said that. All I’ve done is counter the Postmodern claim that there is no objectivity. There certainly is. And it only takes one objective fact to objectify the whole world in parallel to it. Those who claim that all is subjective are a curious lot, for if they really believed that then they would not argue the point, since arguing it belies their belief in its objective nature. If all is subjective, after all, then all viewpoints are equally viable. But that is, again, patently silly. Objectivity need not be total, but subjectivity must, especially when broadbrushed in such a manner. Imagine a Pacific Ocean of pure water. This is subjectivity. Drop a single drop of blood into it, and it’s no longer pure. Objectivity exists in such a manner, although the ratio is much closer to being all objective, with subjective patches, here and there—i.e., one can argue the subjective differences between like quantities but not unlike quantities. The whole belief that all is subjective is mere dogma, an ideology with no basis in the real world, i.e. a logical fallacy. The fallacy of self limits.
In other words, one can enjoy or not enjoy a film, on one level, and emotionlessly evaluate it on another, simultaneously—this is called multi-tasking; but the point is that enjoyment is not fundamentally connected to quality. This is why one should never argue one’s subjective likes, only objective quality. Sometimes, people will try to counter the fallacy of self limits with another Straw Man argument, claiming the issue is not subjectivity but taste. Yet physical taste is a wholly different matter from artistic taste. The taste buds on every person’s tongue are wired differently; therefore two people can taste the same thing yet experience it as something else. In the arts and sciences, by contrast, brains recognize that words (to use the case of writing) have distinct meanings, grounded in something any human brain can share, unlike a taste bud’s signal. Therefore they are objective.
And it is because of objectivity’s damning ability to just be, despite the human will to change everything to suit the human’s needs, that one gets all the logical fallacies hurled at one, or at one’s opinions of art. Often this devolves down to grammar Nazis, who will try to chide me for ‘bad writing,’ not because my work is filled with clichés, or fails other objective criteria, but because of the aforementioned (and ironic) subjective ‘solecisms’ of grammar based in no logic. I often get emails over why my early essays used the number 1 over the word one, or why I still prefer a single dash/hyphen to an en- or em-dash, or the single word alot over the two words a lot, but that’s a stylistic preference, not any indicator of the quality of the writing (although I could argue that, in the former, my dash-hyphen adds ambiguity to the sentences, and often serves a dual purpose, and, in the latter, that my form of the term is prescient, as many words that begin with the prefix a- started out as two words).
But none of that actually touches objectivity; thus logical fallacies proliferate, especially in the arts, because of it. And, not so oddly, this is perfectly logical, if utterly wrong.