
In defensionem iacentis felem mortuum
(In defense of heaving dead cats):
The Art of Persuasion

Peter A. Taylor
August 23, 2013




1. Conflicting advice

I have a confession to make: I like H. L. Mencken. But I feel guilty about it. Mencken's use of ridicule and verbal abuse clashes with my ideals of civility and rational scientific debate. Is it in my interest to use ridicule and abuse? Is it effective? Is it morally permissible? Experts in the art of persuasion give conflicting advice:

Mencken wrote,

The pedant and the priest have always been the most expert of logicians — and the most diligent disseminators of nonsense and worse. The liberation of the human mind has never been furthered by dunderheads; it has been furthered by gay fellows who heaved dead cats into sanctuaries and then went roistering down the highways of the world, proving to all men that doubt, after all, was safe — that the god in the sanctuary was finite in his power and hence a fraud. One horse-laugh is worth ten thousand syllogisms. It is not only more effective; it is also vastly more intelligent.

Saul Alinsky wrote in Rules for Radicals,

Rule 5: Ridicule is man's most potent weapon. It's hard to counterattack ridicule, and it infuriates the opposition, which then reacts to your advantage.

Dale Carnegie wrote in How to Win Friends and Influence People that you can't win an argument with a customer. If you want to sell something, you have to be agreeable.

Jonathan Haidt says something similar, that effective arguments involve cajolery, aimed at the emotional "elephant" rather than the rational "rider". (He explains the metaphor in the linked chapter of The Happiness Hypothesis, but goes into more detail in The Righteous Mind.)

Benjamin Franklin wrote,

A Spoonful of Honey will catch more Flies than a Gallon of Vinegar.

St. Francis said,

Preach the gospel constantly. If necessary, use words.

The delicately nuanced Ann Coulter lifts the veil ever so slightly, hinting at a sort of combined arms campaign:

Leave the name-calling to professionals.

Alan K. Henderson went so far as to develop a set of rules for name-calling.

Nicholas Phillipson said,

Exchange when it's discussed in TMS [The Theory of Moral Sentiments] is about trading — we trade sentiments with each other, looking for a sort of psychological deal; it's nice to have a discussion with people and to feel that what we are saying is regarded with sympathy. We relish the process of trading our ideas. It's what happens in any tutorial at any university, any conversation in a park. The process he's describing in TMS and that regulates our social lives is exactly the same process which he is discussing when we trade our goods and services. He says: We spend all our lives trying to persuade people. That's why rhetoric matters.

Then there is the advice of the Roman general Scipio Africanus Sun Tzu:

Build your opponent a golden bridge to retreat across.
 
 

2. Clarifications

In order to answer these questions, I have to unpack them a bit. What exactly am I proposing to do? What are my motives? What are my audiences' motives? How does the human reasoning process work?

By "ridicule", I am referring to an emotional attack on a person or a belief, rather than an appeal to logical argument (e.g. a proper syllogism). Technically, these are not valid arguments. They are ad hominem fallacies or emotional appeals. In an ideal world, they ought not to be effective or necessary. These attacks may be directed at the person I am trying to influence, or his beliefs, but more likely, they are directed at someone else.

My immediate motive is presumably to persuade someone that my position is correct. But why do I care what other people think? My situation could be complicated. Maybe the person I'm trying to persuade is actually myself, and the proposition I'm actually trying to advance is about my intelligence or moral stature rather than the topic nominally under discussion. Or maybe I'm more interested in persuading other members of my tribe of my loyalty than in persuading my nominal target audience that I have the facts right.

And why does my audience care what I have to say? I may be asking them to do something that they do not individually have the power to do, and they may lose face by even listening to me.

And what do I mean by "persuade"? This gets into the nature of belief systems. I may be trying to get someone to act in the face of uncertainty. This means either changing his perceptions of the odds that my proposition is true or changing his perceptions of the consequences of a "wrong" answer. Think of Pascal's wager.
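
To make that arithmetic concrete, here is a minimal sketch in Python (the numbers are invented for illustration): the listener's expected cost of acting on my claim is the product of his estimate of the odds that I'm wrong and the stakes of being wrong, and I can lower that product by shrinking either factor.

    # Expected-loss view of acting under uncertainty (illustrative numbers only).
    def expected_cost_of_acting(p_claim_false, cost_if_wrong):
        """Expected loss from acting on my claim if it turns out to be false."""
        return p_claim_false * cost_if_wrong

    # Two different routes to the same persuasive effect:
    print(expected_cost_of_acting(p_claim_false=0.5, cost_if_wrong=100.0))  # 50.0: the listener balks
    print(expected_cost_of_acting(p_claim_false=0.1, cost_if_wrong=100.0))  # 10.0: change his odds
    print(expected_cost_of_acting(p_claim_false=0.5, cost_if_wrong=20.0))   # 10.0: change his stakes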
 


 

3. Why rational arguments don't always work

Modern science is one of the most important inventions of human civilization. But the reason it took us so long to invent it and the reason we still haven't quite understood what it is 500 years later is it is very hard to be scientific. Not because science is "expensive" but because it requires a fundamental epistemic humility, and humility is the hardest thing to wring out of the bombastic animals we are.

Pascal-Emmanuel Gobry

In an ideal world, political and other debates would be conducted by rational arguments. Often there would be some uncertainty as to the facts under discussion, but this would be addressed through the application of Bayes' theorem. We might have biased initial probabilities, but everyone would be honestly trying to update these probabilities with unbiased information.
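
As a minimal sketch of what that honest updating would look like (the prior and likelihoods below are invented for illustration), each unbiased observation moves the estimate by Bayes' theorem, and a biased starting point eventually washes out:

    # Idealized Bayesian updating: start from a (possibly biased) prior and
    # honestly revise it on each new piece of evidence.
    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        """Posterior probability of the proposition after one observation."""
        numerator = p_evidence_if_true * prior
        return numerator / (numerator + p_evidence_if_false * (1.0 - prior))

    p = 0.2  # a biased initial probability
    for _ in range(5):  # five observations, each twice as likely if the proposition is true
        p = bayes_update(p, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
    print(round(p, 3))  # climbs well above the biased starting point (roughly 0.89)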

In the real world, people behave the way Jonathan Haidt described them in The Righteous Mind: There are things we must believe or must not believe because the evidence is overwhelming, whether we like it or not. But there is a vast middle range where we may believe; where we can plausibly claim what we like. Biases are important where the degree of plausible deniability of the relevant facts is high.

Bias can be manipulated in a number of ways, including (1) creating an illusion of consensus, (2) making threats, (3) feeding people cherry-picked anecdotes (biased data sets) and taking advantage of the fact that most people aren't very good at statistics, or (4) using inconsistent standards of evidence. "Biased data" does not have to be false, and if one gets one's anecdotes from like-minded people (e.g. Facebook friends), one can easily acquire highly biased sets of anecdotes without any conscious intent to introduce bias.

Communities that create an illusion of consensus are sometimes known as "echo chambers". As Michael Totten wrote, "An echo chamber is an invisible mind prison." Along with biased data (and outright falsehoods), in these echo chambers one also gets bad arguments: put-downs, or easily rebutted fallacious arguments, presented in a context where the rebuttals can be suppressed or ignored. This allows one to pretend that a string of abusive ad hominem attacks constitutes a legitimate objection to the credibility or legitimacy of opposing views, or that a string of bullshit is a compelling proof.

But uncertainty is one thing; irrationality is another. The effectiveness of different kinds of rhetoric depends not only on the degree of uncertainty over the relevant facts, but also on the kinds of incentives people have for getting the right answer. If the right answer is a private good (the decision-maker pays for his own mistakes), a rational argument may be sufficient even if it involves statistics. But often, as in politics, the costs of a bad decision are dispersed over a large group or borne by a third party. Here the listener is more likely to be interested in flattery (status competition) or displaying group loyalty than in tangible gain or loss. (I like to imagine that my political thought is driven by an aversion to cognitive dissonance, but perhaps I flatter myself. Only my neuropsychologist's Positron Emission Tomography scanner knows for sure.) The degree of a listener's emotional investment in his current beliefs or his group affiliation, and the group's investment in its current beliefs, largely determine his persuadability in these cases.

But it is not just the listener who is biased. The person making an argument may be trying not to overcome his listener's biases but to manipulate them. Persuasive speakers may find themselves in several very different situations.

There is also the question of how to deal with appeals to authority. Even in a scientific argument where test results are supposed to be reproducible, authority matters because we don't have time or enough motivation to redo all the research in the history of the world. In legal procedures, there are witnesses whose testimony is not reproducible. Discrediting an opposing witness is a legitimate legal tactic. Eugene Volokh had some good points in his slippery slope article about ad hominem arguments and appeals to authority.

Regarding my audience, and choosing an appropriate set of expert advice on ridicule, I find myself engaged in a sort of triage. One class of listeners may be persuadable by rational argument. A second class may be persuadable, but only if I provide some sort of emotional carrot and stick. A third class may be too heavily invested in their position to be persuadable.

 
 

4. Biased sampling procedures

In a war there can be no philosophical innocence (and there has never been philosophical innocence). Even when epistemology pretends to concern itself with things that we just happen not to know, its objects infect it with dissimulation, camouflage and secrecy, making it complicit in the transmission of the lie. It plays out war games of concealment and exposure, disinformation, distraction, and feint, entangled in the complex skein of signal manipulation and evaluation known to all militaries as 'intelligence'.

Nick Land, Philosophy in a War-Zone

All propaganda is lies, even when one is telling the truth.

— George Orwell

One aspect of verbal abuse has to do with biased data. I may want to convince you that my political opponents are statistically more likely than my allies to be jerks and liars. I may do this by various forms of character assassination, and by misrepresenting the most extreme jerks ever to associate with any of my political opponents as being typical of all of them. Political discourse consists largely of using anecdotes as a substitute when proper scientific data are not available. Data sets are often not enumerable (e.g. the set of all naughty human actions), and the interpretations are subjective.

John Derbyshire has a nice rant about the distinction between anecdotes and data. Basically, both are reports of events. The difference is that "data" implies that my sampling procedure is reasonably free of bias, or at least free of deliberate bias. "Anecdote" implies that my sample may have been chosen specifically to illustrate a point, and may be a "man bites dog" story. The problem is that in politics, judging character or various outcomes is subjective. There are no unbiased data, and even if there were, the human mind isn't good at distinguishing between biased and unbiased samples. Thus, much of political activism (including journalism) consists of feeding people cherry-picked samples (anecdotes) and encouraging them to interpret these as unbiased samples. But it would also be wrong to mischaracterize a set of anecdotes as unrepresentative when they really do illustrate a trend, which was Derbyshire's point.
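
To illustrate the difference, here is a toy simulation (the numbers are invented and the curation is exaggerated for clarity): both parties have the same rate of jerks, but a feed that passes along only the other side's bad behavior and my own side's good behavior tells a very different story.

    # Toy model of anecdote curation (invented numbers, exaggerated filtering):
    # both parties have the same 10% rate of "jerks", but the curated feed passes
    # along only the other side's jerks and my own side's non-jerks.
    import random

    random.seed(1)
    population = [(party, random.random() < 0.10)      # (party, is_jerk)
                  for party in ["mine", "theirs"] * 5000]

    def jerk_rate(sample, party):
        flags = [is_jerk for p, is_jerk in sample if p == party]
        return sum(flags) / len(flags)

    unbiased = population                                # proper data
    curated = [(p, j) for p, j in population             # cherry-picked anecdotes
               if (p == "theirs" and j) or (p == "mine" and not j)]

    print(jerk_rate(unbiased, "theirs"), jerk_rate(unbiased, "mine"))   # both near 0.10
    print(jerk_rate(curated, "theirs"), jerk_rate(curated, "mine"))     # 1.0 versus 0.0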

The difference between the way the Koch brothers are treated in left-leaning media and the way the Tides Foundation is treated is an example of bias through cherry-picking anecdotes. Molehills become mountains and mountains become molehills.

The recent scandal involving Edward Snowden and the NSA surveillance of American citizens, juxtaposed against the IRS harassment of the Tea Party movement, disturbs me largely because of the potential for the manipulation of electoral politics through selective leaking of embarrassing information. If everyone has the same amount of information about what political figures on both sides of the aisle are doing, and if the laws are enforced in an evenhanded manner, then it's not so big a deal whether people have a lot of privacy or not. But if one side owns all of the data and all of the megaphones, they are in a position to introduce an insuperable amount of bias into the electoral process. As an example, consider on the one hand how Senate candidate Barack Obama was able to get his Republican opponent Jack Ryan's "sealed" divorce records "unsealed", and on the other hand how little voters know about Obama's past, even into his second term as President of the United States (i.e. his time at Columbia University is the proverbial riddle wrapped in a mystery inside an enigma).

Speaking of the 2012 presidential election, I note that 93% of black voters supported President Obama. (I also note that Florida is a large swing state.) Might a bit of racial hysteria help with black turnout? But Trayvon Martin might not have made such a good poster child for the anti-racism movement if Officer Charles Hurley's shenanigans regarding the burglary and drug evidence had been widely known. John Derbyshire observes,

If Miami Schools Police Department hadn't been cooking the books to make themselves look good, Martin would still be alive, even if possibly in jail....

A real piece of work, Officer Hurley. But no-one seems to have thought to connect him to Trayvon Martin, for whose death he bears some indirect responsibility. Where were America's investigative journalists? Trying to find evidence that George Zimmerman forgot to feed his goldfish back in 1983, that's where.

James A. Donald, thinking along similar lines, wonders whether the Catholic church can survive in the long term in the face of biased anecdotes promoted by a hostile press corps. He writes,

Anyone in the Roman Catholic Church can, and quite possibly will, have a big news story based on six degrees of separation to pedophilia dropped on him. Suddenly it is on the six o'clock news that Bishop X is some how connected to Y, who is somehow connected to Z, who is reported, forty years after the alleged incident, to have fondled a little boy.

Biased selections of anecdotes can be in play for a number of reasons. There may be some natural human cognitive biases that cause certain events to be remembered more readily than others. It is also possible that one side in a debate has more scruples than the other side, such as a sincere Christian who thinks false witness is sinful vs. an Alinskyite radical who thinks that all's fair in love and war, and all politics is the moral equivalent of war. But the usual cause of lopsided anecdote flooding is that one side has a bigger megaphone than the other.

This raises questions about what the sides are in the argument and who owns most of the megaphones. The obvious labels for the sides in a political argument in the US are "Democratic" and "Republican", but as the recent IRS harassment scandal illustrates, the targets for the harassment were not establishment Republicans but Tea Party activists, and several of the senior IRS people involved were Republican appointees. If we substitute "liberal" (what Europeans would call "social democratic") and "conservative", that doesn't necessarily clarify the picture very much. Radley Balko, reacting to complaints about "liberal" media bias, says "The Media Aren't Liberal." They're statist. They're statist even on issues like drug policy and eminent domain where the American left is sympathetic to libertarians. So we need to be appropriately suspicious of political labels. But let's move on to owning the megaphones.

The mainstream news media tend to be overwhelmingly Democratic Party sympathizers. Nick Land notes that many journalists and Democratic Party politicians are literally married to one another. Journalists' political campaign contributions appear to favor Democrats over Republicans by some 88% to 12%. (Beware cherry-picking. Are these representative numbers? A quick web search shows claims of bias that are all over the map. I've seen claims that there is a net conservative bias. I've also seen claims that bias varies from "liberal" outlets like MSNBC that are 99% Democratic to "conservative" outlets like the Wall Street Journal that are only 85% Democratic, with the average on the order of 95%. But campaign contributions seem like a relatively objective measure.) Government employees (Mickey Kaus estimates 70%) also tend to support the Democratic Party (presumably less so in the military and possibly NASA). Washington DC is a spectacularly one-sided safe area for the Democratic Party; the "reactionary" Foseti suggested, "The only real way to get the IRS not to target conservative groups would be if Obama had asked them to do it" (they are reportedly jealous of their "independence"). The Washington Examiner reports,

Of the IRS lawyers who made contributions, a whopping 95 percent gave to Obama. And if you think that's a high percentage, 100 percent of the lawyers at the Department of Education, the United Nations and - no surprises here - the National Labor Relations Board (you know, the pro-union agency that sued Boeing) contributed to the Obama campaign.

The set of government employees notably includes public school teachers, who supposedly teach future voters what they need to know in order to vote wisely. The entertainment industry and academia are also notoriously left-wing (even economists are mostly Democrats). This conglomeration of news media, entertainment, and academia is known collectively as The Cathedral (or The Clerisy, if you're a Tom Wolfe fan). (Mencius Moldbug mentions The Cathedral in his Gentle Introduction, but Is journalism official? may be more helpful. Shannon Love is also helpful. Update: the best discussion of The Cathedral is probably in Open Letter part 4.)

Update: The Washington Post reports that 7% of journalists are Republicans.

Leftists will often claim that the country is run by corporations, and that corporations are "conservative". But does the leftist vision of "conservatism" have anything to do with what self-identified "conservatives" actually believe in, such as laissez faire capitalism? There are some issues, such as pollution, where corporations often do want "laissez faire" in the sense of being "left alone" to pollute (as opposed to the people downwind being "left alone" in the sense of being legally protected from nonconsensual harm). But mostly what corporations seem to want is cronyism. "Corporations" include the New York Times (owned in part by hyper-rich Mexican telecom mogul Carlos Slim Helu, whose TracFone company coincidentally has received over $1.5 billion from the US government to supply the now-famous "Obamaphones"), government contractors, and large established businesses that benefit from regulation that suppresses smaller competitors. The theory that "corporations" are consistent champions of laissez faire flies in the face of everything we know about political lobbying. One can't plausibly argue that Krupp was opposed to a German military buildup in the 1930s. Angelo Codevilla notes, "...the upper tiers of the U.S. economy are now nothing but networks of special deals with one part of government or another." Here's a diagram of part of one such network. Here's another. This doesn't look to me like laissez faire. It looks to me like government and big business being joined at the hip.

At the grassroots level, much of political discourse consists of people trying to stuff the Bayesian ballot box while being convinced that they are heroically trying to restore balance in the face of the other side's stuffing of the box. Political partisans on Facebook post anecdotes ad nauseam about how the opposing political party is full of cranks and assholes. Yet if anyone offers them examples of their own party members' crankery or assholery, they rise up in righteous indignation about how it doesn't count because everybody does this and you're a bad person for being so uncivil as to mention it. How can an honest person tell which side is trying to mitigate bias and which side is trying to increase it?

I don't think the average person has enough self-honesty to be able to do this. But if you want to take a stab at it, I would start by reading Eliezer Yudkowsky's essay on lonely dissent.

Lonely dissent doesn't feel like going to school dressed in black. It feels like going to school wearing a clown suit.

"Scientific" studies in fields where the results are hard to replicate function much like anecdotes. A snake oil manufacturer can commission a hundred studies of the effectiveness of its product that are reliable at the 95% confidence level, quietly throw away the 95 true negatives, and vigorously publicize the 5 false positives. Governments can produce similar effects by selectively issuing grants to researchers who consistently produce results that the politicians like (e.g. Joe Romm).

[Update, 3-12-2014: A good example of "stuffing the Bayesian ballot box" showed up on Facebook recently. This video is being promoted. This video is being sat upon. Yes, by the same L. A. Times that sat on the John Edwards scandal, leaving the serious political reporting to the National Enquirer.]
 


 

5. Facts vs. interpretations

Propaganda often consists largely of irresponsible accusations regarding other people's intentions, which are a form of abusive ad hominem attack. Mencius Moldbug has a nice post on Propaganda, by Jacques Ellul, with quotations:

It seems that in propaganda we must make a radical distinction between a fact on the one hand and intentions or interpretations on the other; in brief, between the material and moral elements. The truth that pays off is in the realm of facts. The necessary falsehoods, which also pay off, are in the realm of intentions and interpretations.

Similarly, Salman Rushdie described certain strains of Islam as "self-exculpatory and paranoid". "Self-exculpatory and paranoid" interpretations of events are just as good as factual lies, and much harder to disprove. If my car ran over your dog, how could I ever prove that it wasn't deliberate? And if, in a political negotiation with me, you didn't get everything you wanted, does this mean that I am being horribly abusive? What is the baseline against which this abuse is measured? If dissimilar people are treated in dissimilar ways, is this abuse? What if dissimilar people are treated more similarly than I want them to be? The "war on women" strikes me as an example of this sort of paranoia. It seems inconsistent with reported asymmetries in family law and with the cavalier manner in which male prison rape and false rape accusations are generally ignored. Glenn Reynolds responds to one such story:

THAT'S FOUR STRIKES TOO MANY: Woman is finally jailed after FIVE false rape allegations against her ex-boyfriends in eight years. The sentence seems awfully light, considering.

The judge, however, seems to have missed the most obvious victims — the men she falsely accused. Instead: "Judge William Gaskell told Black that her history of made-up rape claims had made it more difficult for genuine rape victims to be believed." Well, that's true, too, of course. But. . . .

Note Harry Frankfurt's book, On Bullshit. Frankfurt defines bullshit in terms of not caring about the truth, rather than outright lying. If one accepts Frankfurt's distinction, good propaganda may often be "bullshit" rather than "lying", because the person making unreasonable interpretations may not know for certain that they're false. (I would say that this is still a type of lying, because the propagandist is being dishonest about uncertainty, but I can see why the distinction might be useful.)

[Update: Handle introduced me to a useful word: Deutungshoheit, which he defined as "social norm moral-narrative air dominance". Alternatively, "interpretational sovereignty". This term seems to combine elements of biased sampling procedures, interpretational shenanigans, and ad baculum arguments.]
 


 

6. Threats and bribes

Implicit threats (ad baculum arguments, or "appeals to force") and bribes may be more common than most people realize. Threats include not just threats of violence, but also threats of boycotts, social demotion, and ostracism. They need not be overt; making an example of someone is sufficient to threaten everyone else. Bribes are similar in their effects to threats; for present purposes, the distinction is not important, but threats are a bigger problem because bribes are usually more costly to the person offering them.

But it's not immediately obvious why ad baculum arguments should be effective. You can put a gun to my head and make me say that I believe in the Flying Spaghetti Monster (FSM), but how can you make me actually believe in it? Some more realistic examples would be Lysenkoism, the belief that immunizations cause autism, or the belief that my cow stopped giving milk because my neighbor is practicing witchcraft.

There are several answers to this question.

Pascal's wager takes some explaining. Belief is not a yes/no proposition; there are degrees of uncertainty, and degrees of commitment to a proposition. You can't force me to flip a switch from absolute disbelief to absolute belief, but you can force me to invest reputation and "face" in a belief (e.g. by publicly endorsing it). You can force me to invest time and money in an organization that demands conformity to a belief. You can force me and my peers to produce and disseminate propaganda. I may know that my participation is insincere, but I may not know how many of my peers are sincere. Ad baculum arguments may also exploit a natural tendency for people to kid themselves about uncertainty.

You can also look at this in terms of Jonathan Haidt's must believe vs. may believe. Give people incentives that make them want to believe your narrative (e.g. by making your narrative flattering to them). As long as (1) there is significant uncertainty over the facts, or even plausible deniability, and (2) the costs of any errors are borne by other people, that's probably good enough. People can generally be relied upon to lie to themselves and others about uncertainty. When presented with contrary evidence, they will try to wriggle out of Haidt's mental "handcuffs".

When is it permissible, and when is it not permissible, to fire someone for going against the party line? Is it ever permissible? Yes. I don't have a problem with the Catholic church "silencing" Matthew Fox; the Catholic church is openly and honestly in the business of promoting "credence goods". (A credence good is a "type of good with qualities that cannot be observed by the consumer after purchase, making it difficult to assess its utility", such as the correctness of one's theological or moral doctrines.) If you have irreconcilable differences over the credence goods the Church is promoting, you need to find another employer. No one is forcing you to be a Catholic priest.

However, problems arise when people lie about the threats they are making, about their reasons for making them, and about their legitimacy in doing so. Threatening to fire an insubordinate employee is legitimate. Threatening someone with character assassination (i.e. lying or bullshitting) is not. Honestly promoting credence goods is legitimate. Lying about one's faith and calling it "science" is not. Saying that you support freedom of religion in situations where people can't reasonably opt out, and then punishing people for questioning your credence goods, is not legitimate.

In the context of verbal abuse, these sorts of problems often arise when some authority or clique is set up as a gatekeeper who controls social status. Mencius Moldbug offers Brad DeLong as an example.

Many UR readers have had the priceless educational privilege of growing up behind the Iron Curtain. These readers will identify Professor DeLong's tone at once: it is the tone of the Soviet humor magazine Krokodil. I will take the liberty of Anglicizing, and call it "crocodile humor."

Crocodile humor is the laughter of the powerful at the powerless. It is not intended to be funny. It is intended to intimidate.

More generally, this phenomenon may be referred to as speaking power to truth.

Often threats are indirect, made against one's employer instead of, or in addition to, oneself; and made through intermediaries (e.g. fellow employees, acting under civil law in ways analogous to privateers acting under Letters of Marque). There isn't necessarily a clear distinction between threats of ostracism, character assassination, boycotts, and firing. I may not care what you think of me, but I care if I can support my family.

Recently, Jason Richwine has joined the ranks of James Watson, Edward O. Wilson, Bjørn Lomborg, and John Derbyshire, people who have had their careers more or less ruined for violating "the prevailing structure of taboos". This has led to some discussion among a variety of other thought criminals on the nature of witch hunts. The most thorough investigation I know of was done by "hbd chick", she of the neglected shift key and the strong stomach. The title of her main post comes from the notorious Malleus Maleficarum: "to disbelieve in witchcraft is the greatest of heresies".

historians who have studied witch-hunts, both religious and political ones, have found that they generally take place during times of turmoil or uncertainty. they are rituals of a sort in which social (and sometimes physical) boundaries are defined — witch-hunts are, at these critical moments, extravagant ways of working out who's in the in-group and who is not. and woe to anyone who is not.

She follows up:

witch-hunts — whether looking for "actual" witches or religious heretics or even political witch-hunts (and, yes, that includes the mccarthy hearings, too) — while they may vary in the particulars, are all fundamentally the same thing: a method of delineating the boundaries between the in-group and the out-group — between what is acceptable behavior and what is not.

Note that this is not just about acceptable and unacceptable behavior. There are double standards. What is acceptable for the out-group is much narrower than what is acceptable for the in-group. Robert Byrd could get away with a lot more than Strom Thurmond could.

The Richwine firing also prompted some discussion on Nick Land's Outside In website. He quoted Peter Brimelow:

Earlier this week, I was talking to a Harvard academic who is familiar with Richwine's work. He commented that there were simply some subjects the study of which is incompatible with an academic career.
"That's a remarkable thing in a free country," I said.
"This isn't a free country," he replied.

In the comments,

Anonymous says:

What strategic sense does it make for heritage to publish a study, let every respectable person imaginable denounce it as racist, and then disavow it? Why do they even publish it in the first place?

admin Reply:

You can't ask those sort of questions about conservatives, it will just drive you insane.

John Derbyshire also remarked on the failure of Richwine's tenured thesis advisors at Harvard to defend the work that they themselves had approved.

The Chinese scholar Sima Qian spoke up for a friend who had earned the wrath of the Emperor. Thus further infuriated, the Emperor ordered Sima Qian to suffer the penalty of castration, and this penalty was carried out.

We live in gentler times, thank goodness. Profs. Borjas, Zeckhauser, and Jencks are in no peril of castration for their offenses against State Ideology. But really, in their cases, what difference would it make?

The real purpose of a witch hunt isn't to persuade people that witches are real; it is to establish social dominance, and the ability to coerce people is a valid proof of social dominance. Killing people and promoting false beliefs and insane public policy are merely collateral damage.
 


 

7. Some propositions about verbal abuse

I don't want to be in the position of saying that only bad people should be able to use effective rhetoric. If there is a tradeoff between civility and effectiveness, I want honest, decent people to be able to sometimes decide in favor of effectiveness.

What are legitimate and illegitimate uses of ridicule and abuse?

1. I suspect that much of the common, garden-variety name-calling on Facebook, for example, is an attempt to disengage from an argument that one is losing, without having to acknowledge that one is losing. Overtly disengaging would suggest that I am losing, so instead, I try to get you to disengage by being abusive. This is an illegitimate use of verbal abuse.

2. Honest attacks on the credibility of witnesses are legitimate. Dishonest ad hominem attacks (character assassination) and attacks on originators or presenters of arguments are illegitimate.

3. There is a distinction between honest uncertainty, where the facts are plausibly deniable, and dishonesty, where they are not. It is legitimate to use ridicule as an emotional stick, to punish someone for implausible deniability. There are two cases here, depending on whether the person at whom my ridicule is directed is the same person I am trying to influence. I may ridicule someone in order to motivate that person to exercise greater mental discipline. The more likely case is that I want to make an example of a few extreme cases for the enlightenment of the others. This is where the "combined arms" or "good cop/bad cop" approach comes in handy. One person can be the heavy, who causes or threatens loss of face, while his partner can be the sweet voice of reason who allows the target to save face.

4. Attempts to flood the discussion space with biased data samples (stuffing the Bayesian "ballot box" with cherry-picked anecdotes) are illegitimate. Attempts to compensate for the other side's stuffing of the Bayesian ballot box with cherry-picked anecdotes by giving counter-examples are legitimate. Unfortunately, the average person can't tell the difference.

5. Creating an illusion of consensus by being obnoxious to infidels is generally illegitimate in a public place (a place where the "infidel" has as much right to be as anyone else). (This is another interpretation of what goes on on Facebook. People write things to drive infidels out of what they think of as their "sanctuary", but the forum is far more public than they realize.) There may be exceptions if the "infidel" is being disruptive. Ridicule is definitely effective in dispelling an illusion of consensus, but my question is whether it is necessary. This is where Mencken's dead cat enters the picture. The dead cat doesn't just demonstrate that an infidel exists; it also demonstrates that the infidel has confidence in his disbelief. Something like Pascal's wager is going on. The cat hurler demonstrates not merely that he has a relatively low estimate of the probability that the church is backed by God, but that even after multiplying this probability by the consequences of a wrong answer, the product is still safely low.

6. Threats and bribes (ad baculum arguments, or "appeals to force") are generally illegitimate.

7. Trying to delegitimize other people's feelings is illegitimate. If you find yourself doing this, it's probably a sign that you're not really trying to persuade the person whose feelings you're attacking. In many cases, your audience will have little reason other than sentiment for even listening to you with open minds. In any case, you don't have authority over other people's feelings. You can ask people to imagine things from your perspective, but sympathy is not a syllogism.

8. One final point is not to press ahead too quickly after scoring a victory. Be mindful of your audience's feelings. Part of the Scipio Sun Tzu "golden bridge" trick is giving your targets enough time to slink away after a successful engagement in a manner that allows them to save face. As Moldbug says, "Doubt is a slow flower." This is only partly because of the need for time to think through the consequences of a novel idea. Shifting positions without losing face also takes time. Resist the temptation to show off.
 


 

8. How to be a good pariah

How, then, do I go about popularizing novel ideas in the face of a vast information industry devoted to demonizing anyone who tries to take them seriously, supported by millions of people whose social positions depend on maintaining the dominant narrative? You can think of advertised beliefs as shibboleths that indicate which club a person is a member or aspiring member of: the dominant, high-status "Tutsi" club; or some group of impotent "Hutus", whom the Tutsis denounce over every loudspeaker as moral degenerates. How do I convince people to abandon the high-status club and join the low-status club? Should I try to protect my dignity, and take on airs of scientific professionalism, to make my club better able to compete for social status with the people who own the loudspeakers and the legal system? Or do I heave a dead cat into the sanctuary and roister away? Or do I find a partner and play "good cop/bad cop"?

This question is on my mind now in part because of some commenters at the "neoreactionary" Outside In website (namely James Goulding and "Vladimir") who reacted with horror at some clownery at The Radish involving photoshopped "Magic" card game images. How can we be taken seriously by others if we don't take our own dignity seriously?

I take Vladimir's explanation (in the comments here) as representative of Goulding's view. It attempts to distance the Deep Thinkers from the "hordes of newbies" that will give them a bad name. Maintaining a good name is arguably important to get serious people to take your ideas seriously (and perhaps to turn professional and get some grant money?). Here's part of it, but read the whole thing:

The fundamental problem is that the recent explosion of the "neoreactionary" internet meme is a grave danger to the whole loose community of quality thinkers and writers who fall under this moniker. It will attract large crowds of people — too tiny to matter in real life, of course, but still far more numerous than the number of people who can contribute usefully to this discourse — who will be attracted by its symbolism and image, or just by the vain desire to air their uninteresting opinions, and who will drown all interesting and worthwhile discussion. (Witness the gradual decline of Moldbug's comment section from dazzling heights to a hideous cesspool, as his readership expanded and he didn't bother policing it. Or the similar decline of Less Wrong, as hordes of newbies poured in, attracted by the prospect of building a "rationalist" image.)

Furthermore, as a more immediate concern — which is relevant even now, before a serious decline has taken place — we're facing a situation where the "neoreactionary" meme, along with the key terminology that we use as a shorthand for our shared insights (e.g. "the Cathedral"), will acquire a particularly ugly and low-status "nerd porn" connotations, as a result of the newcomers' efforts to express their new-found image in ways they mistakenly believe are cool and witty. "Reactionary" was a wonderful name to co-opt, but the recent trends of this sort (culminating in those "magical hero" cards) have already made it somewhat embarrassing for me to be identified and associated with this word. The progress that we've made in inventing useful theoretical vocabulary will be negated if the use of this vocabulary becomes embarrassing because it will inevitably invoke this sort of image.

My sympathies tend to run with comments by VXXC in a discussion at Foseti's on the neologism, "The Cathedral". He views the probable motivation as status competition internal to the out-group:

Well if you want to do nothing then dump it. Take Land's advice, ignore them and keep "The Cathedral".

If you wish however to bask in the glow of your Intellectual Superiority and keep out the riff raff...and do nothing...why even bother?

Foseti:

It's really more mockery than propaganda. If there's one thing we need, it's more mockery — disdainful mockery.

Vladimir:

Unless you have a credible way of asserting higher status than the target of your mocking, it's extremely hard to pull off disdainful mockery that won't fall flat and make you look like an angry loser or a delusional crackpot. Very few people have the literary talent necessary to achieve this consistently.

Foseti:

I agree. Yet isn't "the Cathedral" a perfect example of what works? It mocks those who think they're above religion, it conveys information about the structure of their beliefs, and it's beautifully concise.

In an unrelated blog post, Nydwracu writes,

A good tactical goal would be to break the association of Universalism and high status. The Cathedral relies on its soft totalitarianism. The less negative incentivization dissent carries, the more dissent will be observable. Crocodilism plays a key role in the maintenance of the appearance of consensus: dissent is associated with both inherent low status and with low-status groups. If you aren't a Universalist, you must be a fedora-wearing MRA, or a bitcoin libertarian, or an inbred neo-Nazi!

I think Nydwracu has this right. The Cathedral's magic is in the control of social status, which it achieves largely through mud-slinging. It seems to me that the worst thing you can do in this situation is to let on that you're allergic to mud. No one who attracts attention to ideas that the Cathedral hates can avoid quickly being thoroughly caked in mud. The easiest way to minimize the damage from this is to laugh it off. Take the Cathedral's power away by letting everyone know that you have a thick skin and a sense of humor, and you don't care about popular opinion (i.e. what the Cathedral says about you).

I also think that Vladimir's point about asserting higher status is wrong. My favorite analogy here is "scorched-earth" tactics. A retreating army doesn't have to have an abundance of food in order to want to destroy the enemy's food supply. Similarly, you don't have to have high social status in order to drag your enemy down to your level. In the case of the Cathedral, this negates its principal strength.

In this situation, where your arguments are not getting a fair hearing because an enemy is abusing his position as a gatekeeper of social status, Mencken was right. Heave a dead cat into The Cathedral and roister away.
 


 
