Third in a series about doubt, rationalization, being mean while debating, and a problem with the empirical assessment of the effectiveness of arguments.
Before I go on with the main argument, I'd like to go a little meta.
A discussion of the effectiveness of strategies can always be read as strategy advice. If doing X is the most effective way to achieve Y, people who want Y probably should do X. And if Y is something desirable, then it reduces to "you should do X".
Mostly, this is the source of our interest. Of course we can care about strategies for purely abstract reasons. For example, the four-color theorem is interesting even to people who never want to color any actual maps. But most of the time I think we care about strategies because we care about results. Taking the theme of this series specifically, I care about what kinds of debating strategies annoy their targets, because I want to have fights with people I like but disagree with without putting them off. By comparison, I don't care about horses, so I don't think I have any worthwhile thoughts on effective strategies for horse training. So we care about strategies because we care about results, and that is fine.
But I think it can get more problematic when we start talking about strategies for other people's goals. We often have very strong preferences about other people's behavior, so the question of what they could do to achieve their goals can easily get entangled with the question of what we want them to do. And humans suck at reasoning against interest.
For example, some people want the Catholic church to become more like modern protestant churches and some people want it to undo the last council. Both sides sell their respective policy not only as the right thing to do but also as the best way to reach people in these modern times. Similarly, now that the American Republicans have lost the last election, adherents of that party's different wings are talking about how it could regain popularity by ditching their respective inner-party enemies. It's funny how often the optimal strategy coincides with what the advisor wants the advisee to do anyway.
So basically I think our thinking is much less likely to be clouded by politics if it stays away from advice to people very different from us. In some cases this can be achieved by simply avoiding that kind of question. For example, the usual "if you really wanted X you would Y" argument has an extremely high Sturgeon Number. But sometimes we care about a strategy both for ourselves and for other people. In that case I don't have a solution, but I can offer some heuristics.
One is to look at symmetries. If my opponents should do something, there is probably something equivalent I should be doing, and a self-serving bias might be reversed and more noticeable in that equivalent. For example, I tend to get fairly angry about the atheist canard of belief-in-belief. Not only is it a self-serving excuse to dismiss people who disagree, it also sets up a classic "you can't prove you're not a witch" situation. That way the dumbest kind of atheist can stay sure nobody could disagree with them for intelligent reasons. But on the flip side, this means it's probably not all that winsome to explain to atheists how they are motivated by pride or anger at God or whatever.

Of course I think I'm right, and that means people who disagree with me while having the same information must be irrational at some level. Consistently, they must think the same about me. But engaging opponents' suspected irrational motives rather than their arguments will rarely be useful. For one, internal motives are almost always much more complex than the average kitchen-table psychology. More importantly, people get angry and stubborn when they are engaged as psycho-fixing objects. Also, there is the off chance that it's actually me being irrational. So it's best to appeal to their rationality, so that it can ultimately overcome whatever now stands in its way.
I think this is a special case of a more generally useful strategy: where there is a major social polarization, it is often useful to look for isomorphisms between the two sides and then use them to hunt for inconsistencies. More provocatively, if I need to annoy one side, it's usually worthwhile to annoy both. So, for example, neither kind of sex ed (abstinence-only or comprehensive) shows any signs of effectiveness. Also creationists = Jesus Mythers, Ann Coulter = Michael Moore, Whiggism = the good old days, etc.
Another approach is simply trying to be aware of our biases. For example, I'm trying to think about good proselytizing strategies for myself, using atheist proselytizing strategies as a mirror, because they are what I know from the other side. But the truth is, I would also like atheists to be more friendly so that fighting with them would be more fun, so there is a potential bias there. I can't totally avoid it, but often becoming aware of a problem is the most difficult part of solving it.
Lastly, it might help to remember how little influence we have on other people. Most people just aren't waiting for strategy advice from their adversaries. In my particular case, the log files indicate this blog has a grand total of about twenty more or less regular readers, all of whom probably already have strong opinions on proselytizing strategies. So the chance of convincing anyone of anything here is basically zero.
So I don't expect this series to convert anyone to my strategical approach. On the other hand, given that I know that, I think I'm relatively safe from the bias I've been discussing in this post.
Next in this series: what I originally promised for now, i.e. a much more effective atheist argumentation style. (It occurred to me only after its main window of opportunity had passed, and it interests me mainly as a mirror image of a Christian one.)