Don't Think of a Ticking Timebomb
Andrew Gelman over at Overcoming Bias proposes that there is a "one-sided bet" fallacy. He describes it as "see[ing] only half the problem" and "not realiz[ing] that there are any tradeoffs at all."
To get my nitpicking out of the way: his description isn't quite accurate, since he applies it to many situations where the person has simply underestimated (or actually MISestimated - not necessarily just "misunderestimated") the real tradeoff, rather than being completely unaware that there is a tradeoff at all. On top of that, I'm not entirely comfortable calling this a "fallacy," probably because of the scalar nature of its application. That is, people can be more or less guilty of this (corresponding to the extent to which they misjudge the actual tradeoff they're making), and so it seems like a data-gathering problem more than a logical problem. And in those examples where it IS an absolute failure to recognize the existence of a tradeoff, isn't it simply a normal false dichotomy?
But never mind - I think "sure bet fallacy" (if I may restyle it such) is a useful term all the same - because it certainly is true that people frequently misjudge the tradeoff they are making when they make decisions, when they really ought to know better. One of Dr. Gelman's examples is particularly frustrating for me in this age of rampant smoking bans. He talks about asking students whether they will accept a large sum of money in exchange for a one-time one-in-a-billion risk of instant death. Of course you should take the wager - because you FREQUENTLY accept the same wager for much smaller sums of money. You might go running across the street because you left your wallet, which contains credit cards with limits much lower than the hypothetical sum of money, and in so doing you incur a MUCH greater than one-in-a-billion chance of instant death in a traffic accident. (And to add injury to insult, you're not even guaranteed to lose your money to theft if you simply wait for the light to change before crossing.) People get really irrational when someone tells them that something could kill them. In fact, most things can kill you if applied properly, and, in the immortal words of Frank Drebin, "We all take risks. You take a risk just getting up in the morning, crossing the street, or sticking your face in a fan."
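To put rough numbers on that intuition, here is a minimal sketch of the expected-value comparison. The dollar figures and probabilities are illustrative assumptions of mine, not anything from Dr. Gelman's post:

```python
# A rough expected-value comparison of the classroom wager vs. an everyday
# street-crossing dash. All figures below are illustrative assumptions.

def expected_death_cost(prob_of_death: float, value_of_life: float) -> float:
    """Expected cost (in dollars) of accepting a given probability of instant death."""
    return prob_of_death * value_of_life

# Assumed "statistical value of a life," used only to make the comparison concrete.
VALUE_OF_LIFE = 10_000_000  # $10 million

# Gelman's hypothetical: one-in-a-billion chance of death for a large sum (say $1,000).
classroom_risk = expected_death_cost(1e-9, VALUE_OF_LIFE)   # ~ $0.01
classroom_payoff = 1_000

# Dashing across traffic to retrieve a wallet: a (generously low) one-in-ten-million
# chance of a fatal accident, to protect perhaps $500 in cards and cash.
street_risk = expected_death_cost(1e-7, VALUE_OF_LIFE)       # ~ $1.00
wallet_value = 500

print(f"Classroom bet: risk ~${classroom_risk:.2f} in expectation to gain ${classroom_payoff}")
print(f"Wallet dash:   risk ~${street_risk:.2f} in expectation to protect at most ${wallet_value}")
```

On those (assumed) numbers, the everyday dash is a far worse bet than the classroom one - which is the whole point: we take the bigger risk for the smaller payoff without blinking.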
I would like, however, to take issue with one of Dr. Gelman's examples in particular:
Torture and the ticking time bomb: the argument that it's morally defensible (maybe even imperative) to torture a prisoner if this will yield even a small probability of finding where the ticking (H)-bomb is that will obliterate a large city. Again, this ignores the other side of the decision tree: the probability that, by torturing someone, you will motivate someone else to blow up your city.
Hmmm.... seems like Dr. Gelman is falling victim to his own fallacy. That is, he's here insisting that this decision be approached as a tradeoff when for all practical purposes it probably isn't. I mean, right, we can get nitpicky and acknowledge that there is a real risk that there's actually someone alive in the world who would resort to bombing another one of your cities just because you tortured someone to save the city in question from certain doom - but I think any detached observer will admit that this is SO unlikely and implausible as to be hardly worth our time considering. After all, anyone who will resort to bombing a city is probably already in the situation where he hardly needs this single incident of torture to motivate him. If torturing this man to save your city from certain doom will set him off, then as likely as not you'll do something else next week without even knowing it that will set him off as well. He's either not a rational agent, or he's already your sworn and deadly enemy (or, most likely, both) - and in either case it seems futile to worry too much about how your actions will affect his decisions, since he's coming after you in the end anyway.
This particular application of the "sure bet fallacy" strikes me as particularly dangerous, and particularly rampant among anti-war leftists, in political discourse today - so much so that it's worth countering here.
Jonathan Rauch describes himself as "anti-anti-war" in an essay apologizing for his pro-war stance - an essay that I found quite a good read simply because it spoke to some of my own feelings on the subject. I was more vocal in my support of the Iraq War than Rauch (and I "technically" still support it, whereas he has recanted) - but I have also admitted that a lot of my motivation back in 2003 was "anti-anti-war" sentiment. And yes, fine, that's an emotional and irrational motivation. But the point is it's there for a lot of us. The anti-war crowd is so maddening that it's sometimes really hard to say things that please them, even if those things are right. And the main reason, in my opinion, that the anti-war crowd is maddening is because they make EXACTLY the mistake that Dr. Gelman is making here. They overestimate the extent to which things we do form the basis of a terrorist's motivation. Maddeningly, they also seem to believe that people who blow up innocents make moral calculations in the same way that we do. That is, they tend to assume that bad things we do will result in further attacks, but they neglect the possibility that many of the good things we do (such as treating women as equals or allowing for legal sodomy) will form an equal or even more powerful motivation for terrorists. And so they form their opinions about our policy based on seriously skewed tradeoff parameters.
While we're making up logical fallacies, I would like to call this the "Star Trek Fallacy." Or maybe the "Picard Fallacy," or the "Roddenberry Fallacy." And I would like to further be so bold as to say it's the product of a weak moral sense.
Star Trek - ESPECIALLY Next Generation Star Trek (which I despise largely for exactly this reason) - is replete with examples where doing "the right thing" turns out to be politically expedient in some grander scheme of things. We get a couple of scenes of obligatory moral agonizing, but we the viewers know that in the end things will work out. Doing the right thing will end up being proof to some megapowerful entity that humanity is worth being given a shot at continued existence after all, and so it turns out in the 47th minute that the REAL choice we were facing wasn't whether or not to torture our captured terrorist and in so doing save a city, but rather whether or not to torture a terrorist and in NOT so doing ensure the continued survival of our species (???).
It's maddening how often a show that purports to be atheist pulls this trick - which really is the equivalent of what you were taught in Sunday school. You can lie and net a minor short-term benefit, or you can face GOD'S WRATH! Well, as Immanuel Kant points out, this isn't a moral choice at all. Moral choices are made to preserve the rule that demands them - but traditional Christian moral choices are very much made on the basis of a reward system - i.e. they aren't moral choices at all.
Someone will point out, of course, that Picard/Kirk don't know that there are all-powerful space entities watching them when they make the choices they make. But damnit, after 50+ episodes of this they bloody well OUGHT to know, right? The point is that it gives the audience the same sort of goofy (misnamed) "moral" sense that believing in God does: the illusion that every action is being scrutinized by someone with a psychotic sense of reward and punishment. Even if Picard and Kirk aren't allowed - for purely dramatic purposes, you understand - to get used to this schtick, we the audience do (and some of us, in fact, get REALLY sick of it). And the effect of all this is to allow the writers to justify what really is a sense of skewed priorities on their part. I.e., that torture is always bad, or that carrying a gun only leads to trouble, or whatever else the Hollywood cause du jour is.
Now my problem is in trying to describe this concisely! Here's a stab:
The Star Trek Fallacy is the assumption that all moral prohibitions are absolutes - the elevation of a rational constraint to the status of religious taboo, complete with an (implied) mystical reward and punishment system.
I suppose Dr. Gelman, if he ever were to read this, might object that this is but a particular instance of the "sure bet fallacy" since it, too, involves an essentially skewed look at the actual tradeoff being made. But I think there's another dimension to this one; the difference is that people are operating as though there were a clear and one-sided tradeoff (an "offer they can't refuse") but telling themselves that their motivations are pure and moral. Or, more accurately, they have imagined a tradeoff more extreme than the one that actually exists for the purpose of flattering their moral sense. They want to have an absolute moral prohibition, but they can't quite bring themselves to it, so they concoct an implied punishment and act as though that were real (or at least, as though it were more of a threat than it actually is).
And alright, so this is even less of a "fallacy" than Dr. Gelman's "sure bet fallacy." I never claimed that this was a real logical fallacy - just that it's an annoying and frequent error that you have to face when arguing with irrational people.
Which brings me back to the anti-war crowd. Although I, unlike Rauch, consider myself a supporter of the War (even now), like Rauch I'm probably more "anti-anti-war" than I am "pro-war." After all, my ideal foreign policy is Ron Paul's - a policy of complete withdrawal from meddling in the world. Meaning: no wars, no foreign aid - only trade and extradition treaties on a case-by-case basis. My support for the war is very much a "given the circumstances" kind of thing.
So I get really tired of people who base their political opinions on fantasy - and I think a lot of the hand-wringing about torture is exactly that. It's the best example of the Star Trek Fallacy anyone is likely to see. People talk as though their opposition to torture is based on a deep moral concern, but in the very next sentence they give the game away and tell you that torturing people will only embolden terrorists and recruit more disaffected Arab youth to the cause. Meaning really it's based on an irrational fear, and they're just styling it as "moral." If this were stated rationally, of course, it wouldn't be a "fallacy" (in our new, looser use of the term) at all. It would be an ultimately empirical argument that we could resolve by taking an honest look at the situation and evaluating our chances. But since the way in which they're stating their position is dishonest, we can't ever have that discussion. Because the "tradeoff" is really just an excuse to allow them to morally simplify. What's at stake, ultimately, is their need to feel "civilized." They think that's what "civilized" people do, and this is a club they want to be in.
It's much the same with arguments for gun control. The most common one you hear is that "guns are bad." As though that were the whole of the situation. Guns are tools. True - their purpose is to kill or injure in the same way that a hammer's purpose is to drive in a nail, and since we dislike killing and injuring we tend to think that guns, which have this as their purpose, are inherently bad. But this is a gross oversimplification. Sometimes killing and injuring result in something better than not doing so - specifically, they prevent unjust killing and injuring. It's a nice fantasy to think that you're the kind of person who has no killing or injuring in his life, but that's not the real world. In the real world, we're often in situations where killing and injuring are the least-bad options. (You don't even have to think of a criminal attack here. Imagine a pack of dogs attacking one of your children - if, say, you live in the UP.) People who like to think of themselves as exemplars of "civilized behavior" are of course "against injury," but then aren't we all? So they elevate the prohibition on guns to an absolute so that they can feel "civilized," even though it's based on an improper reading of the situation. The telltale sign that we're in the presence of the Star Trek Fallacy is the frequent claim that "people who try to defend themselves with guns are more likely to have their own gun used against them." There's that practical consequence of "wrong thinking" again. The implication is that "if you sully yourself by owning a gun, the unnamed entity that watches everything we do and punishes the uncivilized will see to it that you are attacked and that your gun makes it worse than it would otherwise have been."
The truth about torture is this: it's COMPLETELY justified to save a city from a certain bombing. I would say that it is, in fact, the "civilized" thing to do in this case. There is nothing "civilized" about refraining from torture and so letting hundreds of thousands die. If torture seems to be the only way to save their lives, and you are reasonably sure that the person before you is guilty of (or at least complicit in) trying to murder hundreds of thousands of innocents, then it is truly perverse to get all morally squeamish about torture. Just as I believe it is perverse NOT to shoot someone who breaks into your house in the middle of the night. You may safely presume, at that point, that he intends to do you harm. Shoot the fucker. He started it after all.
As for the supposed "tradeoff" - i.e. that torturing this guy (and saving the city) will only result in another city being bombed - PLEASE! Surely this is Dr. Gelman's Star Trek Fallacy imagination at work. If we resort to torture, the cosmic entity that watches us to make sure we're civilized will see to it that something terrible results. But something terrible is already going to happen if we DON'T torture. And of course, in reality, the supposition that terrorists are all that motivated by our decision (or not) to torture probably doesn't hold up. We spent decades not torturing terrorists before we decided to start doing (a pussified version of) it, and yet they still attacked us. So fine - technically I suppose it's possible that there's someone out there who would actually refrain from ever attacking us again if we only just don't torture and let this one pet bombing go through. But in the real world, that risk is so negligible as to be something we can practically ignore. For all practical purposes, the "sure bet fallacy" in this situation isn't a fallacy at all. If there's a ticking timebomb and a man in front of you who will tell you how to disarm it if you torture him, and thousands of lives really do hang in the balance - then for God's sake (heh) torture the guy!
But alright - to be fair, Dr. Gelman worded his hypothetical in a different way than we're used to seeing it. He adds in the "even a small probability." That's not something you normally see in this scenario. Normally the "ticking bomb" scenario is that torturing the man in front of you is more or less guaranteed to net you the information you need - and the finesse comes in at the end when the questioner, having gotten you to agree that torture under these circumstances would be fine (or even imperative, as I personally believe), then asks what your certainty threshold is. That is, how "certain" do you have to be that torturing the man in front of you will get you what you want? And that's the rub, of course - we'll all have different thresholds, and there's virtually guaranteed to be someone in the government whose threshold is lower than yours - i.e. who is willing to torture for less than you are. And so the point is generally that you should oppose torture across the board rather than run the risk of having given your stamp of approval to someone who will abuse the permission you the voter "gave" them to torture in your name. And this is indeed something worth thinking about. Because the lower the threshold of certainty, the more, I suppose, we run the risk of inspiring people to further bombings as a result of our actions.
Nevertheless, I think the real tradeoff in such situations is a moral one and not one of military consequences. The real tradeoff is one of whether we're torturing innocent people - something we avoid not because it will lead to further bombings, but because we value justice. Because even with Dr. Gelman's revised version of the normal way this conundrum is pitched, the risk of collateral damage from torture is probably pretty low. Unless torture by our military is rampant and completely random, it seems unlikely to inspire anyone to plant bombs who wouldn't have already done so for independent reasons. Sure, they may cite it as justification - but that doesn't mean we need to take them at their word.
And so I think this case is misstated. I agree that "sure bet fallacy" can be a useful (if informal) term in debate - but I do not think it really applies to this situation. Unless the person is advocating systematic torture (something that no one, as far as I am aware, is even hinting at advocating), the tradeoff Dr. Gelman is talking about is so negligible as to be something that we can ignore for the purposes of practical discussion.
Torture is morally permissible in situations where it is applied to someone who is likely guilty and its application will result in saving innocent lives. This point cannot be rationally disputed. The question is how "costly" we want to make torture so as to avoid mistakes. I agree with Noah (pc), who wants to make the cost absolute in legal terms. We can forbid torture across the board and trust in human nature. That is, we trust that anyone who actually finds himself in a classic "ticking bomb" scenario will be willing to break the law and risk the penalty to save thousands of innocents. We then put our trust in an executive pardon to override the legal consequences. This is, after all, precisely the kind of situation that executive pardons were intended to cover. So: a full legal ban on torture, but with the moral understanding that it is sometimes necessary, albeit only in extreme circumstances. The legal ban just ensures that we use torture only as a last resort (a notion that probably cannot be codified legally to everyone's satisfaction and/or without some pretty nasty unintended consequences).
Now - in the interest of fairness I will admit that I may have grossly misrepresented things. I'm going, after all, off of a three-line blog entry from someone I don't know rather than an in-depth conversation with someone I do. But Dr. Gelman's speculation about torture smacks of the Star Trek Fallacy to me. We've been given an excuse, based on a gross exaggeration of consequences, to elevate something to the status of an absolute prohibition that isn't actually absolute, nor should it be.