In a discussion of the utility of religious beliefs, Norm makes this claim:
“A person can’t intelligibly say, ‘I know that p is false, but it’s useful for me to think it’s true, so I will.’”
(Here, p is some proposition – that is, some statement about the world which may be either true or false, but not both and not neither.)
In fact, a person can indeed intelligibly say this, and pure mathematicians do it all the time. Perhaps the easiest example in mathematics to grasp is the use of the square root of minus one, the number usually denoted by the symbol i. Within the real numbers, negative numbers cannot have square roots, since no number multiplied by itself yields a negative result. However, it turns out that believing that these imaginary numbers do exist leads to a beautiful and subtle mathematical theory, the theory of complex numbers. This theory has many practical applications, across mathematics, physics and engineering. One application, known for about a century, is the theory of alternating current in electricity; blogging – among much else of modern life – would perhaps be impossible, or at least very different, without this belief in imaginary entities underpinning the theory of electricity.
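The practical payoff of this "belief" can be sketched with Python's built-in complex numbers. The circuit below is an illustrative assumption of my own, not something from the argument above: a series RLC circuit, whose behaviour under alternating current is computed by routing real quantities through an "imaginary" intermediate.

```python
import math

# The imaginary unit: Python writes i as 1j.
# "Believing" in the square root of minus one:
assert (1j) ** 2 == -1

# In AC circuit theory, resistance and reactance combine into a single
# complex impedance Z = R + j(wL - 1/(wC)) for a series RLC circuit.
# Component values here are illustrative assumptions.
R = 100.0              # resistance, ohms
L = 0.5                # inductance, henries
C = 10e-6              # capacitance, farads
w = 2 * math.pi * 50   # angular frequency for 50 Hz mains, rad/s

Z = complex(R, w * L - 1 / (w * C))

# The imaginary intermediate yields real, measurable predictions:
magnitude = abs(Z)                   # how strongly the circuit impedes current
phase = math.atan2(Z.imag, Z.real)  # phase shift between voltage and current
print(f"|Z| = {magnitude:.1f} ohms, phase = {math.degrees(phase):.1f} degrees")
```

The answers an engineer cares about (the magnitude and phase at the end) are ordinary real numbers; the "imaginary" entity appears only in the middle of the calculation, which is precisely why one can use it without believing it names anything real.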
And, as I have argued before (eg, here and here), effective business strategy development and planning under uncertainty require holding multiple incoherent beliefs about the world simultaneously. The scenarios created by scenario planners are examples of such mutually inconsistent beliefs about the world. Most people – and most companies – find it difficult to maintain, and act upon, mutually inconsistent beliefs. For that reason, Shell, the company that pioneered the use of scenario planning, has always tried to ensure that probabilities are never assigned to scenarios, because managers tend to give greater credence, and hence more attention, to scenarios with higher probabilities. The utilitarian value of scenario planning is greatest when planners seriously consider the consequences of low-likelihood, high-impact scenarios (as Shell found after the OPEC oil-price shock of 1973), not the scenarios they think most probable. To do this well, planners need to believe statements that they judge to be false, or at least act as if they believe them.
Here and here I discuss another example, taken from espionage history.