Martin Cox is the Director of the John Locke Institute.
Facebook decided to ban one of our ads this week. They didn't tell us why.
The offending ad was an invitation to visit our new blog to read an article by Daniel Hannan on how the pandemic will affect our society in the long run. One of the three or four most influential Brexit campaigners, Lord Hannan is not universally admired. And there is plenty to take issue with in the gloomy prognosis of his article. But neither of these provides a reason for Facebook to ban it.
We launched our blog exactly seven days ago, and I reflected that I disagree with about half of the first few articles we posted. Ethan Christian Tan, a soldier in Singapore, writes in support of Ethical Intuitionism, which I think is a fundamental error, though his article was judged the best of nearly three thousand entries in our global essay competition last year. Nayah Victoria Thu wrote in praise of social cohesion, a concept of which I am deeply suspicious: it is too often used as cover for imposing cultural conformity, and I prefer the peaceful tolerance of differences to attempts to promote cohesion. Lord Hannan himself is, I think, too pessimistic about the fragility of the liberal order, and he was too optimistic about the kind of post-Brexit politics he expected to see; early indications certainly seem to be against him. Hannan is happy to call himself a nationalist (in that he prizes Westphalian sovereignty); I'm just about the least nationalist person I know.
"The key to generous listening is to attend to the best arguments of the other side."
So why would I publish people with whom I disagree?
When the Brexit referendum result was announced, an Oxford friend of mine posted, 'If you voted for Brexit, we're not friends anymore.' I reflected that, among my own Facebook friends, I had a pretty even split of Leavers and Remainers; nothing about the referendum made me want to cut off relations with a thousand people. When Dr Stephen Davies wrote a piece for the John Locke Institute about immigration restrictions, a long-time friend wrote that he would pass up the chance to read an article by a person who supported certain policies he deeply disliked. (They have since made up, I'm happy to report.)
On the other side, I was proud that Louie, a friend and alumnus, encouraged people to read our blog, saying, 'I don't agree with many things on it, but it's all worth reading.' Sophie, our intern who curates the blog, told me that every week she forces herself to read articles from a wide range of perspectives, some of which she finds uncongenial to the point of discomfort.
There are three reasons to publish people who I think are wrong.
To call any proposition certain, while there is any one who would deny its certainty if permitted, but who is not permitted, is to assume that we ourselves, and those who agree with us, are the judges of certainty, and judges without hearing the other side.
– John Stuart Mill
First, I want to model intellectual humility for my students. The key to generous listening is to attend to the best arguments of the other side, to expect to learn something from them, to hope - occasionally, at least - to find a reason to change your mind. Attribute the best possible motives to your adversaries. 'It seems to me that the worst mistake a fighter for our ideals can make,' wrote Friedrich von Hayek, 'is to ascribe to our opponents dishonest or immoral aims.'
Second, how will I learn anything if I surrender to the temptation to seek out only those opinions with which I already agree? Being forced to change one's mind, by better evidence or stronger arguments, is intellectually bracing, like a splash of cold water to the face. By instinct we try everything we can to resist it, but in retrospect we see it is a great gift. Changing your mind from time to time is not evidence that you are not strong-minded. Just the opposite: it is evidence that your mind is alive, not an intellectual corpse.
Third, I may be wrong. Even if I'm almost certain that any given belief is correct, if I hold enough opinions - and I'm sure I do - it is almost a mathematical certainty that I'm wrong about some of them. (If I held a thousand beliefs and were 99 per cent confident of each, the chance that every one of them was true would be less than one in twenty thousand.) How do I know that this particular belief is not one of them?