iPhobia

Commuting days until retirement: 238

If you have ever spoken at any length to someone who is suffering from a diagnosed mental illness − depression, say, or obsessive compulsive disorder − you may have come to feel that what they are experiencing differs only in degree from your own mental life, rather than being something fundamentally different (assuming, of course, that you are lucky enough not to have been similarly ill yourself). It’s as if mental illness, for the most part, is not something entirely alien to the ‘normal’ life of the mind, but just a distortion of it. Rather than the presence of a new unwelcome intruder, it’s more that the familiar elements of mental functioning have lost their usual proportion to one another. If you spoke to someone who was suffering from paranoid feelings of persecution, you might just feel an echo of them in the back of your own mind: those faint impulses that are immediately squashed by your ability to draw logical, common-sense conclusions from what you see about you. Or perhaps you might encounter someone who compulsively and repeatedly checks that they are safe from intrusion; but we all sometimes experience that need to reassure ourselves that a door is locked, when we know perfectly well that it really is.

That uncomfortably close affinity between true mental illness and everyday neurotic tics is nowhere more obvious than with phobias. A phobia serious enough to be clinically significant can make it impossible for the sufferer to cope with everyday situations; while on the other hand nearly every family has a member (usually female, but not always) who can’t go near the bath with a spider in it, as well as a member (usually male, but not always) who nonchalantly picks the creature up and ejects it from the house. (I remember that my own parents went against these sexual stereotypes.) But the phobias I want to focus on here are those two familiar opposites − claustrophobia and agoraphobia.

We are all phobics

To some degree, virtually all of us suffer from them, and perfectly rationally so. Anyone would fear, say, being buried alive, or, at the other extreme, being launched into some limitless space without hand or foothold, or any point of reference. And between the extremes, most of us have some degree of bias one way or the other. Especially so − and this is the central point of my post − in an intellectual sense. I want to suggest that there is such a phenomenon as an intellectual phobia: let’s call it an iphobia. My meaning is not, as the Urban Dictionary would have it, an extreme hatred of Apple products, or a morbid fear of breaking your iPhone. Rather, I want to suggest that there are two species of thinkers: iagoraphobes and iclaustrophobes, if you’ll allow me such ugly words.

A typical iagoraphobe will in most cases cleave to scientific orthodoxy. Not for her the wide open spaces of uncontrolled, rudderless, speculative thinking. She’s reassured by a rigid theoretical framework, comforted by predictability; any unexplained phenomenon demands to be brought into the fold of existing theory, for any other way, it seems to her, lies madness. For the iclaustrophobe, on the other hand, it’s intolerable to be caged inside that inflexible framework. Telepathy? Precognition? Significant coincidence? Of course they exist; there is ample anecdotal evidence. If scientific orthodoxy can’t embrace them, then so much the worse for it − the incompatibility merely reflects our ignorance. To this the iagoraphobe would retort that we have no logical grounds whatever for such beliefs. If we have nothing but anecdotal evidence, we have no predictability; and phenomena that can’t be predicted can’t be falsified, so any such beliefs fall foul of the Popperian criterion of scientific validity. But why, asks the iclaustrophobe, do we have to be constrained by some arbitrary set of rules? These things are out there − they happen. Deal with it. And so the debate goes.

Archetypal iPhobics

Widening the arena more than somewhat, perhaps the archetypal iclaustrophobe was Plato. For him, the notion that what we see was all we would ever get was anathema − and he eloquently expressed his iclaustrophobic response to it in his parable of the cave. True reality, in his eyes, was immeasurably greater than the world of our everyday existence. And of course he is often contrasted with his pupil Aristotle, for whom what we can see is, in itself, an inexhaustibly fascinating guide to the nature of our world − no further reality need be posited. Aristotle, of course, is also the progenitor of the syllogism and of deductive logic. In Raphael’s famous fresco The School of Athens, the relevant detail of which you see below, Plato, on the left, indicates his world of forms beyond our immediate reality by pointing heavenward, while Aristotle’s gesture emphasises the earth, and the here and now. Raphael has them exchanging disputatious glances, which for me express the hostility that exists between the opposed iphobic world-views to this day.


Detail from Raphael’s School of Athens in the Vatican, Rome (Wikimedia Commons)

iPhobia today

It’s not surprising that there is such hostility; I want to suggest that we are talking not of a mere intellectual disagreement, but of a situation where each side insists on a reality to which the other has a strong (i)phobic reaction. Let’s look at a specific present-day example, from within the WordPress community. There’s a blog called Why Evolution is True, which I’d recommend as a good read. It’s written by Jerry Coyne, a distinguished American professor of biology. His title is obviously aimed principally at the flourishing belief in creationism which exists in the US − Coyne has extensively criticised the so-called Intelligent Design theory. (In my view, that controversy is not a dispute between the two iphobias I have described, but between two forms of iagoraphobia. The creationists, I would contend, are locked up in an intellectual ghetto of their own making, since venturing outside it would fatally threaten their grip on their frenziedly held, narrowly based faith.)


Jerry Coyne (Zooterkin/Wikimedia Commons)

But I want to focus on another issue highlighted in the blog, which in this case is a conflict between the two phobias. A year or so ago Coyne took issue with the fact that the maverick scientist Rupert Sheldrake was given a platform to explain his ideas in the TED forum. Note Coyne’s use of the hate word ‘woo’, often used by the orthodox in science as an insulting reference to the unorthodox. They would defend it, mostly with justification, as characterising what is mystical or wildly speculative, and without evidential basis − but I’d claim there’s more to it than that: it’s also the iagoraphobe’s cry of revulsion.


Rupert Sheldrake (Zereshk/Wikimedia Commons)

Coyne has strongly attacked Sheldrake on more than one occasion: is there anything that can be said in Sheldrake’s defence? As a scientist he has an impeccable pedigree, with a Cambridge doctorate and fellowship in biology. It seems that he developed his unorthodox ideas early in his career, central among which is his notion of ‘morphic resonance’, whereby animal and human behaviour, and much else besides, is influenced by previous similar behaviour. It’s an idea that I’ve always found interesting to speculate about − but it’s obviously also a red rag to the iagoraphobic bull. We can also mention that he has been careful to describe how his theories can be experimentally confirmed or falsified, thus claiming scientific status for them. He also invokes his ideas to explain aspects of the formation of organisms that, to date, haven’t been explained by the action of DNA. But increasing knowledge of the significance of what was formerly thought of as ‘junk DNA’ is going a long way towards filling these explanatory gaps, so Sheldrake’s position looks particularly weak here. And in his TED talks he not only defends his own ideas, but attacks many of the accepted tenets of current scientific theory.

However, I’d like to return to the debate over whether Sheldrake should be denied his TED platform. Coyne’s comments led the TED editors to reconsider the matter, and they opened a public forum for discussion. The ultimate, not unreasonable, decision was to keep the talks available, but separately from the mainstream content. Coyne said he was surprised by the level of invective arising from the discussion; but I’d say this is because we have here a direct confrontation between iclaustrophobes and iagoraphobes − not merely a polite debate, but a forum where each side taunts the other with notions to which its opponents have a visceral revulsion. And it has always been so; for me the iphobia concept explains the rampant hostility which always characterises debates of this type − as if the participants are facing not merely opposed ideas, but opposing visions, each of which evokes in the other side a deeply rooted fear.

I should say at this point that I don’t claim any godlike objectivity in this matter; I’m happy to come out of the closet as an iclaustrophobe myself. This doesn’t mean in my case that I take on board any amount of New Age mumbo-jumbo; I try to exercise rational scepticism where it’s called for. But as an example, let’s go back to Sheldrake: he’s written a book about the observation that housebound dogs sometimes appear to show marked excitement at the moment their distant owner sets off to return home, although there’s no way they could have knowledge of the owner’s actions at that moment. I have no idea whether there’s anything in this − but the fact is that if it were shown to be true, nothing would give me greater pleasure. I love mystery and inexplicable facts, and for me they make the world a more intriguing and stimulating place. But of course Coyne isn’t the only commentator who has dismissed the theory out of hand as intolerable woo. I don’t expect this matter to be settled in the foreseeable future, if only because it would be career suicide for any mainstream scientist to investigate it.

Science and iPhobia

Why should such a course of action be so damaging to an investigator? Let’s start by putting the argument that it is desirable for such research to be eschewed by the mainstream. The success of the scientific enterprise is largely due to the rigorous methodology it has developed; progress has resulted from successive, well-founded steps of theorising and experimental testing. If scientists were to spend their time investigating every wild theory that was proposed, their efforts would become undirected and diffuse, and progress would stall. I can see the sense in this, and any self-respecting iagoraphobe would endorse it. But against this, we can argue that progress in science often results from bold, unexpected ideas that come out of the blue (some examples in a moment). While the more restrictive outlook lends coherence to the scientific agenda, it can, just occasionally, exclude valuable insights. To explain why the restrictive approach holds sway, I would look at how a person’s psychological make-up might influence their career choice. Most iagoraphobes are likely to be attracted to the logical, internally consistent framework they would be working with in a scientific career, while those of an iclaustrophobic profile might be drawn in an artistic direction. Hence science’s inbuilt resistance to out-of-the-blue ideas.


Albert Einstein (Wikimedia Commons)

I may come from the iclaustrophobe camp, but I don’t want to claim that only people of that profile are responsible for great scientific innovations. Take Einstein: he may have had an early fantasy of riding on a light beam, but it was one which led him, through rigorous mathematical steps, to a vastly coherent and revolutionary conception. His essential iagoraphobia can be seen in his revulsion at the notion of quantum indeterminacy − his ‘God does not play dice’. Relativity, despite being wholly novel in its time, is often spoken of as a ‘classical’ theory, in the sense that it retains the mathematical precision and predictability of the Newtonian schema which preceded it.


Niels Bohr (Wikimedia Commons)

There was a long-standing debate between Einstein and Niels Bohr, the progenitor of the so-called Copenhagen interpretation of quantum theory, which held that different sub-atomic scenarios coexisted in ‘superposition’ until an observation was made and the wave function collapsed. Bohr, it seems to me, with his willingness to entertain wildly counter-intuitive ideas, was a good example of an iclaustrophobe; so it’s hardly surprising that the debate between him and Einstein was so irreconcilable − although it’s to the credit of both that their mutual respect never faltered.

Over to you

Are you an iclaustrophobe or an iagoraphobe? A Plato or an Aristotle? A Sheldrake or a Coyne? A Bohr or an Einstein? Or perhaps not particularly either? I’d welcome comments from either side, or neither.

Science and Emotion

Commuting days until retirement: 507

Following my comments three posts ago, my reading on the train over the last week has been The Heretics: Adventures with the Enemies of Science, by Will Storr. Despite booming bishops and other distractions, I found it intensely readable, and I think it pretty much fulfilled my expectations.

The subtitle about ‘the Enemies of Science’ is perhaps a little misleading: this is not primarily an exercise in bashing such people – you’ll find plenty of other books that do that, and any number of internet forums (and of course I had a go at it myself, in the post I mentioned). It’s an attempt to understand them, and to investigate why they believe what they do. Storr does not treat the enemies of science as necessarily his personal enemies, and it emerges at the same time that the friends of science are not always particularly – well, friendly.

I was going to say that it’s a strength of the book that he maintains this approach without compromising his own adherence to rationality, but that’s not strictly true – because another strength is that he doesn’t attempt to adopt a wholly godlike, objective view. Rather, he presents himself, warts and all, as a creature like the rest of us, who has had to fight his own emotional demons in order to arrive at some sort of objectivity. And he does admit to experiencing a kind of bewitchment when talking to people with far-out beliefs. ‘They are magic-makers. And, beneath all that a private undercurrent: I feel a kind of kinship with them. I am drawn to the wrong.’

It’s a subtle approach, and a difficult path to tread, which invites misunderstanding. And one critic who I believe misunderstands is Mark Henderson in the Guardian, who, while admiring aspects of Storr’s work, finds the book ‘disappointing and infuriating…. He is like the child who still wants to believe in Father Christmas, but who is just old enough to know better. Life would be more magical, more fun, if the story were true.’ Well, here I think Henderson is unwittingly stating the central thesis of Storr’s book: that as humans we are above all myth-makers – we have a need to see ourselves as the hero of our own preferred narrative.

This idea appeals to me in particular because it chimes with ideas I have come across in an evening course I am currently following in the philosophy of emotion. Writers on emotion quite naturally classify emotions into positive (happiness, love, hope, excitement) and negative (sadness, hatred, anger, fear, and so on). The naive assumption is that we welcome the former and avoid the latter if we can. But of course the reality is far more nuanced, and more interesting, than that. In the first place there is the urge many feel to take part in dangerous and frightening pursuits, and then of course the undoubted delights of sado-masochism. But much closer to home is the fact that we flock to horror films and emotionally harrowing dramas – we love to be vicariously frightened or distressed. Narrative is our stock in trade, and (as the increasingly popular creative writing courses preach) unless there’s plenty of conflict and resolution, nobody will be interested.

Mythology

We all have our own unspoken narratives about our own place in the world, and in most cases these probably cast us in a more heroic light than the accounts others might give of us. They help to maintain our emotional equilibrium, and in cases where they are nullified by external circumstances, depression, despair and even suicide may result. And of course with the internet, bloggers like me can start to inflict our own narratives on a bigger potential audience than ever (I wish). Earlier theories of the world are entirely myth- and narrative-laden, from ancient Greek culture to the Bible and the Koran. Our cultural heritage fits around our instinctive nature. (As I tap this passage into my phone on the train, the man immediately next to me is engrossed in a Bible.) How difficult, then, for us to depart from our myths, and embrace a new, evidence-based, and no longer human-centred, story of creation.

Storr encounters one of those for whom this difficulty is too great. John Mackay is a genial, avuncular Australian (there’s plenty of footage on YouTube) who has been proselytising worldwide on behalf of the creationist story for some years. In the gospel according to Mackay, everything that seems evil about nature stems from the fall of Adam and Eve in the Garden of Eden. Before this there were no thorns on plants, men lived happily with dinosaurs and nobody ever got eaten – all animals were vegetarians. A favourite of mine among his pronouncements comes when Storr asks him why, if Tyrannosaurus rex was a vegetarian, it had such big, sharp teeth. The answer (of course) is – watermelons.

On YouTube I have just watched Mackay demonstrating from fossil evidence that there are plants and animals which haven’t evolved at all. This is a fundamental misunderstanding of Darwin: if organisms aren’t required by external forces to adapt, they won’t. But on Mackay’s time scale (the Earth is, of course, six thousand years old) there wouldn’t have been enough time for fossils to form, let alone for anything to evolve very much. The confusion here is manifold. For his part, Storr admits to having started out knowing little about evolutionary theory or the descent of man, and to having taken the scientific account on trust, as indeed most of us do. But his later discussions with a mainstream scientist demonstrate to him how incomparably more elegant and cogent the accepted scientific narrative is.

How objective can we be?

Henderson charges Storr with not giving sufficient recognition to the importance of the scientific method, and to how it has developed as a defence of objective knowledge against humanity’s partial and capricious tendencies. But Storr seems to me to be well aware of this, and alert to those investigators whose partiality throws doubt on their conclusions. ‘Confirmation bias’ is a phrase that runs through the book: people tend to notice evidence which supports a belief they are loyal to, and neglect anything that throws doubt on it. A great example comes from a passage in the book where he joins a tour of a former Nazi concentration camp with David Irving, the extreme right-wing historian who has devoted his life to a curious crusade to minimise the significance of the Holocaust, and exculpate Hitler. Storr is good on the man’s bizarre and contradictory character, as well as the motley group of admirers touring with him. At one point Irving is seeking to deny the authenticity of a gas chamber they are viewing, and triumphantly points out a handle on the inside of the open door. He doesn’t bother to look at the other side of the door, but Storr does afterwards, and discovers a very sturdy bolt. You are led to imagine the effect of a modus operandi like this on the countless years of painstaking research that Irving has pursued.

But should we assume that the model of disinterested, peer-reviewed academic research we have developed has broken free of our myth-making tendencies? Those who are the most determined to support it are of course themselves human beings, with their own hero narratives. Storr attends a convention of ‘Skeptics’ (he uses the American spelling, as they themselves do) where such things as creationism and belief in psychic phenomena are held up to ridicule. He brings out well the slightly obsessional, anorak-ish atmosphere of the event. It does, after all, seem a little perverse to devote very much time to debunking the beliefs of others, rather than developing one’s own. It’s as if the hero narrative ‘I don’t believe in mumbo-jumbo’ is being exploited for purposes of mutual and self-congratulation. The man who is effectively Skeptic-in-Chief, the American former magician James Randi, is later interviewed by Storr, and comes across as arrogant and overbearing, admitting to sometimes departing from truthfulness in pursuit of his aims.

If scientists, being human, are not free of personal mythology, could this work against the objectivity of the enterprise? I think it can, and has. Some examples come to mind. The first is Ignaz Semmelweis, a Hungarian physician of the early-to-mid 19th century. In the days before Pasteur and the germ theory of infection, he was concerned by the number of maternal deaths in his hospital from what was called ‘puerperal fever’. This seemed to be worse in births attended by doctors, rather than midwives. In a series of well-executed investigations, he linked this to doctors who had come to the maternity ward after performing post-mortems, and further established that hand-washing reduced the incidence of the disease. But the notion that doctors themselves were the cause did not meet with approval: an obvious clash with the hero narrative. Semmelweis’s findings were contemptuously rejected, and he later suffered a breakdown and died in an asylum. A similar example is the English Victorian physician John Snow, who in a famous investigation into cholera in Soho conclusively showed it to be spread via a water pump, contradicting the accepted ‘miasma’ theory of airborne infection. He further linked it to pollution of the water supply by sewage – but something so distasteful was a conclusion too far for the Victorian narratives of human pride and decency, and the miasma theory continued to hold sway.

Both these examples of course come from medicine, where conclusive and repeatable results are harder to come by, and easier to contest. So let’s go to the other extreme – mathematics. You would think that a mathematical theorem would be incontrovertible, or at least incapable of offending anyone’s personal sensibilities. But around the turn of the 20th century Georg Cantor’s work on set theory led him to results concerning the nature of infinity. The consequent attacks on him by some other mathematicians – often of the most ad hominem kind, calling him a charlatan and worse – showed that someone’s personal myths were threatened. Was it their religious beliefs, or the foundations of mathematics on which their reputations depended? I don’t know – but Cantor’s discoveries are nowadays part of mainstream mathematics.

Modern heresy

My examples are from the past, of course: I wanted to look at investigators who were derided in their time, but whose discoveries have since been vindicated. If there are any such people around today, their vindication lies in the future. And there is no shortage of heterodox doctrines, as Storr shows. Are any of them remotely plausible? One interesting case is that of Rupert Sheldrake, to whom Storr gives some space. He has an unimpeachable background of education in biology and is a former Cambridge fellow. But his theory of morphic fields – mysterious intangible influences on biological processes – puts him beyond the pale as far as most mainstream scientists are concerned. Sheldrake, however, is adamant that his theory makes testable predictions, and he claims to have verified some of these using approved, objective methods. Some of them concern phenomena known to popular folklore: the ability to sense when you are being stared at, and animals that correctly anticipate their absent owners’ return home. I can remember playing games with the former at school – and it seemed to work. And I have read Sheldrake’s book on the latter, in which he is quite convincing.

But I have no idea whether these ideas are truly valid. Storr tells of a few cases where regular scientists have been prepared to try to repeat Sheldrake’s results with these phenomena, but most attempts degenerate into arcane wrangling over the details of experimental method, and no clear conclusions emerge. What is clear to me is that most orthodox scientists will not even consider such matters publicly, since doing so is seen as career suicide. Is this lack of open-mindedness also, therefore, a lack of objectivity? Lewis Wolpert is quoted in Storr’s book: ‘An open mind is a very bad thing – everything falls out’, a jibe repeated by Henderson. You could retort that the trouble with a closed mind is that nothing gets in. There is a difficult balance to find here: of course a successful scientific establishment must be on its guard against destructive incursions by gullibility and nonsense. On the other hand, as we have seen, this vigilance becomes part of the hero narrative of its practitioners, and may be guarded so jealously that it becomes in some cases an obstacle to advances.

Sheldrake tells Storr that his theories in no way destroy or undermine established knowledge, but add to it. I think this is a little disingenuous of him: if we have missed something so fundamental, it would imply that there is something fundamentally wrong with our world-view. Well, of course it would be arrogant to deny that there is anything at all wrong with our world-view (and I think there is plenty – something to return to in a later post). But Storr’s critic Henderson is surely right in holding that, in the context of a systematically developed body of knowledge, there is a greater burden of proof on the proponent of an unorthodox belief than on its opponent. Nevertheless, I agree with Storr that the freedom to promote heterodox positions is essential, even if most of them are inevitably barmy. It’s not just that, as Storr asserts near the end of the book, ‘wrongness is a human right’. Occasionally – just very occasionally – it is right.