The Mathematician and the Surgeon

Commuting days until retirement: 108

After my last post, which, among other things, compared differing attitudes to death and its aftermath (or absence of one) on the part of Arthur Koestler and George Orwell, here’s another fruitful comparison. It seemed to arise by chance from my next two commuting books, and each of the two people I’m comparing, as before, has his own characteristic perspective on that matter. Unlike my previous pair, both could loosely be called scientists, and in each case the attitude expressed has a specific and revealing relationship with the writer’s work and interests.

The Mathematician

The first writer, whose book I came across by chance, is known chiefly for mathematical puzzles and games. Martin Gardner was born in Oklahoma, USA, in 1914; his father was an oil geologist, and it was a conventionally Christian household. Although not trained as a mathematician – he went instead into a career as a journalist and writer – Gardner developed a fascination with mathematical problems and puzzles which informed his career, hence the justification for his half of my title.

Gardner as a young man (Wikimedia)

This interest continued to feed the constant books and articles he wrote, and he was eventually asked to write the Scientific American column Mathematical Games, which ran from 1956 until the mid 1980s, and for which he became best known; his enthusiasm and sense of fun shine through the writing of these columns. At the same time he was increasingly concerned with the many types of fringe belief that had no scientific foundation, and was a founder member of CSICOP, the organisation dedicated to the exposing and debunking of pseudoscience. Back in February last year I mentioned one of its other well-known members, the flamboyant and self-publicising James Randi. By contrast, Gardner was mild-mannered and shy, averse to public speaking and never courting publicity. He died in 2010, leaving behind him many admirers and a two-yearly convention – the ‘Gathering for Gardner’.

Before learning more about him recently, and reading one of his books, I had known his name from the Mathematical Games column, and heard of his rigid rejection of things unscientific. I imagined some sort of flinty atheist, probably with a hard-nosed contempt for any fanciful or imaginative leanings – however sane and unexceptionable they might be – towards what might be thought of as things of the soul.

How wrong I was. His book that I’ve recently read, The Whys of a Philosophical Scrivener, consists of a series of chapters with titles of the form ‘Why I am not a…’ and he starts by dismissing solipsism (who wouldn’t?) and various forms of relativism; it’s a little more unexpected that determinism also gets short shrift. But in fact by this stage he has already declared that

I myself am a theist (as some readers may be surprised to learn).

I was surprised, and also intrigued. Things were going in an interesting direction. But before getting to the meat of his theism he spends a good deal of time dealing with various political and economic creeds. The book was written in the mid 80s, not long before the collapse of communism, which he seems to be anticipating (Why I am not a Marxist). But equally he has little time for Reagan or Thatcher, laying bare the vacuity of their over-simplistic political nostrums (Why I am not a Smithian).

Soon after this, however, he is striding into the longer grass of religious belief: Why I am not a Polytheist; Why I am not a Pantheist – so what is he? The next chapter heading is a significant one: Why I do not Believe the Existence of God can be Demonstrated. This is the key, it seems to me, to Gardner’s attitude – one to which I find myself sympathetic. Near the beginning of the book we find:

My own view is that emotions are the only grounds for metaphysical leaps.

I was intrigued by the appearance of the emotions in this context: here is a man whose day job is bound up with his fascination for the powers of reason, but who is nevertheless acutely conscious of the limits of reason. He refers to himself as a ‘fideist’ – one who believes in a god purely on the basis of faith, rather than any form of demonstration, either empirical or through abstract logic. And if those won’t provide a basis for faith, what else is there but our feelings? This puts Gardner nicely at odds with the modish atheists of today, like Dawkins, who never tires of telling us that he too could believe if only the evidence were there.

But at the same time he is squarely in a religious tradition which holds that ultimate things are beyond the instruments of observation and logic that are so vital to the secular, scientific world of today. I can remember my own mother – unlike Gardner a conventional Christian believer – being very definite on that point. And it reminds me of some of the writings of Wittgenstein; Gardner does in fact refer to him, in the context of the free will question. I’ll let him explain:

A famous section at the close of Ludwig Wittgenstein’s Tractatus Logico-Philosophicus asserts that when an answer cannot be put into words, neither can the question; that if a question can be framed at all, it is possible to answer it; and that what we cannot speak about we should consign to silence. The thesis of this chapter, although extremely simple and therefore annoying to most contemporary thinkers, is that the free-will problem cannot be solved because we do not know exactly how to put the question.

This mirrors some of my own thoughts about that particular philosophical problem – a far more slippery one than those on either side of it often claim, in my opinion (I think that may be a topic for a future post). I can add that Gardner was also on the unfashionable side of the question which came up in my previous post – that of an afterlife; and again he holds this out as a matter of faith rather than reason. He explores the philosophy of personal identity and continuity in some detail, always concluding with the sentiment ‘I do not know. Do not ask me.’ His underlying instinct seems to be that there has to be something more than our bodily existence, given that our inner lives are so inexplicable from the objective point of view – so much more than our physical existence. ‘By faith, I hope and believe that you and I will not disappear for ever when we die.’ By contrast, Arthur Koestler, you may remember, wrote in his suicide note of ‘timid hopes for a depersonalised afterlife’ – but, as it turned out, these hopes were based partly on the sort of parapsychological evidence which was anathema to Gardner.

And of course Gardner was acutely aware of another related mystery – that of consciousness, which he finds inseparable from the issue of free will:

For me, free will and consciousness are two names for the same thing. I cannot conceive of myself being self-aware without having some degree of free will… Nor can I imagine myself having free will without being conscious.

He expresses utter dissatisfaction with the approach of arch-physicalists such as Daniel Dennett, who, as he says, ‘explains consciousness by denying that it exists’. (I attempted to puncture this particular balloon in an earlier post.)

Gardner in later life (Konrad Jacobs / Wikimedia)

Gardner places himself squarely within the ranks of the ‘mysterians’ – a deliberately derisive label applied by their opponents to those thinkers who conclude that these matters are mysteries which are probably beyond our capacity to solve. Among them is Noam Chomsky: Gardner cites a 1983 interview in which the grand old man of linguistics expresses his attitude to the free will problem.

The Surgeon

And so to the surgeon of my title – if you’ve read one of my other blog posts you will already have met him. He’s a neurosurgeon named Henry Marsh, and I wrote a post based on a review of his book Do No Harm. Well, now I’ve read the book, and found it as impressive and moving as the review suggested. Unlike many in his profession, Marsh is a deeply humble man who is disarmingly honest about the emotional impact of the work he does. He is simultaneously compelled towards, and fearful of, the enormous power of the neurosurgeon both to save and to destroy. His narrative swings between tragedy and elation, by way of high farce when he describes some of the more ill-conceived management ‘initiatives’ at his hospital.

A neurosurgical operation (Mainz University Medical Centre)

The interesting point of comparison with Gardner is that Marsh – a man who daily manipulates what we might call physical mind-stuff, the brain itself – is also awed and mystified by its powers:

There are one hundred billion nerve cells in our brains. Does each one have a fragment of consciousness within it? How many nerve cells do we require to be conscious or to feel pain? Or does consciousness and thought reside in the electrochemical impulses that join these billions of cells together? Is a snail aware? Does it feel pain when you crush it underfoot? Nobody knows.

The same sense of mystery and wonder as Gardner’s; but approached from a different perspective:

Neuroscience tells us that it is highly improbable that we have souls, as everything we think and feel is no more or no less than the electrochemical chatter of our nerve cells… Many people deeply resent this view of things, which not only deprives us of life after death but also seems to downgrade thought to mere electrochemistry and reduces us to mere automata, to machines. Such people are profoundly mistaken, since what it really does is upgrade matter into something infinitely mysterious that we do not understand.

Henry Marsh

This of course is the perspective of a practical man – one who is emphatically working at the coal face of neurology, and far more familiar with the actual material of brain tissue than armchair speculators like me. While I was reading his book, although deeply impressed by this man’s humanity and integrity, what irreverently came to mind was a piece of humour once told to me by a director of a small company I used to work for, which was closely connected to the medical industry. It was a sort of handy cut-out-and-keep guide to the different types of medical practitioner:

Surgeons do everything and know nothing. Physicians know everything and do nothing. Psychiatrists know nothing and do nothing. Pathologists know everything and do everything – but the patient’s dead, so it’s too late.

Grossly unfair to all of them, of course, but nonetheless funny, and perhaps containing a certain grain of truth. Marsh, belonging to the first category, perhaps embodies some of the aversion to dry theory that this caricature hints at: what matters to him ultimately, as a surgeon, is the sheer down-to-earth physicality of his work, guided by the gut instincts of his humanity. We hear from him about some members of his profession who seem aloof from the enormity of the dangers it embodies, and seem able to proceed calmly and objectively with what he sees almost as the detachment of the psychopath.

Common ground

What Marsh and Gardner seem to have in common is the instinct that dry, objective reasoning only takes you so far. Both trust the power of their own emotions, and their sense of awe. Both, I feel, are attempting to articulate the same insight, but from widely differing standpoints.

Two passages, one from each book, seem to crystallize both the similarities and differences between the respective approaches of the two men, both of whom seem to me admirably sane and perceptive, if radically divergent in many respects. First Gardner, emphasising in a Wittgensteinian way that describing how things appear to be is perhaps a more useful activity than attempting to pursue ultimate reasons:

There is a road that joins the empirical knowledge of science with the formal knowledge of logic and mathematics. No road connects rational knowledge with the affirmations of the heart. On this point fideists are in complete agreement. It is one of the reasons why a fideist, Christian or otherwise, can admire the writings of logical empiricists more than the writings of philosophers who struggle to defend spurious metaphysical arguments.

And now Marsh – mystified, as we have seen, as to how the brain-stuff he manipulates daily can be the seat of all experience – having a go at reading a little philosophy in the spare time between sessions in the operating theatre:

As a practical brain surgeon I have always found the philosophy of the so-called ‘Mind-Brain Problem’ confusing and ultimately a waste of time. It has never seemed a problem to me, only a source of awe, amazement and profound surprise that my consciousness, my very sense of self, the self which feels as free as air, which was trying to read the book but instead was watching the clouds through the high windows, the self which is now writing these words, is in fact the electrochemical chatter of one hundred billion nerve cells. The author of the book appeared equally amazed by the ‘Mind-Brain Problem’, but as I started to read his list of theories – functionalism, epiphenomenalism, emergent materialism, dualistic interactionism or was it interactionistic dualism? – I quickly drifted off to sleep, waiting for the nurse to come and wake me, telling me it was time to return to the theatre and start operating on the old man’s brain.

I couldn’t help noticing that these two men – one unconventionally religious and the other not religious at all – seem between them to embody those twin traditional pillars of the religious life: faith and works.

On Being Set Free

Commuting days until retirement: 133

The underlying theme of this blog is retirement, and it will be fairly obvious to most of my readers by now – perhaps indeed to all three of you – that I’m looking forward to it. It draws closer; I can almost hear the ‘Happy retirement’ wishes from colleagues – some expressed perhaps through ever-so-slightly gritted teeth as they look forward to many more years in harness, while I am put out to graze. But of course there’s another side to that: they will also be keeping silent about the thought that being put out to graze also carries with it the not too distant prospect of the knacker’s yard – something they rarely think about in relation to themselves.

Because in fact the people I work with are generally a lot younger than I am – in a few cases younger than my children. No one in my part of the business has ever actually retired, as opposed to leaving for another job. My feeling is that to stand up and announce that I am going to retire will be to introduce something alien and faintly distasteful into the prevailing culture, like telling everyone about your arthritis at a 21st birthday party.

The revolving telescope

For most of my colleagues, retirement, like death, is something that happens to other people. In my experience, it’s around the mid-to-late 20s that such matters first impinge on the consciousness – indistinct and out of focus at first, something on the edge of the visual field. It’s no coincidence, I think, that it’s around that same time that one’s perspective on life reverses, and the general sense that you’d like to be older and more in command of things starts to give way to an awareness of vanishing youth. The natural desire for what is out of reach reorientates its outlook, swinging through 180 degrees like a telescope on a revolving stand.

But I find that, having reached the sort of age I am now, it doesn’t do to turn your back on what approaches. It’s now sufficiently close that it is the principal factor defining the shape of the space you have available in which to organise your life, and you do much better not to pretend it isn’t there, but to be realistically aware. We have all known those who nevertheless keep their backs resolutely turned, and they often cut somewhat pathetic figures: a particular example I remember was a man (who would almost certainly be dead by now) who didn’t seem to accept his failing prowess at tennis as an inevitable corollary of age, but rather as a series of inexplicable failures for which he should blame himself. And there are all those celebrities you see with skin stretched ever tighter over their facial bones as they bring in the friendly figure of the plastic surgeon to obscure the view of where they are headed.

Perhaps Ray Kurzweil, who featured in my previous post, is another example, with his 250 supplement tablets each day and his faith in the abilities of technology to provide him with some sort of synthetic afterlife. Given that he has achieved a generous measure of success in his natural life, he perhaps has less need than most of us to seek a further one; but maybe it works the other way, and a well-upholstered ego is more likely to feel a continued existence as its right.

Enjoying the view

Happiness is not the preserve of the young (Wikimedia Commons)

But the fact is that for most of us the impending curtailment of our time on earth brings a surprising sense of freedom. With nothing left to strive for – no anxiety about whether this or that ambition will be realised – some sort of summit is achieved. The effort is over, and we can relax and enjoy the view. More than one survey has found that people in their seventies are nowadays collectively happier than any other age group: here are reports of three separate studies between 2011 and 2014, in Psychology Today, The Connexion, and the Daily Mail. Those adverts for pension providers and so on, showing apparently radiant wrinkly couples feeding the ducks with their grandchildren, aren’t quite as wide of the mark as you might think.

Speaking for myself, I’ve never been excessively troubled by feelings of ambition, and have probably enjoyed a relatively stress-free, if perhaps less prosperous, life as a result. And the prospect of an existence where I am no longer even expected to show such aspirations is part of the attraction of retirement. But of course there remain those for whom the fact of extinction gives rise to wholly negative feelings, yet who are at the same time brave enough to face it fair and square, without any psychological or cosmetic props. A prime example in recent literature is Philip Larkin, who seems to make frequent appearances in this blog. While famously afraid of death, he wrote luminously about it. Here, in his poem The Old Fools, he evokes images of the extreme old age which he never, in fact, reached himself:

Philip Larkin (Fay Godwin)

Perhaps being old is having lighted rooms
Inside your head, and people in them, acting.
People you know, yet can’t quite name; each looms
Like a deep loss restored, from known doors turning,
Setting down a lamp, smiling from a stair, extracting
A known book from the shelves; or sometimes only
The rooms themselves, chairs and a fire burning,
The blown bush at the window, or the sun’s
Faint friendliness on the wall some lonely
Rain-ceased midsummer evening.

Dream and reality seem to fuse at this ultimate extremity of conscious experience as Larkin portrays it; and it’s the snuffing out of consciousness that a certain instinct in us finds difficult to take – indeed, to believe in. Larkin, by nature a pessimist, certainly believed in it, and dreaded it. But cultural traditions of many kinds have not accepted extinction as inevitable: we are not obliviously functioning machines but the subjects of experiences like the ones Larkin writes about. As such we have immortal souls which transcend the gross physical world, it has been held – so why should we not survive death? (Indeed, according to some creeds, why should we not have existed before birth?)

Timid hopes

Well, whatever immortal souls might be, I find it difficult to make out a case for individual survival, and this is perhaps the majority view in the secular culture I inhabit. It seems pretty clear to me that my own distinguishing characteristics are indissolubly linked to my physical body: damage to the brain, we know, can change the personality, and perhaps rob us of our memories and past experience, which most quintessentially define us as individuals. But even though our consciousness can be temporarily wiped out by sleep or anaesthetics, there remains the sense (for me, anyway) that, since we have no notion whatever of how we could provide an account of it in physical terms, some aspect of our experience could be independent of our bodily existence.

You may or may not accept both of these beliefs – the temporality of the individual and the transcendence of consciousness. But if you do, then the possibility seems to arise of some kind of disembodied, collective sentience, beyond our normal existence. And this train of thought always reminds me of the writer Arthur Koestler, who died by suicide in 1983 at the age of 77. An outspoken advocate of voluntary euthanasia, he’d been suffering in later life from Parkinson’s disease, and had then contracted a progressive, incurable form of leukaemia. His suicide note (which turned out to have been written several months before his death) included the following passage:

I wish my friends to know that I am leaving their company in a peaceful frame of mind, with some timid hopes for a de-personalised after-life beyond due confines of space, time and matter and beyond the limits of our comprehension. This ‘oceanic feeling’ has often sustained me at difficult moments, and does so now, while I am writing this.

Death sentence

In fact Koestler had, since he was quite young, been more closely acquainted with death than most of us. Born in Hungary, during his earlier career as a journalist and political writer he twice visited Spain during its civil war in the 1930s. He made his first visit as an undercover investigator of the Fascist movement, being himself at that time an enthusiastic supporter of communism. A little later he returned to report from the Republican side, but was in Malaga when it was captured by Fascist troops. By now Franco had come to know of his anti-fascist writing, and he was imprisoned in Seville under sentence of death.

Koestler portrayed on the cover of the book

In his account of this experience, Dialogue with Death, he describes how prisoners would try to block their ears to avoid the nightly sound of a telephone call to the prison, when a list of names would be dictated and the men later led out and shot. His book is illuminating on the psychology of these conditions, and the violent emotional ups and downs he experienced:

One of my magic remedies was a certain quotation from a certain work of Thomas Mann’s; its efficacy never failed. Sometimes, during an attack of fear, I repeated the same verse thirty or forty times, for almost an hour, until a mild state of trance came on and the attack passed. I knew it was the method of the prayer-mill, of the African tom-tom, of the age-old magic of sounds. Yet in spite of my knowing it, it worked…
I had found out that the human spirit is able to call upon certain aids of which, in normal circumstances, it has no knowledge, and the existence of which it only discovers in itself in abnormal circumstances. They act, according to the particular case, either as merciful narcotics or ecstatic stimulants. The technique which I developed under the pressure of the death-sentence consisted in the skilful exploitation of these aids. I knew, by the way, that at the decisive moment when I should have to face the wall, these mental devices would act automatically, without any conscious effort on my part. Thus I had actually no fear of the moment of execution; I only feared the fear which would precede that moment.

That there are emotional ‘ups’ at all seems surprising, but later he expands on one of them:

Often when I wake at night I am homesick for my cell in the death-house in Seville and, strangely enough, I feel that I have never been so free as I was then. This is a very strange feeling indeed. We lived an unusual life on that patio; the constant nearness of death weighed down and at the same time lightened our existence. Most of us were not afraid of death, only of the act of dying; and there were times when we overcame even this fear. At such moments we were free – men without shadows, dismissed from the ranks of the mortal; it was the most complete experience of freedom that can be granted a man.

Perhaps, in a diluted, much less intense form, the happiness of the over 70s revealed by the surveys I mentioned has something in common with this.

Koestler was possibly the only writer of the front rank ever to be held under sentence of death, and the experience informed his novel Darkness at Noon. It is the second in a trilogy of politically themed novels, and its protagonist, Rubashov, has been imprisoned by the authorities of an unnamed totalitarian state which appears to be a very thinly disguised portrayal of Stalinist Russia. Rubashov has been one of the first generation of revolutionaries in a movement which has hardened into an authoritarian despotism, and its leader, referred to only as ‘Number One’, is apparently eliminating rivals. Worn down by the interrogation conducted by a younger, hard-line apparatchik, Rubashov comes to accept that he has somehow criminally acted against ‘the revolution’, and eventually goes meekly to his execution.

Shades of Orwell

By the time of writing the novel, Koestler, like so many intellectuals of that era, had made the journey from an initial enthusiasm for Soviet communism to disillusion with, and opposition to, it. And reading Darkness at Noon, I was of course constantly reminded of Orwell’s Nineteen Eighty-Four, and the capitulation of Winston Smith as he comes to love Big Brother. Darkness at Noon predates 1984 by nine years, and nowadays has been somewhat eclipsed by Orwell’s much better-known novel. The two authors had met briefly during the Spanish civil war, where Orwell was actively involved in fighting against fascism, and they met again and discussed politics around the end of the war. It seems clear that Orwell, having written his own satire on the Russian revolution in Animal Farm, eventually wrote 1984 under the conscious influence of Koestler’s novel. But they were of course very different characters: you get the feeling that to Orwell, with his both-feet-on-the-ground Englishness, Koestler might have seemed a rather flighty and exotic creature.

Orwell (aka Eric Blair) from the photo on his press pass (NUJ/Wikimedia Commons)

In fact, during the period between the publications of Darkness at Noon and 1984, Orwell wrote an essay on Arthur Koestler – probably while he was still at work on Animal Farm. His view of Koestler’s output is mixed: on the one hand he admires Koestler as a prime example of the continental writers on politics whose views have been forged by hard experience in this era of political oppression – as opposed to English commentators who merely strike attitudes towards the turmoil in Europe and the East, while viewing it from a relatively safe distance. Darkness at Noon he regards as a ‘masterpiece’ – its common ground with 1984 is not, it seems, a coincidence. (Orwell’s review of Darkness at Noon in the New Statesman is also available.)

On the other hand he finds much of Koestler’s work unsatisfactory, a mere vehicle for his aspirations towards a better society. Orwell quotes Koestler’s description of himself as a ‘short-term pessimist’, but also detects a utopian undercurrent which he feels is unrealistic. His own views are expressed as something more like long-term pessimism, doubting whether man can ever replace the chaos of the mid-twentieth century with a society that is both stable and benign:

Nothing is in sight except a welter of lies, hatred, cruelty and ignorance, and beyond our present troubles loom vaster ones which are only now entering into the European consciousness. It is quite possible that man’s major problems will NEVER be solved. But it is also unthinkable! Who is there who dares to look at the world of today and say to himself, “It will always be like this: even in a million years it cannot get appreciably better?” So you get the quasi-mystical belief that for the present there is no remedy, all political action is useless, but that somewhere in space and time human life will cease to be the miserable brutish thing it now is. The only easy way out is that of the religious believer, who regards this life merely as a preparation for the next. But few thinking people now believe in life after death, and the number of those who do is probably diminishing.

In death as in life

Orwell’s remarks neatly return me to the topic I have diverged from. If we compare the deaths of the two men, they seem to align with their differing attitudes in life. Both died in the grip of a disease – Orwell succumbing to tuberculosis after his final, gloomy novel was completed, and Koestler escaping his leukaemia by suicide but still expressing ‘timid hopes’.

After the war Koestler had adopted England as his country and henceforth wrote only in English – most of his previous work had been in German. In being allowed a longer life than Orwell to pursue his writing, he had moved on from politics to write widely in philosophy and the history of ideas, although never really becoming a member of the intellectual establishment. These are areas which you feel would always have been outside the range of the more down-to-earth Orwell, who was strongly moral, but severely practical. Orwell goes on to say, in the essay I quoted: ‘The real problem is how to restore the religious attitude while accepting death as final.’ This so much reflects his attitudes – he habitually enjoyed attending Anglican church services, but without being a believer. He continues, epigrammatically:

Men can only be happy when they do not assume that the object of life is happiness. It is most unlikely, however, that Koestler would accept this. There is a well-marked hedonistic strain in his writings, and his failure to find a political position after breaking with Stalinism is a result of this.

Again, we strongly feel the tension between their respective characters: Orwell, with his English caution, and Koestler with his continental adventurism. In fact, Koestler had a reputation as something of an egotist and aggressive womaniser. Even his suicide reflected this: it was a double suicide with his third wife, who was over 20 years younger than he was and in good health. Her accompanying note explained that she couldn’t continue her life without him. Friends confirmed that she had entirely subjected her life to his: but to what extent this was a case of bullying, as some claimed, will never be known.

Of course there was much common ground between the two men: both were always on the political left, and both, as you might expect, were firmly opposed to capital punishment – anyone who needs convincing should read Orwell’s autobiographical essay A Hanging. And Koestler wrote a more prosaic piece – a considered refutation of the arguments for judicial killing – in his book Reflections on Hanging; it was written in the 1950s, when, on Koestler’s own account, some dozen hangings were occurring in Britain each year.

But while Orwell faced his death stoically, Koestler continued his dalliance with the notion of some form of hereafter; you feel that, as with Kurzweil, a well-developed ego did not easily accept the thought of extinction. In writing this post, I discovered that he had been one of a number of intellectual luminaries who contributed to a collection of essays under the title Life after Death, published in the 1970s. Keen to find a more detailed statement of his views, I sought out his piece, but actually found it rather disappointing. First I’ll sketch in a bit of background to clarify where I think he is coming from.

Back in Victorian times there was much interest in evidence of ‘survival’ – seances and table-rapping sessions were popular, and fraudulent mediums were prospering. Reasons for this are not hard to find: traditional religion, while strong, faced challenges. Steam-powered technology was burgeoning, the world increasingly seemed to be a wholly mechanical affair, and Darwinism had arrived to encourage the trend towards materialism. In 1882 the Society for Psychical Research was formed, becoming a focus both for those who were anxious to subvert the materialist world view, and those who wanted to investigate the phenomena objectively and seek intellectual clarity.

But it wasn’t long before the revolution in physics, with relativity and quantum theory, exploded the mechanical certainties of the Victorians. At the same time millions suffered premature deaths in two world wars, giving ample motivation to believe that those lost somehow still existed and could maybe even be contacted.

Koestler in later life (Eric Koch/Wikimedia Commons)

This seems to be the background against which Koestler’s ideas about the possibility of an afterlife had developed. He leans a lot on the philosophical writings of the quantum physicist Erwin Schrodinger, and seeks to base a duality of mind and matter on the wave/particle duality of quantum theory. There’s a lot of talk about psi fields and suchlike – the sort of terminology which was already sounding dated at the time he was writing. The essay seemed to me rather backward-looking, sitting more comfortably with the inchoate fringe beliefs of the mid 20th century than with the confident secularism of Western Europe today.

A rebel to the end

I think Koestler was well aware of the way things were going, but with characteristic truculence reacted against them. He wrote a good deal on topics that clash with mainstream science, such as the significance of coincidence, and in his will left funds to establish a chair of parapsychology, which was set up at Edinburgh University, and still exists.

This was clearly a deliberate attempt to cock a snook at the establishment, and while he was not an attractive character in many ways, I do find this defiant stance makes me warm to him a little. While I am sure I would have found Orwell more decent and congenial to know personally, Koestler is the more intellectually exciting of the two. I think Orwell might have found Koestler’s notion of the sense of freedom when facing death difficult to understand – but this might have changed had he survived into his seventies. And in a general sense I share Koestler’s instinct that in human consciousness there is far more to understand than we have so far been able to, as it were, get our minds around.

Retirement, for me, will certainly bring freedom – not only freedom from the strained atmosphere of worldly ambition and corporate business-speak (itself an Orwellian development) but more of my own time to reflect further on the matters I’ve spoken of here.

iPhobia

Commuting days until retirement: 238

If you have ever spoken at any length to someone who is suffering with a diagnosed mental illness − depression, say, or obsessive compulsive disorder − you may have come to feel that what they are experiencing differs only in degree from your own mental life, rather than being something fundamentally different (assuming, of course, that you are lucky enough not to have been similarly ill yourself). It’s as if mental illness, for the most part, is not something entirely alien to the ‘normal’ life of the mind, but just a distortion of it. Rather than the presence of a new unwelcome intruder, it’s more that the familiar elements of mental functioning have lost their usual proportion to one another. If you spoke to someone who was suffering from paranoid feelings of persecution, you might just feel an echo of them in the back of your own mind: those faint impulses that are immediately squashed by the power of your ability to draw logical common-sense conclusions from what you see about you. Or perhaps you might encounter someone who compulsively and repeatedly checks that they are safe from intrusion; but we all sometimes experience that need to reassure ourselves that a door is locked, when we know perfectly well that it really is.

That uncomfortably close affinity between true mental illness and everyday neurotic tics is nowhere more obvious than with phobias. A phobia serious enough to be clinically significant can make it impossible for the sufferer to cope with everyday situations; while on the other hand nearly every family has a member (usually female, but not always) who can’t go near the bath with a spider in it, as well as a member (usually male, but not always) who nonchalantly picks the creature up and ejects it from the house. (I remember that my own parents went against these sexual stereotypes.) But the phobias I want to focus on here are those two familiar opposites − claustrophobia and agoraphobia.

We are all phobics

In some degree, virtually all of us suffer from them, and perfectly rationally so. Anyone would fear, say, being buried alive, or, at the other extreme, being launched into some limitless space without hand or foothold, or any point of reference. And between the extremes, most of us have some degree of bias one way or the other. Especially so − and this is the central point of my post − in an intellectual sense. I want to suggest that there is such a phenomenon as an intellectual phobia: let’s call it an iphobia. My meaning is not, as the Urban Dictionary would have it, an extreme hatred of Apple products, or a morbid fear of breaking your iPhone. Rather, I want to suggest that there are two species of thinkers: iagorophobes and iclaustrophobes, if you’ll allow me such ugly words.

A typical iagorophobe will in most cases cleave to scientific orthodoxy. Not for her the wide open spaces of uncontrolled, rudderless, speculative thinking. She’s reassured by a rigid theoretical framework, comforted by predictability; any unexplained phenomenon demands to be brought into the fold of existing theory, for any other way, it seems to her, lies madness. But for the iclaustrophobe, on the other hand, it’s intolerable to be caged inside that inflexible framework. Telepathy? Precognition? Significant coincidence? Of course they exist; there is ample anecdotal evidence. If scientific orthodoxy can’t embrace them, then so much the worse for it − the incompatibility merely reflects our ignorance. To this the iagorophobe would retort that we have no logical grounds whatever for such beliefs. If we have nothing but anecdotal evidence, we have no predictability; and phenomena that can’t be predicted can’t therefore be falsified, so any such beliefs fall foul of the Popperian criterion of scientific validity. But why, asks the iclaustrophobe, do we have to be constrained by some arbitrary set of rules? These things are out there − they happen. Deal with it. And so the debate goes.

Archetypal iPhobics

Widening the arena more than somewhat, perhaps the archetypal iclaustrophobe was Plato. For him, the notion that what we see was all we would ever get was anathema – and he eloquently expressed his iclaustrophobic response to it in his parable of the cave. For him true reality was immeasurably greater than the world of our everyday existence. And of course he is often contrasted with his pupil Aristotle, for whom what we can see is, in itself, an inexhaustibly fascinating guide to the nature of our world − no further reality need be posited. And Aristotle, of course, is the progenitor of the syllogism and deductive logic. In Raphael’s famous fresco The School of Athens, the relevant detail of which you see below, Plato, on the left, indicates his world of forms beyond our immediate reality by pointing heavenward, while Aristotle’s gesture emphasises the earth, and the here and now. Raphael has them exchanging disputatious glances, which for me express the hostility that exists between the opposed iphobic world-views to this day.

Detail from Raphael’s School of Athens in the Vatican, Rome (Wikimedia Commons)

iPhobia today

It’s not surprising that there is such hostility; I want to suggest that we are talking not of a mere intellectual disagreement, but of a situation where each side insists on a reality to which the other has a strong (i)phobic reaction. Let’s look at a specific present-day example, from within the WordPress forums. There’s a blog called Why Evolution is True, which I’d recommend as a good read. It’s written by Jerry Coyne, a distinguished American professor of biology. His title is obviously aimed principally at the flourishing belief in creationism which exists in the US − Coyne has extensively criticised the so-called Intelligent Design theory. (In my view, that controversy is not a dispute between the two iphobias I have described, but between two forms of iagorophobia. The creationists, I would contend, are locked up in an intellectual ghetto of their own making, since venturing outside it would fatally threaten their grip on their frenziedly held, narrowly based faith.)

Jerry Coyne (Zooterkin/Wikimedia Commons)

But I want to focus on another issue highlighted in the blog, which in this case is a conflict between the two phobias. A year or so ago Coyne took issue with the fact that the maverick scientist Rupert Sheldrake was given a platform to explain his ideas in the TED forum. Note Coyne’s use of the hate word ‘woo’, often used by the orthodox in science as an insulting reference to the unorthodox. They would defend it, mostly with justification, as characterising what is mystical or wildly speculative, and without evidential basis − but I’d claim there’s more to it than that: it’s also the iagorophobe’s cry of revulsion.

Rupert Sheldrake (Zereshk/Wikimedia Commons)

Coyne has strongly attacked Sheldrake on more than one occasion: is there anything that can be said in Sheldrake’s defence? As a scientist he has an impeccable pedigree, having a Cambridge doctorate and fellowship in biology. It seems that he developed his unorthodox ideas early on in his career, central among which is his notion of ‘morphic resonance’, whereby animal and human behaviour, and much else besides, is influenced by previous similar behaviour. It’s an idea that I’ve always found interesting to speculate about − but it’s obviously also a red rag to the iagorophobic bull. We can also mention that he has been careful to describe how his theories can be experimentally confirmed or falsified, thus claiming scientific status for them. He also invokes his ideas to explain aspects of the formation of organisms that, to date, haven’t been explained by the action of DNA. But increasing knowledge of the significance of what was formerly thought of as ‘junk DNA’ is going a long way to filling these explanatory gaps, so Sheldrake’s position looks particularly weak here. And in his TED talks he not only defends his own ideas, but attacks many of the accepted tenets of current scientific theory.

However, I’d like to return to the debate over whether Sheldrake should be denied his TED platform. Coyne’s comments led to a reconsideration by the TED editors, who opened a public forum for discussion of the matter. The ultimate, not unreasonable, decision was that the talks were kept available, but separately from the mainstream content. Coyne said he was surprised by the level of invective arising from the discussion; but I’d say this is because we have here a direct confrontation between iclaustrophobes and iagorophobes − not merely a polite debate, but a forum where each side taunts the other with notions for which the opponents have a visceral revulsion. And it has always been so; for me the iphobia concept explains the rampant hostility which always characterises debates of this type − as if the participants are not merely facing opposed ideas, but respective visions which invoke in each a deeply rooted fear.

I should say at this point that I don’t claim any godlike objectivity in this matter; I’m happy to come out of the closet as an iclaustrophobe myself. This doesn’t mean in my case that I take on board any amount of New Age mumbo-jumbo; I try to exercise rational scepticism where it’s called for. But as an example, let’s go back to Sheldrake: he’s written a book about the observation that housebound dogs sometimes appear to show marked excitement at the moment that their distant owner sets off to return home, although there’s no way they could have knowledge of the owner’s actions at that moment. I have no idea whether there’s anything in this − but the fact is that if it were shown to be true nothing would give me greater pleasure. I love mystery and inexplicable facts, and for me they make the world a more intriguing and stimulating place. But of course Coyne isn’t the only commentator who has dismissed the theory out of hand as intolerable woo. I don’t expect this matter to be settled in the foreseeable future, if only because it would be career suicide for any mainstream scientist to investigate it.

Science and iPhobia

Why should such a course of action be so damaging to an investigator? Let’s start by putting the argument that it’s a desirable state of affairs that such research should be eschewed by the mainstream. The success of the scientific enterprise is largely due to the rigorous methodology it has developed; progress has resulted from successive, well-founded steps of theorising and experimental testing. If scientists were to spend their time investigating every wild theory that was proposed, their efforts would become undirected and diffuse, and progress would be stalled. I can see the sense in this, and any self-respecting iagorophobe would endorse it. But against this, we can argue that progress in science often results from bold, unexpected ideas that come out of the blue (some examples in a moment). While this more restrictive outlook lends coherence to the scientific agenda, it can, just occasionally, exclude valuable insights. To explain why the restrictive approach holds sway I would look at how a person’s psychological make-up might influence their career choice. Most iagorophobes are likely to be attracted to the logical, internally consistent framework they would be working with as part of a scientific career; while those of an iclaustrophobic profile might be attracted in an artistic direction. Hence science’s inbuilt resistance to out-of-the-blue ideas.

Albert Einstein (Wikimedia Commons)

I may come from the iclaustrophobe camp, but I don’t want to claim that only people of that profile are responsible for great scientific innovations. Take Einstein: he may have had an early fantasy of riding on a light beam, but it was one which led him through rigorous mathematical steps to a vastly coherent and revolutionary conception. His essential iagorophobia is seen in his revulsion from the notion of quantum indeterminacy − his ‘God does not play dice’. Relativity, despite being wholly novel in its time, is often spoken of as a ‘classical’ theory, in the sense that it retains the mathematical precision and predictability of the Newtonian schema which preceded it.

Niels Bohr (Wikimedia Commons)

There was a long-standing debate between Einstein and Niels Bohr, the progenitor of the so-called Copenhagen interpretation of quantum theory, which held that different sub-atomic scenarios coexisted in ‘superposition’ until an observation was made and the wave function collapsed. Bohr, it seems to me, with his willingness to entertain wildly counter-intuitive ideas, was a good example of an iclaustrophobe; so it’s hardly surprising that the debate between him and Einstein was so irreconcilable − although it’s to the credit of both that their mutual respect never faltered.

Over to you

Are you an iclaustrophobe or an iagorophobe? A Plato or an Aristotle? A Sheldrake or a Coyne? A Bohr or an Einstein? Or perhaps not particularly either? I’d welcome comments from either side, or neither.

The Vault of Heaven

Commuting days until retirement: 250

The roof of Exeter Cathedral (Wanner-Laufer, Wikimedia Commons)

Thoughts are sometimes generated out of random conjunctions in time between otherwise unrelated events. Last week we were on holiday in Dorset, and depressing weather for the first couple of days drove us into the nearest city – Exeter, where we visited the cathedral. I had never seen it before and was more struck than I had expected to be. Stone and wood carvings created over the past 600 years decorate thrones, choir stalls and tombs, the latter bearing epitaphs ranging in tone from the stern to the whimsical. All this lies beneath the marvellous fifteenth century vaulted roof – the most extensive known of the period, I learnt. Looking at this, and the cathedral’s astronomical clock dating from the same century, I imagined myself seeing them as a contemporary member of the congregation would have, and tried to share the medieval conception of the universe above that roof, reflected in the dial of the clock.

The Astronomical Clock at Exeter Cathedral (Wikimedia Commons)

The other source of these thoughts was the book I happened to have finished that day: Max Tegmark’s Our Mathematical Universe*. He’s an MIT physics professor who puts forward the view (previously also hinted at in this blog) that reality is at bottom simply a mathematical object. He admits that it’s a minority view, scoffed at by many of his colleagues – but I have long felt a strong affinity for the idea. I have reservations about some aspects of the Tegmark view of reality, but not one of its central planks – the belief that we live in one universe among a host of others. Probably to most people the thought is just a piece of science fiction fantasy – and has certainly been exploited for all it’s worth by fiction authors in recent years. But in fact it is steadily gaining traction among professional scientists and philosophers as a true description of the universe – or rather multiverse, as it’s usually called in this context.

Nowadays there is a whole raft of differing notions of a multiverse, each deriving from separate theoretical considerations. Tegmark combines four different ones in the synthesis he presents in the book. But I think I am right in saying that the first time such an idea appeared in anything like a mainstream scientific context was the PhD thesis of a 1950s student at Princeton in the USA – Hugh Everett.

The thesis appeared in 1957; its purpose was to present an alternative treatment of the quantum phenomenon known as the collapse of the wave function. A combination of theoretical and experimental results had come together to suggest that subatomic particles (or waves – the duality was a central idea here) existed as a cloud of possibilities, until interacted with, or observed. The position of an electron, for example, could be defined with a mathematical function – the wave function of Schrodinger – which assigned only a probability to each putative location. If, however, we were to put this to the test – to measure its location in practice – we would have to do so by means of some interaction, and the answer that came back would be one specific position among the cloud of possibilities. But by carrying out such procedures repeatedly, it was shown that the probability of any specific result was given by the wave function. The approach to these results which became most widely accepted was the so-called ‘Copenhagen interpretation’ of Bohr and others, which held that all the possible locations co-existed in ‘superposition’ until the measurement was made and the wave function ‘collapsed’. Hence some of the more famous statements about the quantum world: Einstein’s dissatisfaction with the idea that ‘God plays dice’; and Schrodinger’s well-known thought experiment, designed to test the Copenhagen interpretation to destruction – the cat which is presumed to be simultaneously dead and alive until its containing box is opened and the result determined.
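
For readers who want the formal statement, the rule being described here is the Born rule – standard textbook quantum mechanics rather than anything peculiar to Everett or Tegmark. A minimal sketch, in LaTeX notation, where \psi is the wave function for the electron’s position:

    P(x) = |\psi(x)|^2, \qquad \int |\psi(x)|^2 \, dx = 1

That is, the squared magnitude of the wave function gives the probability density of finding the electron at position x, and the probabilities across all possible positions sum to one – which is why repeated measurements reproduce the distribution encoded in the wave function.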

Everett proposed that there was no such thing as the collapse of the wave function. Rather, each of the possible outcomes was represented in one real universe; it was as if the universe ‘branched’ into a number of equally real versions, and you, the observer, found yourself in just one of them. Of course, it followed that many copies of you each found themselves in slightly different circumstances, unlike the unfortunate cat, which presumably only experienced those universes in which it lived. Needless to say, although Everett’s ideas were encouraged at the time by a handful of colleagues (Bryce DeWitt, John Wheeler), they were regarded for many years as a scientific curiosity and not taken further. Everett himself moved away from theoretical physics, and involved himself in practical technology, later developing an enjoyment of programming. He smoked and drank heavily and became obese, dying at the age of 51. Tegmark implies that this was at least partly a result of his neglect by the theoretical physics community – but there’s also evidence that his choices of career path and lifestyle derived from his natural inclinations.
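
Everett’s central move can be put schematically – this is my shorthand, not his own notation. Where the Copenhagen account replaces a superposition with a single outcome at the moment of measurement, Everett lets the measurement interaction run on undisturbed, so that the observer simply joins the superposition:

    \Big( \sum_i c_i \, |x_i\rangle \Big) \otimes |\text{observer ready}\rangle \;\longrightarrow\; \sum_i c_i \, |x_i\rangle \otimes |\text{observer sees } x_i\rangle

Each term on the right is one ‘branch’: a version of the observer who has registered one particular outcome x_i. Nothing collapses; there is simply no longer a single observer who can ask which result ‘really’ occurred.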

During the last two decades of the 20th century, however, the multiverse idea began to be taken more seriously, and had some enthusiastic proponents such as the British theorist David Deutsch and indeed Tegmark himself. In his book, Tegmark cites a couple of straw polls he took among theoretical physicists attending talks he gave, in 1997 and again in 2010. In the first case, out of 48 responses, 13 endorse the Copenhagen interpretation, and 8 the multiverse idea. (The remainder are mostly undecided, with a few endorsing alternative approaches.) In 2010 there are 35 respondents, of whom none at all go for Copenhagen, and 16 for the multiverse. (Undecideds remain about the same – 16, down from 18.) This seems to show a decisive rise in support for multiple universes; although I do wonder whether it also reflects which physicists were prepared to attend Tegmark’s talks, his views having become better known by 2010. It so happens that the drop in the respondent numbers – 13 – is the same as the disappearing support for the Copenhagen interpretation.
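
Putting those figures into proportions makes the shift starker (my arithmetic, using the numbers as Tegmark reports them):

    \text{1997:}\quad \text{Copenhagen } \tfrac{13}{48} \approx 27\%, \qquad \text{multiverse } \tfrac{8}{48} \approx 17\%
    \text{2010:}\quad \text{Copenhagen } \tfrac{0}{35} = 0\%, \qquad \text{multiverse } \tfrac{16}{35} \approx 46\%

So even allowing for the smaller 2010 sample, the multiverse moves from minority taste to the most popular definite answer.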

Nevertheless, it’s fair to say that the notion of a multiple universe as a reality has now entered the mainstream of theoretical science in a way that it had not done half a century ago. There’s an argument, I thought as I looked at that cathedral roof, that cosmology has been transformed even more radically in my lifetime than it had been in the preceding 500 years. The skill of the medieval stonemasons as they constructed the multiple rib vaults, and the wonder of the medieval congregation as they marvelled at the completed roof, were consciously directed to the higher vault of heaven that overarched the world of their time. Today those repeated radiating patterns might be seen as a metaphor for the multiple worlds that we are, perhaps, beginning dimly to discern.


*Tegmark, Max, Our Mathematical Universe: My Quest for the Ultimate Nature of Reality. Allen Lane/Penguin, 2014

Accident of Birth

Commuting days until retirement: 390

My commuter train reading in recent weeks has been provided by Hilary Mantel’s two Man Booker Prize-winning historical novels, Wolf Hall and Bring Up the Bodies. If you don’t know, they are the first two of what is promised to be a trilogy covering the life of Thomas Cromwell, who rose to be Henry VIII’s right-hand man. He’s a controversial figure in history: you may have seen Robert Bolt’s play (or the film of it) A Man for All Seasons, where he is portrayed as King Henry’s evil arch-fixer, who engineers the execution of the man of the title, Sir Thomas More. He is also known to have had a big part in the downfall and death of Anne Boleyn.

The unique approach of Mantel’s account is to narrate exclusively from Cromwell’s own point of view. At the opening of the first book he is being violently assaulted by the drunken, irresponsible blacksmith father whom he subsequently escapes, seeking a fortune abroad as a very young man, and living on his very considerable wits. On his return to England, having gained wide experience and the command of several languages, he progresses quickly within the establishment, becoming a close advisor to Cardinal Wolsey, and later, of course, Henry VIII. I won’t create spoilers for the books by going into further detail – although if you are familiar with the relevant history you will already know much of it. I’ll just mention that in Mantel’s portrayal he emerges as phenomenally quick-witted, but loyal to those he serves. She shows him as an essentially unassuming man, well aware of his own abilities, and stoical whenever he suffers reverses or tragedies. These qualities give him a resilience which aids his rise to some of the highest offices in the England of his time. In the books we are privy to his dreams, and his relationships with his family – although he might appear to some as cold-blooded, he is also a man of natural feelings and passions.

Thomas Cromwell and the Duke of Norfolk

Thomas Cromwell (left) and Thomas Howard, Duke of Norfolk – both as portrayed by Hans Holbein

But the theme that kicked off my thoughts for this post was that of Cromwell’s humble origin. It’s necessarily central to the books, given that it was rare then for someone without nobility or inherited title to achieve the rank that he did. What Mantel brings out so well is the instinctive assumption that an individual’s value is entirely dependent on his or her inheritance – unquestioned in that time, as throughout most of history until the modern era. As the blacksmith’s son from Putney, Cromwell is belittled by his enemies and teased by his friends. But at the same time we watch him, with his realistic and perceptive awareness of his own position, often running rings around various blundering earls and dukes, and even subtly manipulating the thinking of the King. My illustrations show Cromwell himself and Thomas Howard, Duke of Norfolk, a jealous opponent. By all accounts Norfolk was a rather simple, plain-speaking man, and certainly without Cromwell’s intellectual gifts. So today we would perhaps see Cromwell as better qualified for the high office that both men held. But seen through 16th century eyes, Cromwell would be the anomaly, and Norfolk, with his royal lineage, the more natural holder of a seat in the Privy Council.

Throughout history there have of course been persistent outbreaks of protest from those disempowered by accident of birth. But the fundamental issues have often been obscured by the chaos and competition for privilege which result. We can most obviously point to the 18th century, with the convulsion of the French revolution, which resulted in few immediate benefits; and the foundation of a nation – America – on the ideals of equality and freedom, followed however by its enthusiastic maintenance of slavery for some years. Perhaps it wasn’t until the 19th century, and the steady, inexorable rise of the middle class, that fundamental change began. As this was happening, Darwin came along to ram home the point that any intrinsic superiority on the basis of your inheritance was illusory. Everyone’s origins were ultimately the same; what counted was how well adapted you were to the external conditions you were born into. But was this the same for human beings as for animals? The ability to thrive in the environment in which you found yourself was certainly a measure of utilitarian, or economic, value. But is this the scale on which we should value humans? It’s a question that I’ll try to show there’s still much confusion about today. Meanwhile Karl Marx was analysing human society in terms of class and mass movements, moving the emphasis away from the value of individuals – a perspective which had momentous consequences in the century to come.

But fundamental attitudes weren’t going to change quickly. In England the old class system was fairly steady on its feet until well into the 20th century. My own grandmother told me about the time that her father applied to enrol her brothers at a public school (i.e. a private school, if you’re not used to British terminology). This would have been, I estimate, between about 1905 and 1910. The headmaster of the school arrived at their house in a horse and trap to look the place over and assess their suitability. My great-grandfather had a large family, with a correspondingly large house, and all the servants one would then have had to keep the place running. He was a director of a successful wholesale grocery company – and hearing this, the headmaster politely explained that, being “in trade”, he didn’t qualify as a father of sons who could be admitted. Had he been maybe a lawyer, or a clergyman, there would have been no problem.

Let’s move on fifty years or so, to the start of the TV age. It’s very instructive to watch British television programmes from this era – or indeed films and newsreels. Presenters and commentators all have cut-glass accents that today, just 60 or so years on, appear to us impossibly affected and artificial. The working class don’t get much of a look-in at all: in the large numbers of black-and-white B-movies that were turned out at this time the principal actors have the accents of the ruling class, while working class characters appear either as unprincipled gangster types, or as lovable ‘cheeky chappies’ showing proper deference to their masters.

By this time, staying with Britain, we had the 1944 Education Act, which had the laudable motive of making a suitable education available to all, regardless of birth. But how to determine what sort of education would be right for each child? We had the infamous eleven plus exam, where in a day or two of assessment the direction of your future would be set. While it looked forward to a future of greater equality of opportunity, the conception seemed simultaneously mired in the class stratification of the past, where each child had a predetermined role and status, which no one, least of all the child himself or herself, could change. Of course this was a great step up for bright working class children who might otherwise have been neglected, and who instead received a fitting education at grammar schools. Thomas Cromwell, in a different age, could have been the archetypal grammar school boy.

But given the rigid stratification of the system, it’s not surprising that within 20 years left wing administrations started to change things again. While the reforming Labour government of 1945-51 had many other things to concentrate on, the next one, achieving office in 1964, made education a priority, abolishing the 11 plus and introducing comprehensive schools. This established the framework which is only now starting to be seriously challenged by the policies of the current coalition government. Was the comprehensive project successful, and does it need challenging now? I’d argue that it does.

R A Butler

R A “Rab” Butler
(izquotes.com)

To return to basics, it seems to me that what’s at stake is, again, how you value an individual human being. In Cromwell’s time, as we’ve seen, no one doubted that it was all to do with the status of your forebears. But by 1944 the ambitious middle class had long been a reality, showing that you could prove your value and rise to prosperity regardless of your origins. This was now a mass phenomenon, not confined to very unusual and lucky individuals, as it had been with Cromwell. And so education realigned itself around the new social structure. But with the education minister of the time, R.A. Butler, being a patrician (if liberal-minded) Tory, perhaps it was inevitable that something of the rigidity of the old class structure would be carried over into the new education system.

So if an exam at the age of eleven effectively determines your place in society, how are we now valuing human beings? It’s their intellectual ability, and their consequent economic value which is the determining factor. If you succeed you go to a grammar school to be primed for university, while if not, you may be given a condescending pat on the head and steered towards a less intellectually demanding trade. We would all agree that there is a more fundamental yardstick against which we measure individuals – an intrinsic, or moral value. We’d rate the honest low-achiever over the clever crook. But somehow the system, with its rigid and merciless classification, is sweeping the more important criterion aside.

Anthony Crosland

Anthony Crosland
(stpancrasstory.org)

And so the reforming zeal of the 1960s Labour government was to remove those class-defining barriers and provide the same education for all. The education minister of that time was a noted intellectual – private school and Oxford educated – Anthony Crosland. His reported remark, supposedly made to his wife, serves to demonstrate the passion of the project: “If it’s the last thing I do, I’m going to destroy every fucking grammar school in England. And Wales and Northern Ireland”. (In Northern Ireland, it should be noted, he was less successful than elsewhere). But the remark also suggests a fixity of purpose which spread to the educational establishment for many years to come. If it was illegitimate to value children unequally, then in no circumstances should this be done.

You may or may not agree with me that the justified indignation of the time was leading to a fatal confusion between the two yardsticks I distinguished – the economic one and the moral one. And so, by the lights of Labour at that time, if we are allocating different resources to children according to their aptitudes – well, we shouldn’t. All must be equal. Yes – in the moral sense. But in the economic one? Even Karl Marx made that distinction – remember his famous slogan, “From each according to his ability, to each according to his need”? All that the reformists needed to do, in my opinion, was to take the rigidity out of the system – to let anyone aspire to a new calling that he or she could achieve, at whatever age, and in whatever circumstances, the need arose.

Back to personal experience. I can remember when we were looking over primary schools for our first child – this would be in the early 90s. One particular headmaster bridled when my wife asked about provision for children of different abilities. The A-word was clearly not to be used. Yet as he talked on, there were several times that he visibly recognised that he himself was about to use it, spotted the elephant trap at the last moment, and awkwardly stepped around it. This confused man was in thrall to the educational establishment’s fixed, if unconscious, assumption that differing ability equals unequal value. (We didn’t send our children to that school.)

Over the years, these attitudes have led to a frequent refusal to make any provision for higher ability pupils, with the consequence that talent which might previously have been nurtured has been ignored. If you can afford it, of course, you can buy your way out of the system and opt for a private education. Private school pupils have consistently had the lion’s share of places at the top universities, and so the architects and supporters of the state system ideology have called for the universities to be forced to admit more applicants from that system, and to restrict those from the private sector. Is this right? I’d argue that the solution to failure in the state schools is not to try and extend the same failed ideology to the universities, but to address what is wrong in the schools. A confusion between our economic and moral valuations of individuals threatens to lead to consequences which are damaging, it seems to me, in both an economic and a moral sense.

The plans of the present UK education minister, Michael Gove, have come in for a lot of criticism. It would be outside the scope of this piece – and indeed my competence – to go into that in detail, but it does seem to me that he is making a principled and well-intentioned attempt to restore the proper distinction between those economic and moral criteria – making good use of individual ability where it can be found, without being condescending to those who are not so academic, or making the distinctions between them too rigid. And of course I haven’t addressed the issue of whether the existence of a separate private education sector is desirable – again outside the scope of this post.

Martin Luther King

Martin Luther King
(Nobel Foundation)

What, at least, all now agree on is that the original criterion of individual value we looked at – birth status – is no longer relevant. Well, almost all. Racist ideologies, of course, persist in the old attitude. A recent anniversary has reminded us of one of the defining speeches of the 20th century, that of Martin Luther King, who laid bare the failure of the USA to uphold the principles of its constitution, and famously looked forward to a time when people would be “judged not by the color of their skin but by the content of their character”. The USA, whose segregationist policies in some states he was addressing, has certainly made progress since then. But beyond the issues I have described, there are many further problems around the distinction between moral and economic values. In most societies there are those whose contribution is valued far more in the moral sense than the economic one: nurses, teachers. What, if anything, should we do about that? I don’t claim to know any easy answers.

I kicked off from the themes in Hilary Mantel’s books and embarked on a topic which I soon realised was a rather unmanageably vast one for a simple blog post. Along the way I have been deliberately contentious – please feel free to agree or disagree in the comments below. But what got me going was the way in which Mantel’s study of Cromwell takes us into the collective mind of an age when the instinctive ways of evaluating individuals were entirely different. What I don’t think anyone can reasonably disagree with is the importance of history in throwing the prejudices of our own age into a fresh and revealing perspective.

Consciousness 3 – The Adventures of a Naive Dualist

Commuting days until retirement: 408

A long gap since my last post: I can only plead lack of time and brain-space (or should I say mind-space?). Anyhow, here we go with Consciousness 3:

Coronation

A high point for English Christianity in the 50s: the Queen’s coronation. I can remember watching it on a relative’s TV at the age of 5

I think I must have been a schoolboy, perhaps just a teenager, when I was first aware that the society I had been born into supported two entirely different ways of looking at the world. Either you believed that the physical world around us, sticks, stones, fur, skin, bones – and of course brains – was all that existed; or you accepted one of the many varieties of belief which insisted that there was more to it than that. My mental world was formed within the comfortable surroundings of the good old Church of England, my mother and father being Christians by conviction and by social convention, respectively. The numinous existed in a cosy relationship with the powers-that-were, and parents confidently consigned their children’s dead pets to heaven, without there being quite such a Santa Claus feel to the assertion.

But, I discovered, it wasn’t hard to find the dissenting voices. The ‘melancholy long withdrawing roar’ of the ‘sea of faith’ which Matthew Arnold had complained about in the 19th century was still under way, if you listened out for it. Ever since Darwin, and generations of physicists from Newton onwards, the biological and physical worlds had appeared to get along fine without divine support; and even in my own limited world I was aware of plenty of instances of untimely deaths of innocent sufferers, which threw doubt on God’s reputedly infinite mercy.

John Robinson

John Robinson, Bishop of Woolwich (Church Times)

And then in the 1960s a brick was thrown into the calm pool of English Christianity by a certain John Robinson, the Bishop of Woolwich at the time. It was a book called Honest to God, which sparked a vigorous debate that is now largely forgotten. Drawing on the work of other radical theologians, and aware of the strong currents of atheism around him, Robinson argued for a new understanding of religion. He noted that our notion of God had moved on from the traditional old man in the sky to a more diffuse being who was ‘out there’, but considered that this was also unsatisfactory. Any God whom someone felt they had proved to be ‘out there’ “would merely be a further piece of existence, that might conceivably have not been there”. Rather, he says, we must approach from a different angle.

God is, by definition, ultimate reality. And one cannot argue whether ultimate reality exists.

My pencilled zig-zags in the margin of the book indicate that I felt there was something wrong with this at the time. Later, after studying some philosophy, I recognised it as a crude form of Anselm’s ontological argument for the existence of God, which is rather more elegant, but equally unsatisfactory. But, to be fair, this is perhaps missing the point a little. Robinson goes on to say that “one can only ask what ultimate reality is like – whether it… is to be described in personal or impersonal categories.” His book proceeds to develop the notion of God as in some way identical with reality, rather than as a special part of it. One might cynically characterise this as a response to atheism of the form “if you can’t beat them, join them” – hence the indignation that the book stirred in religious circles.

Teenage reality

But, leaving aside the well-worn blogging topic of the existence of God, there was the teenage me, still wondering about ‘ultimate reality’, and what on earth, for want of a better expression, that might be. Maybe the ‘personal’ nature of reality which Robinson espoused was a clue. I was a person, and being a person meant having thoughts, experiences – a self, or a subjective identity. My experiences seemed to be something quite other than the objective world described by science – which, according to the ‘materialists’ of the time, was all that there was. What I was thinking of then was the topic of my previous post, Consciousness 2 – my qualia, although I didn’t know that word at the time. So yes, there were the things around us (including our own bodies and brains), our knowledge and understanding of which had been, and was, advancing at a great rate. But it seemed to me that no amount of knowledge of the mechanics of the world could ever explain these private, subjective experiences of mine (and, I assumed, of others). I was always strongly motivated to believe that there was no limit to possible knowledge – however much we knew, there would always be more to understand. Materialism, on the other hand, seemed to embody the idea of a theoretically finite limit to what could be known – a notion which gave me a sense of claustrophobia (of which more in a future post).

So I made my way about the world, thinking of my qualia as the armour to fend off the materialist assertion that physics was the whole story. I had something that was beyond their reach: I was something of a young Cartesian, before I had learned about Descartes. It was another few years before ‘consciousness’ became a legitimate topic of debate in philosophy and science. One commentator I have read dates this change to the appearance of Nagel’s paper What is it Like to be a Bat? in 1974, which I referred to in Consciousness 1. Seeing the debate emerging, I was tempted to preen myself with the horribly arrogant thought that the rest of the world had caught up with me.

The default position

Philosophers and scientists are still seeking to find ways of assimilating consciousness to physics: such physicalism, although coming in a variety of forms, is often spoken of as the default, orthodox position. But although my perspective has changed quite a lot over the years, my fundamental opposition to physicalism has not. I am still at heart the same naive dualist I was then. But I am not a dogmatic dualist – my instinct is to believe that some form of monism might ultimately be true, but beyond our present understanding. This consigns me to another much-derided category of philosophers – the so-called ‘mysterians’.

But I’d retaliate by pointing out that there is also a bit of a vacuum at the heart of the physicalist project. Thoughts and feelings, say its supporters, are just physical things or events, and we know what we mean by that, don’t we? But do we? We have always had the instinctive sense of what good old, solid matter is – but you don’t have to know any physics to realise there are problems with the notion. If something were truly solid it would entail that it was infinitely dense – so the notion of atomism, starting with the ancient Greeks, steadily took hold. But even then, atoms can’t be little solid balls, as they were once imagined – otherwise we are back with the same problem. In the 20th century, atomic physics confirmed this, and quantum theory came up with a whole zoo of particles whose behaviour entirely conflicted with our intuitive ideas gained from experience; and this is as you might expect, since we are dealing with phenomena which we could not, in principle, perceive as we perceive the things around us. So the question “What are these particles really like?” has no evident meaning. And, approaching the problem from another standpoint, where psychology joins hands with physics, it has become obvious that the world with which we are perceptually familiar is an elaborate fabrication constructed by our brains. To be sure, it appears to map on to the ‘real’ world in all sorts of ways, but has qualities (qualia?) which we supply ourselves.

Truth

So what true, demonstrable statements can be made about the nature of matter? We are left with the potently true findings – true in the sense of explanatory and predictive power – of quantum physics. And, when you’ve peeled away all the imaginative analogies and metaphors, these can only be expressed mathematically. At this point, rather unexpectedly, I find myself handing the debate back to our friend John Robinson. In a 1963 article in The Observer newspaper, heralding the publication of Honest to God, he wrote:

Professor Hermann Bondi, commenting in the BBC television programme, “The Cosmologists” on Sir James Jeans’s assertion that “God is a great mathematician”, stated quite correctly that what he should have said is “Mathematics is God”. Reality, in other words, can finally be reduced to mathematical formulae.

In case this makes Robinson sound even more heretical than he in fact was, I should note that he goes on to say that Christianity adds to this “the deeper reliability of an utterly personal love”. But I was rather gratified to find the writer I quoted at the beginning anticipating the concluding thoughts of my post.

I’m not going to speculate any further into such unknown regions, or into religious belief, which isn’t my central topic. But I’d just like to finish with the hope that I have suggested that the ‘default position’ in current thinking about the mind is anything but natural or inevitable.

Science and Emotion

Commuting days until retirement: 507

Following my comments 3 posts ago, my reading on the train over the last week has been The Heretics: Adventures with the Enemies of Science, by Will Storr. Despite booming bishops and other distractions, I found it intensely readable, and I think it pretty much fulfilled my expectations.

The subtitle about ‘the Enemies of Science’ is perhaps a little misleading: this is not primarily an exercise in bashing such people – you’ll find plenty of other books that do that, and any number of internet forums (and of course I had a go at it myself, in the post I mentioned). It’s an attempt to understand them, and to investigate why they believe what they do. Storr does not treat the enemies of science as being necessarily his personal enemies, and it emerges at the same time that the friends of science are not always particularly – well, friendly.

I was going to say that it’s a strength of the book that he maintains this approach without compromising his own adherence to rationality, but that’s not strictly true. Because another strength is that he doesn’t attempt to adopt a wholly godlike, objective view. Rather, he presents himself, warts and all, as a creature like the rest of us, who has had to fight his own emotional demons in order to arrive at some sort of objectivity. And he does admit to experiencing a kind of bewitchment when talking to people with far-out beliefs. ‘They are magic-makers. And, beneath all that a private undercurrent: I feel a kind of kinship with them. I am drawn to the wrong.’

It’s a subtle approach, and a difficult path to tread, which invites misunderstanding. And one critic who I believe misunderstands is Mark Henderson in the Guardian, who, while admiring aspects of Storr’s work, finds the book ‘disappointing and infuriating…. He is like the child who still wants to believe in Father Christmas, but who is just old enough to know better. Life would be more magical, more fun, if the story were true.’ Well here I think Henderson is unwittingly stating the central thesis of Storr’s book: that as humans we are above all myth makers – we have a need to see ourselves as a hero of our own preferred narrative.

This idea appeals to me in particular because it chimes in with ideas that I have come across in an evening course I am currently following in the philosophy of emotion. Writers on emotion quite naturally classify emotions into positive (happiness, love, hope, excitement) and negative (sadness, hatred, anger, fear, etc). The naive assumption is that we welcome the former and avoid the latter if we can. But of course the reality is far more nuanced, and more interesting than that. In the first place there is the need of many to engage in dangerous and frightening pursuits, and then of course the undoubted delights of sado-masochism. But much closer to home is the fact that we flock to horror films and emotionally harrowing dramas – we love to be vicariously frightened or distressed. Narrative is our stock in trade, and (as the increasingly popular creative writing courses preach) unless there’s plenty of conflict and resolution, nobody will be interested.

Mythology

We all have our own unspoken narratives about our own place in the world, and in most cases these probably cast us in a more heroic light than the accounts others might give of us. They help to maintain our emotional equilibrium, and in cases where they are nullified by external circumstances, depression, despair and even suicide may result. And of course with the internet, bloggers like me can start to inflict our own narratives on a bigger potential audience than ever (I wish). And earlier theories of the world are of course entirely myth and narrative laden, from ancient Greek culture to the Bible and the Koran. Our cultural heritage fits around our instinctive nature. (As I tap this passage into my phone on the train, the man immediately next to me is engrossed in a Bible.) How difficult, then, for us to depart from our myths, and embrace a new, evidence-based, and no longer human-centred, story of creation.

Storr encounters one of those for whom this difficulty is too great. John Mackay is a genial, avuncular Australian (there’s plenty of footage on YouTube) who has been proselytising worldwide on behalf of the creationist story for some years. In the gospel according to Mackay, everything that seems evil about nature stems from the fall of Adam and Eve in the Garden of Eden. Before this there were no thorns on plants, men lived happily with dinosaurs and nobody ever got eaten – all animals were vegetarians. A favourite of mine among his pronouncements is where Storr asks him why, if Tyrannosaurus rex was a vegetarian, it had such big, sharp teeth. The answer (of course) is – water melons.

On YouTube I have just watched Mackay demonstrating from fossil evidence that there are plants and animals which haven’t evolved at all. This is a fundamental misunderstanding of Darwin: if organisms aren’t required by external forces to adapt, they won’t. But on Mackay’s time scale (the Earth being, of course, six thousand years old) there wouldn’t have been enough time for fossils to form, let alone for anything to evolve very much. The confusion here is manifold. For his part, Storr admits to having started out knowing little about evolutionary theory or the descent of man, and to having taken the scientific account on trust, as indeed most of us do. But his later discussions with a mainstream scientist demonstrate to him how incomparably more elegant and cogent the accepted scientific narrative is.

How objective can we be?

Henderson charges Storr with not giving sufficient recognition to the importance of the scientific method, and how it has developed as a defence of objective knowledge against humanity’s partial and capricious tendencies. But Storr seems to me to be well aware of this, and alert to those investigators whose partiality throws doubt on their conclusions. ‘Confirmation bias’ is a phrase that runs through the book: people tend to notice evidence which supports a belief they are loyal to, and neglect anything that throws doubt on it. A great example comes from a passage in the book where he joins a tour of a former Nazi concentration camp with David Irving, the extreme right wing historian who has devoted his life to a curious crusade to minimise the significance of the Holocaust, and exculpate Hitler. Storr is good on the man’s bizarre and contradictory character, as well as the motley group of admirers touring with him. At one point Irving is seeking to deny the authenticity of a gas chamber they are viewing, and triumphantly points out a handle on the inside of the open door. He doesn’t bother to look at the other side of the door, but Storr does afterwards, and discovers a very sturdy bolt. You are led to imagine the effect of a modus operandi like this on the countless years of painstaking research that Irving has pursued.

But should we assume that the model of disinterested, peer-reviewed academic research we have developed has broken free of our myth-making tendencies? Those who are the most determined to support it are of course themselves human beings, with their own hero narratives. Storr attends a convention of ‘Skeptics’ (he uses the American spelling, as they themselves do) where beliefs in such things as creationism or belief in psychic phenomena are held up to ridicule. He brings out well the slightly obsessional, anorak-ish atmosphere of the event. It does, after all, seem a little perverse to devote very much time to debunking the beliefs of others, rather than developing one’s own. It’s as if the hero narrative ‘I don’t believe in mumbo-jumbo’ is being exploited for purposes of mutual and self-congratulation. The man who is effectively Skeptic-in-Chief, the American former magician James Randi, is later interviewed by Storr, and comes across as arrogant and overbearing, admitting to sometimes departing from truthfulness in pursuit of his aims.

If scientists, being human, are not free of personal mythology, could this work against the objectivity of the enterprise? I think it can, and has. Some examples come to mind for me. The first is Ignaz Semmelweis, a Hungarian physician of the early to mid 19th century. In the days before Pasteur and the germ theory of infection, he was concerned by the number of maternal deaths in his hospital from what was called ‘puerperal fever’. This seemed to be worse in births attended by doctors, rather than midwives. In a series of well-executed investigations, he linked this to doctors who had come to the maternity ward after performing post-mortems, and further established that hand-washing reduced the incidence of the disease. But the notion that doctors themselves were the cause did not meet with approval: an obvious clash with the hero narrative. Semmelweis’s findings were contemptuously rejected, and he later suffered a breakdown and died in an asylum. A similar example is the English Victorian physician John Snow, who, in a famous investigation into cholera in Soho, conclusively showed it to be spread via a water-pump, in contradiction of the accepted ‘miasma’ theory of airborne infection. He further linked it to pollution of the water supply by sewage – but something so distasteful was a conclusion too far for the Victorian narratives of human pride and decency, and the miasma theory continued to hold sway.

Both these examples of course come from medicine, where conclusive and repeatable results are harder to come by, and easier to contest. So let’s go to the other extreme – mathematics. You would think that a mathematical theorem would be incontrovertible, at least on grounds of offending anyone’s personal sensibilities. But around the turn of the 20th century Georg Cantor’s work on set theory led him to results concerning the nature of infinity. The consequent attacks on him by some other mathematicians, often of the most ad hominem kind, calling him a charlatan and worse, showed that someone’s personal myths were threatened. Was it their religious beliefs, or the foundations of mathematics on which their reputations depended? I don’t know – but Cantor’s discoveries are nowadays part of mainstream mathematics.

Modern heresy

My examples are from the past, of course: I wanted to look at investigators who were derided in their time, but whose discoveries have since been vindicated. If there are any such people around today, their vindication lies in the future. And there is no shortage of heterodox doctrines, as Storr shows. Are any of them remotely plausible? One interesting case is that of Rupert Sheldrake, to whom Storr gives some space. He has an unimpeachable background of education in biology and is a former Cambridge fellow. But his theory of morphic fields – mysterious intangible influences on biological processes – puts him beyond the pale as far as most mainstream scientists are concerned. Sheldrake, however, is adamant that his theory makes testable predictions, and he claims to have verified some of these using approved, objective methods. Some of them concern phenomena known to popular folklore: the ability to sense when you are being stared at, and animals who show correct anticipation of their absent owners returning home. I can remember playing games with the former at school – and it seemed to work. And I have read Sheldrake’s book on the latter, in which he is quite convincing.

But I have no idea whether these ideas are truly valid. Storr tells of a few cases where regular scientists have been prepared to try and repeat Sheldrake’s results with these phenomena, but most degenerate into arcane wrangling over the details of experimental method, and no clear conclusions emerge. What is clear to me is that most orthodox scientists will not even consider, publicly, such matters, since doing so is seen as career suicide. Is this lack of open-mindedness also therefore a lack of objectivity? Lewis Wolpert is quoted in Storr’s book: ‘An open mind is a very bad thing – everything falls out’, a jibe repeated by Henderson. You could retort that the trouble with a closed mind is that nothing gets in.  There is a difficult balance to find here: of course a successful scientific establishment must be on its guard against destructive incursions by gullibility and nonsense.  On the other hand, as we have seen, this becomes part of the hero narrative of its practitioners, and may be guarded so jealously that it becomes in some cases an obstacle to advances.

Sheldrake tells Storr that his theories in no way destroy or undermine established knowledge, but add to it. I think this is a little disingenuous of him. If we have missed something so fundamental, it would imply that there is something fundamentally wrong about our world-view. Well of course it would be arrogant to deny that there is anything at all wrong with our world-view (and I think there is plenty – something to return to in a later post). But Storr’s critic Henderson is surely right in holding that, in the context of a systematically developed body of knowledge, there is a greater burden of proof on the proponent of the unorthodox belief than there is on the opponent. Nevertheless, I agree with Storr that the freedom to promote heterodox positions is essential, even if most of them are inevitably barmy. It’s not just that, as Storr asserts near the end of the book, ‘wrongness is a human right’. Occasionally – just very occasionally – it is right.

Believe it or not

Commuting days until retirement: 513

We commuters have our darker evenings, when everything goes wrong (why always evenings, when we’re on the way home, and not when we’re headed to work?) but there seem to have been remarkably few of those for a while. Until today.

Well, it wasn’t that bad – only an hour’s worth of delay. Seeing that my usual fast train was going to be very late, I had made a snap decision and got on the non-delayed slow one. Mistake. We ended up stuck interminably in a station half-way, while everything, including the slow/fast one, overtook us.

But of course if you’re a reader there’s an up side to this. And I now have on my Kindle the Will Storr that I mentioned last Sunday, and was attempting to read that. I say attempting to – a couple of seats from me was a bishop, purple and splendid, and having a magisterial speaking voice to go with it. Clearly he was used to projecting his words around cathedrals, and so our railway carriage had very little chance. He was talking to his wife, who sat opposite him. I hadn’t really thought about it, but the decibels must be an occupational hazard if you’re the wife of a bishop. (Or the husband of a bishop, if we ever have any of those.) The subject matter, on the other hand, was not at all episcopal, but quite mundane, even though rendered beautifully and sonorously – perfect, in fact, for undermining the concentration.

And so that is how my exploration of why we believe what we believe was disrupted by a booming bishop.

A morning of Intuitive Soul Whispering

Commuting days until retirement: 518

Sunday – so rather than struggling to the train to notch up another commuting day, I am in my dressing gown munching toast in front of the Andrew Marr Show. And I stay on with the telly for that strange confection called The Big Questions. From what I’ve seen of this before it’s rather prone to producing Small Answers. Today was no exception, particularly in the section on the topic “Is faith compatible with reason?” (A mere third of the programme was devoted to this.) The programme demonstrates that faith which TV producers have, against all reason, that if you plonk enough people with extreme and diametrically opposed opinions in front of each other, and give them a few moments each to rant at each other, some enlightenment will come out of it all. (Well, I know that their agenda is to deliver ratings, rather than enlightenment. But couldn’t Sunday morning ‘god-slot’ TV be viewed more as a public service, and less as a ratings deliverer? Unfortunately I think that the brains of the people who run BBC1 are hardwired to work the other way.)

On the programme, the most difficult participant to ignore – and certainly the most irritating – was a lady called Andrea Foulkes, an accomplished TV performer, who, as she was anxious to impress on us, has had her own show on ITV. Among the accomplishments mentioned on her website is that of being an “Intuitive Soul Whisperer”. Well, we all need one of those. And what did she have to say? I will have to quote (thank goodness for iPlayer, allowing me to check I’m quoting accurately):

Quantum physics is starting to prove that the heart has a cohesive wave-form – it has a pattern, which is replicated, which creates emotion… Everyone’s thoughts and beliefs come from three strains: they come from ancestral patterns, which we call genetic… and then you have past life patterns; and then you have compounded stuff, which you have from being in the womb to the present day, because you have consciousness in the womb. And this creates your current reality…

“What is your external proof?” asked someone.  “External proof?” she burbled on, sweetly tolerant of those too slow to keep up with her, “External proof is clients who have experienced it, and they change their reality… all realities exist, but they exist within different dimensions. We live in a multidimensional reality, we live in a holographic universe”. (“We live in a what?”, interjected a bemused presenter.)

Well of course you don’t need to be a specialist in any of the disciplines she skipped over to be able to recognise all this as piffle of a high order. Further debate showed her telegenic, coiffured carapace to be case-hardened against any assault by reason or logic. Which rather played into the hands of the other participant who made an impression on me – this time a positive one.

Will Storr, I learnt, is just bringing out a book: The Heretics – Adventures with the Enemies of Science, which supports the idea that we often choose our beliefs at first out of emotionally derived motives, only then seeking to justify them by selectively adopting the arguments which support them. While this can’t be universally true, it’s an issue which has always interested me. His pre-publication reviewers on Amazon are generally approving, if a little lukewarm, but I feel it’s a must-read for me. It comes out on Thursday, and I’ve preordered it on my Kindle.

That will use up a few train journeys, and I hope to report back in a future post. So – thank you, BBC, and sorry for my carping above. I did get something out of The Big Questions after all.