What a Coincidence!

My title is an expression you hear quite often, the exclamation mark denoting how surprising it seems when, for example, you walk into a shop and find yourself behind your friend in the queue (especially if you were just thinking about her), or if perhaps the person at the next desk in your office turns out to have the same birthday as you.

But by considering the laws of probability you can come to the conclusion that such things are less unlikely than they seem. Here’s a way of looking at it: suppose you use some method of generating random numbers, say between 0 and 100, and then plot them as marks on a scale. You’ll probably find blank areas in some parts of the scale, and tightly clustered clumps of marks in others. It’s sometimes naively assumed that, if the numbers are truly random, they should be evenly spread across the scale. But a simple argument shows this to be mistaken: there are in fact relatively few ways to arrange the marks evenly, but a myriad ways of distributing them irregularly. Therefore, by elementary probability, it is overwhelmingly likely that any random arrangement will be of the irregular and clumped sort.

To satisfy myself, I’ve just done this exercise – and to make it more visual I have generated the numbers as 100 pairs of coordinates, so that they are spread over a square. Already it looks gratifyingly clumpy, as probability theory predicts. So, to stretch and reapply the same idea, you could say it’s quite natural that contingent events in our lives aren’t all spaced out and disjointed from one another in the way we might naively expect, but end up being apparently juxtaposed and connected in ways that seem surprising to us.
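For anyone who wants to try the exercise, here is a minimal Python sketch (a reconstruction, not the exact method I used): it scatters 100 points over the unit square and counts how many cells of a 10×10 grid end up with no point at all – a crude but telling measure of clumpiness.

```python
import random

def empty_cells(n_points=100, grid=10, seed=None):
    """Scatter n_points uniformly in the unit square and count how many
    cells of a grid x grid lattice receive no point at all."""
    rng = random.Random(seed)
    occupied = set()
    for _ in range(n_points):
        x, y = rng.random(), rng.random()
        occupied.add((int(x * grid), int(y * grid)))
    return grid * grid - len(occupied)

# A perfectly even spread of 100 points over 100 cells would leave no
# cell empty; in fact the expected number of empty cells is
# 100 * (99/100)**100, which is about 36.6 – clumps and gaps are the norm.
print(empty_cells(seed=1))
```

On a typical run roughly a third of the cells are blank, which is exactly the clumpiness the argument above predicts.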

Isaac Asimov, the science fiction writer, put it more crisply:

People are entirely too disbelieving of coincidence. They are far too ready to dismiss it and to build arcane structures of extremely rickety substance in order to avoid it. I, on the other hand, see coincidence everywhere as an inevitable consequence of the laws of probability, according to which having no unusual coincidence is far more unusual than any coincidence could possibly be. (From The Planet that Wasn’t, originally published in The Magazine of Fantasy and Science Fiction, May 1975)

All there is to it?

So there we have the standard case for reducing what may seem like outlandish and mysterious coincidences to the mere operation of random chance. I have to admit, however, that I’m not entirely convinced by it. I have repeatedly experienced coincidences in my own life, from the trivial to the really pretty surprising – in a moment I’ll describe some of them. What I have noticed is that they often don’t have the character of being just random pairs or clusters of simple happenings, as you might expect, but seem to be linked to one another in strange and apparently meaningful ways, or to associate themselves with significant life events. Is this a mere subjective illusion, or could there be some hidden, organising principle governing happenings in our lives?

Brian Inglis

Brian Inglis, from the cover of Coincidence

I don’t have an answer to that, but I’m certainly not the first to speculate about the question. This post was prompted by a book I recently read, Coincidence by Brian Inglis*. Inglis was a distinguished and well-liked journalist in the last century, having been a formative editor of The Spectator magazine and a prolific writer of articles and books. He was also a television presenter: those of a certain age may remember a long-running historical series on ITV, All Our Yesterdays, which Inglis presented. In addition, to the distaste of some, he wrote quite widely on paranormal phenomena.

The joker

In Coincidence he draws on earlier speculators about the topic, including the Austrian zoologist Paul Kammerer, who, after being suspected of scientific fraud in his research into amphibians, committed suicide in 1926. Kammerer was an enthusiastic collector of coincidence stories, and tried to provide a theoretical underpinning for them with his idea of ‘seriality’, which had some influence on Jung’s notion of synchronicity, in which meaning is placed alongside causality in its power to determine events. Kammerer also attracted the attention of Arthur Koestler, who figures in one of my previous posts. Koestler gave an account of the fraud case which was sympathetic to Kammerer, in The Case of the Midwife Toad. Koestler was also fascinated by coincidences and wrote about them in his book The Roots of Coincidence. Inglis, in his own book, recounts many stories of surprising coincidences from ordinary lives. Many of his subjects have the feeling that there is some sort of capricious organising spirit behind these confluences of events, whom Inglis playfully personifies as ‘the joker’.

This putative joker certainly seems to have had a hand in my own life a number of times. Thinking of the subtitle of Inglis’ book (‘A Matter of Chance – or Synchronicity?’), the latter seems to be a factor with me. I have been so struck by the apparent significance of some of my own coincidences that I have recorded quite a number of them. First, here’s a simple example which shows the ‘interlinking’ tendency that occurs so often. (Names are changed in the accounts that follow.)

My own stories

From about 35 years ago: I spend an evening with my friend Suzy. We talk for a while about our mutual acquaintance Robert, with whom we have both lost touch; neither of us has seen him for a couple of years. Two days later, I park my car in a crowded North London street and Robert walks past just as I get out of the car, and I have a conversation with him. And then, I subsequently discover, the very next day Suzy meets him quite by chance on a railway station platform. I don’t know whether the odds against this could be calculated, but they would be pretty huge. Each of the meetings, coming so soon after the conversation, would be unlikely on its own, especially as both happened in crowded inner London. And the pair of coincidences shows that strange interlinking I mentioned. But I have more examples which are linked to one another in an even more elaborate way, as well as being attached to significant life events.

In 1982 I decided that, after nearly 14 years, it was time to leave the first company I had worked for long-term; let’s call it ‘company A’. During my time with them, a while before this, I’d shared a flat with a couple of colleagues for 5 years. At one stage we had a vacancy in the flat and advertised at work for a third tenant. A new employee of the company – we’ll call him Tony McAllister – quickly showed an interest. We felt a slight doubt about the rather pushy way he did this, pulling down our notice so that no one else would see it. But he seemed pleasant enough, and joined the flat. We should have listened to our doubts – he turned out to be definitely the most uncongenial person I have ever lived with. He consistently avoided helping with any of the housework and other tasks around the flat, and delighted in dismantling the engine of his car in the living room. There were other undesirable personal habits – I won’t trouble you with the details. Fortunately it wasn’t long before we all left the flat, for other reasons.

Back to 1982, and my search for a new job. A particularly interesting sounding opportunity came up, in a different area of work, with another large company – company B. I applied and got an interview with a man who would be my new boss if I got the job: we’ll call him Mark Cooper. He looked at my CV. “You worked at company A – did you know Tony McAllister? He’s one of my best friends.” Putting on my best glassy grin, I said that I did know him. And I did go on to get the job. Talking subsequently, we both eventually recalled that Mark had actually visited our flat once, very briefly, with Tony, and we’d met fleetingly. That would have been five years or so earlier.

About nine months into my work with company B I saw a job advertised in the paper while I was on the commuter train. I hadn’t been looking for a job, and the ad just happened to catch my eye as I turned the page. It was with a small company (company C), with requirements very relevant to what I was currently doing, and sounding really attractive – so I applied. While I was awaiting the outcome of this, I heard that my present employer, company B, was to stop investing in my current area of work, and I was moved to a different position. I didn’t like the new job at all, and so of course was pinning my hopes on the application I’d already made. However, oddly, the job I’d been given involved being relocated into a different building, and I was given an office with a window directly overlooking the building in which company C was based.

This seemed a good omen – and I was subsequently given an interview, and then a second one, with directors of company C. At the second, my interviewer, ‘Tim Newcombe’, seemed vaguely familiar, but I couldn’t place him and thought no more of it. He evidently didn’t know me. Once again, I got the job: apparently it had been a close decision between me and one other applicant, from a field of about 50. And it wasn’t long before I found out why Tim seemed familiar: he was in fact married to someone I knew well in connection with some voluntary work I was involved with. I eventually realised that on one occasion I had visited her house with some others and had very briefly met Tim. I went on to work for company C for nearly 12 years, until it disbanded. After that both Tim and I worked on our own account, and we collaborated on a number of projects.

So far, therefore, two successive jobs where, for each, I was interviewed by someone whom I eventually realised I had already met briefly, and who had a strong connection to someone I knew. (In neither case was the connection related to the area of work, so that isn’t an explanation.)

The saga continues

A year or two after leaving company B, I heard that Mark Cooper had moved to a new job in company D, and in fact visited him there once in the line of work. Meanwhile, ten years after I had started the job in company C – and while I was still doing it – my wife and I, wanting to move to a new area, found and bought a house there (where we still live now, more than 20 years later). I then found out that the previous occupants were leaving because the father of the family had a new job – with, it turned out, company D. And on asking him more about it, it transpired that he was going to work with Mark Cooper, making an extraordinarily neat loop back to the original coincidence in the chain.

I’ve often mused on this striking series of connections, and wondered if I was fated always to encounter some bizarre coincidence every time I started new employment. However, after company C, I worked freelance for some years, and then got a job in a further company (my last before retirement). This time, there was no coincidence that I was aware of. But now, just in the last few weeks, that last job has become implicated in a further unlikely connection. This time it’s my son who has been looking for work. He told me about a promising opportunity he was going to apply for. I had a look at the company website and was surprised to see among the pictures of employees a man who had worked in the same office as me for the last four years or so – from the LinkedIn website I discovered he’d moved on a month after I retired. My son was offered an initial telephone interview – which (almost inevitably) turned out to be with this same man.

In gullible mode, I wondered to myself whether this was another significant coincidence. Well, whether I’m gullible or not, my son did go on to get the job. I hadn’t worked directly with the interviewer in question, and only knew him slightly; I don’t think he was aware of my surname, so I doubt that he realised the connection. My son certainly didn’t mention it, because he didn’t want to appear to be currying favour in any dubious way. And in fact this company that my son now works in turns out to have a historical connection with my last company – which perhaps explains the presence of his interviewer in it. But neither my son nor I was aware of any of this when he first became interested in the job.

Just one more

I’m going to try your patience with just one more of my own examples, and this involves the same son, but quite a few years back – in fact when he was due to be born. At the time our daughter was 2 years old, and if I was to attend the coming birth she would need to be babysat by someone. One friend, who we’ll call Molly, said she could do this if it was at the weekend – so we had to find someone else for a weekday birth. Another friend, Angela, volunteered. My wife finally started getting labour pains, a little overdue, one Friday evening. So it looked as if the baby would arrive over the weekend and Molly was alerted. However, untypically for a second baby, this turned out to be a protracted process. By Sunday the birth started to look imminent, and Molly took charge of my daughter. But by the evening the baby still hadn’t appeared – we had gone into hospital once but were sent home again to wait. So we needed to change plans, and my daughter was taken to Angela, where she would stay overnight.

My son was finally born in the early hours of Monday morning, which was May 8th. And then the coincidence: it turned out that both Molly and Angela had birthdays on May 8th. What’s nice about this one is that it is possible to calculate the odds. There is that often-quoted statistic that if there are 23 or more people in a room there is a greater than evens chance that at least two of them will share the same birthday. 23 seems a low number – but I’ve been through the maths myself, and it is so. In this case, however, it’s a much simpler calculation: the chance would be 1 in 365 × 365 (ignoring leap years for simplicity), which is 1 in 133,225. That’s unlikely enough – but once again I don’t feel that the calculation tells the full story. The odds I’ve worked out apply where any three people are taken at random and found all to share the same birthday. In this case we have the coincidence clustered around a significant event, the actual day of birth of one of them – and that seems to me to add an extra dimension that can’t so easily be quantified.
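Both calculations are easy to verify with a few lines of Python – this is just the standard textbook computation, nothing special to my story:

```python
from fractions import Fraction

def p_shared_birthday(n, days=365):
    """Probability that at least two of n people share a birthday
    (leap years ignored, all 365 days assumed equally likely)."""
    p_all_distinct = Fraction(1)
    for k in range(n):
        p_all_distinct *= Fraction(days - k, days)
    return 1 - p_all_distinct

# The often-quoted threshold: 23 people tip the chance past evens.
print(float(p_shared_birthday(22)))   # ≈ 0.476
print(float(p_shared_birthday(23)))   # ≈ 0.507

# The simpler case here: given the baby's birth date, the chance
# that two particular people both have that same birthday.
print(Fraction(1, 365) ** 2)          # 1/133225
```

Using exact fractions avoids any rounding quibbles: 22 people fall just short of an even chance, 23 just exceed it, and the double-birthday match really is a 1 in 133,225 shot.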

Malicious streak

Well, there you have it – random chance, or some obscure organising principle beyond our current understanding? Needless to say, that’s speculation which splits opinion along the lines I described in my post about the ‘iPhobia’ concept. As an admitted ‘iclaustrophobe’, I prefer to keep an open mind on it. But to return to Brian Inglis’s ‘joker’: Inglis notes that this imagined character seems to display a malicious streak from time to time; he quotes an example where estranged lovers are brought together by coincidence in awkward, and ultimately disastrous, circumstances. Add to that the observation of some of those looking into the coincidence phenomenon that their interest seems to attract further coincidences: when Arthur Koestler was writing about Kammerer, he described his life being suddenly beset by a “meteor shower” of coincidences, as if, he felt, Kammerer were emphasising his beliefs from beyond the grave.

With both of those points in mind, I’d like to offer one further story. It was told to me by Jane O’Grady (real name this time), and I’m grateful to her for allowing me to include it here – and also for going to some trouble to confirm the details. Jane is a writer, philosopher and teacher. One day in late 1991, she and her then husband, philosopher Ted Honderich, gave a lunch to which they invited Brian Inglis. His book on coincidences – the one I’ve just read – had been published fairly recently, and a good part of their conversation was a discussion of that topic. A little over a year later, in early 1993, Jane was teaching a philosophy A-level class. After a half-time break, one of the students failed to reappear. His continuing absence meant that Jane had to give up waiting and carry on without him. He had shown himself to be somewhat unruly, and so this behaviour seemed to her at first to be irritatingly in character.

And so when he did finally appear, with the class nearly over, Jane wondered whether to believe his proffered excuse: he said he had witnessed a man collapsing in the street and had gone to help. But it turned out to be perfectly true. Unfortunately, despite his intervention, nothing could be done and the man had died. The coincidence, as you may have guessed, lay in the identity of the dead man. He was Brian Inglis.


*Brian Inglis, Coincidence: A Matter of Chance – or Synchronicity? Hutchinson, 1990

Old Ladies, Old Gentlemen, Ironmongers and Encyclopaedias

Commuting days until retirement: 61

So far, I haven’t got to see the film I mentioned – the one I prepared for by reading Testament of Youth and writing about it in the last post. Infuriatingly, my local cinema and all the others in the area were showing it only around lunch time on weekdays. Obviously whoever decides these things had put it down as a film for old ladies. If it were a few months further on I’d be retired, and would be able to consider myself an honorary old lady, buttoning up my overcoat and toddling round to the picture house for a couple of hours of sedate entertainment. As it is I’ll probably wait and see it on DVD, Netflix or whatever.

But in fact I’m not at all in the habit of thinking of myself as an old gentleman, let alone an old lady. A few months ago I went to see a medical specialist about a minor but painful condition (happily temporary, as it turned out). I got my copy of his letter to the GP, which started: ‘I saw this 66 year old gentleman…’. My immediate reaction was: ‘Who can he be talking about? Have I got the wrong letter?’

But no – it came home to me that the way the world sees you isn’t often the way you think of yourself. But does anyone now think of themselves as a ‘gentleman’? A hundred years ago they certainly would have done; and the doctor’s letter shows that it’s still a formal, polite way of referring to someone who has at least reached middle age. But to think of yourself in that way seems like divorcing yourself from the contemporary world and consigning yourself to a time-warped existence.

John Carey


My current train reading has in fact reminded me all too sharply of how distant in time my origins now are. It’s another piece of autobiography, of the Oxford English professor and literary pundit John Carey – The Unexpected Professor. He’s a good deal older than me – he was born in the 1930s – but there was so much in his early life which struck a chord with me.

I remember, as he does, the sorts of books that we were given by well-meaning adult relatives, with titles like 101 Things a Boy Can Do. (You could also find collections of Things a Girl Can Do, but, needless to say, they were very different.) There’s a particular phrase that always cropped up in those books which has stayed with me ever since. These masculine activities usually involved some sort of construction project which required certain outlandish components which were always ‘obtainable from any good ironmonger for a few pence.’

Reading about Carey’s boyhood, it was with a delighted shock of recognition that I found he had remembered exactly the same phrase. Like me, he wasn’t exactly sure what was meant by ‘ironmonger’. The most likely candidate in my area was a hardware shop called Mence Smith, where my queries about these unlikely items would be met with blank stares. So maybe it was a Bad Ironmonger – if it was an ironmonger at all, that is.

Another memory I share with him is of the encyclopaedias of the time; the very word, in the Internet age, has a dated ring to it. The most prosperous households would own the authoritative, and impossibly expensive, Encyclopaedia Britannica, but most of the families I knew had a less prestigious set, usually dating from before the war, with grainy black and white illustrations and text that impressed on the reader how clever and enlightened the ‘modern’ world was, and how abjectly primitive our dim and distant forebears were. I particularly remember a picture of the very latest in steam locomotives busily chuffing down the main line, with the breathless caption ‘A Mile a Minute!’ Today, sixty miles an hour is the speed of the average motorway dawdler.

And encyclopaedias make me think of a strange, shadowy business movement that was in its heyday some years back. It was based on salespeople going from door to door selling encyclopaedias – at one time they seemed to outnumber even Jehovah’s Witnesses. I once got a telling glimpse of how this worked when, between a college course and a regular job, I was looking for an opportunity to earn something and saw a newspaper ad inviting people to a meeting about some potential work. Encyclopaedia selling was what it turned out to be. A roomful of people was addressed by a sharply suited, rather too plausible-sounding character, who asked for guesses as to what they would earn for selling a single set. There were hesitant tries. ‘One pound?’ ‘Two pounds?’ (This was the 1960s – you can multiply the amounts by about 16 for today’s prices.) ‘No,’ he eventually said triumphantly. ‘TWENTY pounds!’

Most of his audience were now slavering like Pavlov’s dogs, as he had intended, and seemed not to have drawn the obvious conclusion: that the encyclopaedias must be very difficult to sell indeed. Given that it was commission only, and that I would probably be the world’s worst salesman, I knew it would be dispiriting and dreadful. He invited anyone who was perverse enough still to be uninterested to leave, and only two of us out of 40 or 50 did. I was left reflecting on what sort of organisation gleefully dupes its own employees.

Some years earlier, as a child, I’d had a glimpse of this from the other side. A new set of encyclopaedias, The Children’s Britannica, appeared in our house. It seemed that my father, not usually a soft touch, had bought them on the doorstep. Much later on I heard my mother’s account – that it had been an attractive young woman who was selling them. I doubt if he would have succumbed if it had been a man.

Returning to Professor Carey, I’m now about halfway through his book, hearing about his university career and greatly enjoying tracing his steps as he explored the canon of literature. He’s giving me plenty of ideas for future reading. As a memoir it’s altogether more relaxed than Testament of Youth; he has a gently humorous, self-mocking style that’s light years away from the committed, stormy intensity of Vera Brittain. What they have in common is Oxford; what they don’t have in common is close experience of the pain and loss of war. A child during the Second World War, Carey came no nearer to war than a spell of peacetime ‘National Service’ in the army, which everyone of his generation had to do (I was part of the first generation that didn’t have to). So the difference in tone is entirely understandable; but I recently read an interview with Vera Brittain’s daughter, Shirley Williams, in which she admitted that her mother had no great sense of humour.

And, come to think of it, there’s a significant point of contrast between Shirley Williams and her near contemporary John Carey. Carey makes pointed references in his foreword, and repeatedly in the book, to his utter disapproval of the closing down of grammar schools in the UK. (For those unused to the confusing English terminology, grammar schools are state funded and select by ability, while ‘public schools’, referred to below, are privately funded independent schools where most richer people send their children.)

One thing that has not changed is that Oxford – and Cambridge – still take vastly disproportionate numbers of public-school students. This is often blamed on Oxford and Cambridge. The blame, however, lies with those who destroyed the grammar schools. Selecting for merit, not money, the grammar schools, had they survived, would by now have all but eliminated the public-school contingent in Oxford and Cambridge, with far-reaching effects on our society. This book is, among other things, my tribute of gratitude to a grammar school.

Shirley Williams


It was the Labour administrations of the 60s and 70s that put this policy in place, with egalitarian intentions but a strange failure to anticipate the consequences. Shirley Williams, who was Education Secretary in the late 70s, had a central role in implementing the policy, and still strongly believes it to have been right. Carey, as the above passage shows, was a working class boy – no ivory tower elitist but a perfect exemplar of those who benefited from the grammar school system. One scene from his book nicely encapsulates his outlook: during his spell in the army he has a trip in a small plane in Egypt. The pilot makes a detour to give the passengers an aerial view of the pyramids. Rather than being bowled over by their grandeur, Carey is thinking of the slaves who built them, and of the tendency of humanity throughout history to separate itself into a pampered elite and a huge, suffering underclass.

So in Carey and Williams we have two open, attractive personalities, both with strongly expressed, left-leaning views, who are diametrically opposed over this point. I’d like to hear them debate it (I’m with Carey).

One final, unconnected point. My reading between the two autobiographical works of Vera Brittain and John Carey was David Mitchell’s latest novel The Bone Clocks. Both memoirs look back to earlier periods of scarcer resources and greater austerity. The Bone Clocks ends in 2043, when climate change and dwindling energy sources are eating away at the comforts we now take for granted, and the characters are wistfully remembering an era of greater plenty. I’m hoping for a long retirement, of course – but perhaps not too long.

The Unsubmerged City

Commuting days until retirement: 77

To continue the First World War theme from the previous post, I saw that a film based on Vera Brittain’s classic memoir of her early life – Testament of Youth, with that war as the central and dominating event – was soon to be released. I’ve often heard of the book but have never read it, and much prefer to see a book-based film only after having been able to immerse myself in the atmosphere of the book itself. So I’ve now done that, and am very glad that I did. It recreates the reality of that distant period in a way that could only have been managed by a writer who experienced both the best and worst that it had to offer.

Love without dignity

We first meet Vera Brittain as a girl growing up in the rather stultifying atmosphere of a middle class Edwardian household. Meeting some of her brother’s school friends might be an opening to the expanding possibilities of life, but:

The parental habit – then almost universally accepted as ‘correct’ where daughters were concerned – of inquisition into each day’s proceedings made private encounters, even with young men in the same town, almost impossible without a whole series of intrigues and subterfuges which robbed love of all its dignity.

Eventually however she does fall in love with Roland Leighton, one of the group of friends, and an especially brilliant literature and classics scholar who scoops almost all the school prizes available to him. But the experience of a deepening relationship was then very different from today’s typical expectations:

We sat on the sofa till midnight, talking very quietly. The stillness, heavy-laden with the dull oppression of the snowy night, became so electric with emotion that we were frightened of one another, and dared not let even our fingers touch for fear that the love between us should render what we both believed to be decent behaviour suddenly unendurable….
I was still incredibly ignorant. I had read, by then, too much to have failed to acquire a vague and substantially correct idea of the meaning of marriage, but I did not yet understand the precise nature of the act of union. My ignorance, however, was incapable of disturbing my romantic adoration, for I knew now for certain that whatever marriage might involve in addition to my idea of it, I could not find it other than desirable.

Vera Brittain

Vera Brittain as a VAD (illustration from book)

But by this point – early 1915 – the war is under way, and soon Roland, as well as her brother Edward and other friends, are away in military training, eventually to be involved in action as officers. Vera has already gone up to Oxford, women having been able to study at the university since 1876, but – bizarrely to our modern minds – not to take degrees. (Having returned to study after the war, she became one of the first who did.) But feeling she must share the experiences of those she loves in one of the few ways that she can, by the summer of that year she is becoming a VAD (Voluntary Aid Detachment) nursing assistant – one of the women drafted in to nurse the wounded in that war. With minimal training, they were pitched into dealing with men who were often dying in front of their eyes, many with wounds that would be a challenge for the most experienced nurse.

In addition, she is of course unversed in practical tasks in ways that many middle class girls of that era were: she describes how she needed instruction in how to boil an egg. And more importantly for her work, there are other areas of ignorance:

Throughout my two decades of life, I had never looked upon the nude body of an adult male; I had never even seen a naked boy-child since the nursery days when, at the age of four or five, I used to share my evening baths with Edward. I had therefore expected, when I first started nursing, to be overcome with nervousness and embarrassment, but, to my infinite relief, I was conscious of neither. Towards the men I came to feel an almost adoring gratitude for their simple and natural acceptance of my ministrations. Short of actually going to bed with them, there was hardly an intimate service that I did not perform for one or another in the course of four years, and I still have reason to be thankful for the knowledge of masculine functioning which the care of them gave me, and for my early release from the sex-inhibitions that even to-day – thanks to the Victorian tradition which up to 1914 dictated that a young woman should know nothing of men but their faces and their clothes until marriage pitchforked her into an incompletely visualised and highly disconcerting intimacy – beset many of my female contemporaries, both married and single.

The reality of war

We see how the war explosively disrupted the hardened attitudes of the time in so many fundamental ways; but of course the core of Vera’s experience was that of death – at first hand in her nursing work, and fearfully anticipated in relation to her fiancé, brother and friends. Letters are nervously sent to, and received from, the front, and we experience her emotional swings as good and bad news of the fighting is received. As many must have done, they agree on coded phrases which will bypass censorship and give those at home clues to what is happening.

Roland anticipates that he will get through the war, nevertheless feeling it would be fitting to have received some wound as a token of what he has been through. But, as early as Christmas 1915, Vera hears that he has died in action, shot by an enemy sniper. Numbly, she buries herself in her nursing work for the rest of the war. Her two closest male friends, members of the group formed at school with Roland and Edward, both succumb in turn: one is killed outright; the other, Victor Richardson, is blinded and brought back to hospital to recuperate, but then dies of his wounds. Her brother Edward, perhaps the quietest of the group, goes on to show great courage and wins the Military Cross. But finally, in 1918, he too dies in the fighting on the Austro-Italian border.

Edward Brittain, Roland Leighton and another friend, Victor Richardson (illustration from book)

I found that Vera Brittain’s writing sometimes has something of the verbose, circumlocutory quality of the Victorian tradition she had inherited; but when she describes the War period, driven by such enormous emotional stresses, it becomes more direct, powerful and evocative. At the same time some of the photographs included in the book brought home to me as much as anything what it must have felt like to live through that time. Not pictures of fighting or of the wounded, but simply of Vera’s brother and friends posed in relaxed groups for the camera: first in school dress at Uppingham, then in military uniform, later with freshly sprouted moustaches; the sequence then ends abruptly and shockingly with photographs of their graves.

The pacifist’s task

In the 1920s we see Vera working for what was then the League of Nations, and throwing herself into pacifist causes – but it’s a nuanced and intelligent pacifism. She writes of the heroism that war can draw out:

It is, I think, this glamour, this magic, this incomparable keying up of the spirit in a time of mortal conflict, which constitute the pacifist’s real problem – a problem still incompletely imagined, and still quite unsolved. The causes of war are always falsely represented; its honour is dishonest and its glory meretricious, but the challenge to spiritual endurance, the intense sharpening of all the senses, the vitalising consciousness of common peril for a common end, remain to allure those boys and girls who have just reached the age when love and friendship and adventure call more persistently than at any later time. The glamour may be the mere delirium of fever, which as soon as war is over dies out and shows itself for the will-o’-the-wisp that it is, but while it lasts no emotion known to man seems as yet to have quite the compelling power of this enlarged vitality…
Since those years it has often been said by pacifists… that war creates more criminals than heroes; that, far from developing noble qualities in those who take part in it, it brings out only the worst. If this were altogether true, the pacifist’s aim would be, I think, much nearer of attainment than it is. Looking back upon the psychological processes of us who were very young sixteen years ago, it seems to me that his task – our task – is infinitely complicated by the fact that war, while it lasts, does produce heroism to a far greater extent than it brutalises.

I’m lucky never to have been involved in a war, and have no idea whether I could have coped with the experience at all. But this sums up for me the paradox of how war can bring out qualities of bravery and selflessness in those who might never have been called upon to show them, were it not for the bumbling of politicians and the posturing of dictators. And perhaps that paradox was more painfully sharp in World War I than in any other war before or since.

Brittain travels around Europe as part of her work, and experiences the festering resentment brought about by the post-war settlements, realising presciently that another war is a possibility, and wondering sadly what sort of cause it was for which those she loved had died.

War and literature

But perhaps the most important theme of the book is the fight to preserve the life of literature in the face of the rampant destructiveness of war. There is Vera’s own underlying ambition to write, set against her war work – all-encompassing in time, energy and emotion – as well as her almost-missed university education. But more glaringly obvious is the cutting short of thousands of promising, talented lives such as Roland’s. And her brother Edward had musical ability and enthusiasm which he was never able to develop further.

As the War gets under way and Vera’s friends are sent away, letters between them include poems and other writing – their own as well as that of others. In 1915 Vera sends Roland a leading article she has clipped from The Times, whose title I have borrowed for this post. In the book she quotes a passage:

A medieval fancy that still lingers, ghost-like, on the more lonely sea-shores, such as that Breton one so tenderly described by Renan, is the legend of the submerged city. It lies out there barely hidden under the waves, and on a still summer eve they say you may hear the music of its Cathedral bells. One day the waters will recede and the city in all its old beauty be revealed again. Might this not serve to figure the actual conditions of literature, in the nobler sense of the term, submerged as that seems to many to be by the high tide of war? Thus submerged it seemed, at any rate, to the most delicate of our literary artists, who was lately accounting for his disused pen to an aggrieved friend. ‘I have no heart,’ he said, ‘for literature in this war; we can only have faith that it is still there under the waters, and will some day re-emerge.’ . . . There is fortunately no truth in the idea of a sunken literature. A function of the spirit, it can never be submerged, or, indeed, as much as touched by war or any other external thing. It is an inalienable possession and incorruptible part of man.

And of course against whatever literature we might imagine never appeared, because of the destruction of those who would have created it, the war generated a whole body of work which would not otherwise have existed – Brittain’s Testament of Youth being one example.

Je suis Charlie

One of the reasons that I was struck by that ‘unsubmerged city’ passage was that I read it at about the same time that the Charlie Hebdo murders were committed in Paris. A hundred years on, we may be dealing with an entirely different situation and a kind of literature undreamt of in those earlier years, but compare these early 20th century sentiments to the ardent faces of the crowds waving pens and pencils in response to the shooting of the journalists and cartoonists.

While moved by the Parisian expressions of feeling, I couldn’t help thinking at the same time of the far greater atrocities committed recently by Islamic extremists in Nigeria and Pakistan – not to mention Syria and Iraq. The Western media devoted far more space to Charlie Hebdo than to these – perhaps understandably, since they are closer to home and threaten our own Western culture. But I hope and expect that the floods of ignorance, fanaticism and brutality will not in the end submerge the metaphorical cities which form the true and established cultures of those other, more distant places. They are surely under a far greater threat than our own.

The Mathematician and the Surgeon

Commuting days until retirement: 108

After my last post, which, among other things, compared differing attitudes to death and its aftermath (or absence of one) on the part of Arthur Koestler and George Orwell, here’s another fruitful comparison. It seemed to arise by chance from my next two commuting books, and each of the two people I’m comparing, as before, has his own characteristic perspective on that matter. Unlike my previous pair both could loosely be called scientists, and in each case the attitude expressed has a specific and revealing relationship with the writer’s work and interests.

The Mathematician

The first writer, whose book I came across by chance, has been known chiefly for mathematical puzzles and games. Martin Gardner was born in Oklahoma USA in 1914; his father was an oil geologist, and it was a conventionally Christian household. Although not trained as a mathematician, and going into a career as a journalist and writer, Gardner developed a fascination with mathematical problems and puzzles which informed his career – hence the justification for his half of my title.

Gardner as a young man (Wikimedia)

This interest continued to feed the constant books and articles he wrote, and he was eventually asked to write the Scientific American column Mathematical Games, which ran from 1956 until the mid 1980s and for which he became best known; his enthusiasm and sense of fun shine through the writing of these columns. At the same time he was increasingly concerned with the many types of fringe belief that had no scientific foundation, and was a founder member of CSICOP, the organisation dedicated to exposing and debunking pseudoscience. Back in February last year I mentioned one of its other well-known members, the flamboyant and self-publicising James Randi. By contrast, Gardner was mild-mannered and shy, averse to public speaking and never courting publicity. He died in 2010, leaving behind him many admirers and a two-yearly convention – the ‘Gathering for Gardner’.

Before learning more about him recently, and reading one of his books, I had known his name from the Mathematical Games column, and had heard of his rigid rejection of things unscientific. I imagined some sort of flinty atheist, probably with a hard-nosed contempt for any fanciful or imaginative leanings – however sane and unexceptionable they might be – towards what might be thought of as things of the soul.

How wrong I was. His book that I’ve recently read, The Whys of a Philosophical Scrivener, consists of a series of chapters with titles of the form ‘Why I am not a…’ and he starts by dismissing solipsism (who wouldn’t?) and various forms of relativism; it’s a little more unexpected that determinism also gets short shrift. But in fact by this stage he has already declared that

I myself am a theist (as some readers may be surprised to learn).

I was surprised, and also intrigued. Things were going in an interesting direction. But before getting to the meat of his theism he spends a good deal of time dealing with various political and economic creeds. The book was written in the mid 80s, not long before the collapse of communism, which he seems to be anticipating (Why I am not a Marxist). But equally he has little time for Reagan or Thatcher, laying bare the vacuity of their over-simplistic political nostrums (Why I am not a Smithian).

Soon after this, however, he is striding into the longer grass of religious belief: Why I am not a Polytheist; Why I am not a Pantheist – so what is he? The next chapter heading is a significant one: Why I do not Believe the Existence of God can be Demonstrated. This is the key, it seems to me, to Gardner’s attitude – one to which I find myself sympathetic. Near the beginning of the book we find:

My own view is that emotions are the only grounds for metaphysical leaps.

I was intrigued by the appearance of the emotions in this context: here is a man whose day job is bound up with his fascination for the powers of reason, but who is nevertheless acutely conscious of the limits of reason. He refers to himself as a ‘fideist’ – one who believes in a god purely on the basis of faith, rather than any form of demonstration, either empirical or through abstract logic. And if those won’t provide a basis for faith, what else is there but our feelings? This puts Gardner nicely at odds with the modish atheists of today, like Dawkins, who never tires of telling us that he too could believe if only the evidence were there.

But at the same time he is squarely in a religious tradition which holds that ultimate things are beyond the instruments of observation and logic that are so vital to the secular, scientific world of today. I can remember my own mother – unlike Gardner, a conventional Christian believer – being very definite on that point. And it reminds me of some of the writings of Wittgenstein; Gardner does in fact refer to him, in the context of the free will question. I’ll let him explain:

A famous section at the close of Ludwig Wittgenstein’s Tractatus Logico-Philosophicus asserts that when an answer cannot be put into words, neither can the question; that if a question can be framed at all, it is possible to answer it; and that what we cannot speak about we should consign to silence. The thesis of this chapter, although extremely simple and therefore annoying to most contemporary thinkers, is that the free-will problem cannot be solved because we do not know exactly how to put the question.

This mirrors some of my own thoughts about that particular philosophical problem – a far more slippery one than those on either side of it often claim, in my opinion (I think that may be a topic for a future post). I can add that Gardner was also on the unfashionable side of the question which came up in my previous post – that of an afterlife; and again he holds this out as a matter of faith rather than reason. He explores the philosophy of personal identity and continuity in some detail, always concluding with the sentiment ‘I do not know. Do not ask me.’ His underlying instinct seems to be that there has to be something more than our bodily existence, given that our inner lives are so inexplicable from the objective point of view – so much more than our physical existence. ‘By faith, I hope and believe that you and I will not disappear for ever when we die.’ By contrast, Arthur Koestler, you may remember, wrote in his suicide note of ‘tentative hopes for a depersonalised afterlife’ – but, as it turned out, these hopes were based partly on the sort of parapsychological evidence which was anathema to Gardner.

And of course Gardner was acutely aware of another related mystery – that of consciousness, which he finds inseparable from the issue of free will:

For me, free will and consciousness are two names for the same thing. I cannot conceive of myself being self-aware without having some degree of free will… Nor can I imagine myself having free will without being conscious.

He expresses utter dissatisfaction with the approach of arch-physicalists such as Daniel Dennett, who, as he says, ‘explains consciousness by denying that it exists’. (I attempted to puncture this particular balloon in an earlier post.)

Gardner in later life (Konrad Jacobs / Wikimedia)

Gardner places himself squarely within the ranks of the ‘mysterians’ – a deliberately derisive label applied by their opponents to those thinkers who conclude that these matters are mysteries probably beyond our capacity to solve. Among their ranks is Noam Chomsky: Gardner cites a 1983 interview in which the grand old man of linguistics expresses his attitude to the free will problem.

The Surgeon

And so to the surgeon of my title; if you’ve read one of my other blog posts you will already have met him – he’s a neurosurgeon named Henry Marsh, and I wrote a post based on a review of his book Do No Harm. Well, now I’ve read the book, and found it as impressive and moving as the review suggested. Unlike many in his profession, Marsh is a deeply humble man who is disarmingly honest about the emotional impact of the work he does. He is simultaneously compelled towards, and fearful of, the enormous power of the neurosurgeon both to save and to destroy. His narrative swings between tragedy and elation, by way of high farce when he describes some of the more ill-conceived management ‘initiatives’ at his hospital.

A neurosurgical operation (Mainz University Medical Centre)

The interesting point of comparison with Gardner is that Marsh – a man who daily manipulates what we might call physical mind-stuff – the brain itself – is also awed and mystified by its powers:

There are one hundred billion nerve cells in our brains. Does each one have a fragment of consciousness within it? How many nerve cells do we require to be conscious or to feel pain? Or does consciousness and thought reside in the electrochemical impulses that join these billions of cells together? Is a snail aware? Does it feel pain when you crush it underfoot? Nobody knows.

The same sense of mystery and wonder as Gardner’s; but approached from a different perspective:

Neuroscience tells us that it is highly improbable that we have souls, as everything we think and feel is no more or no less than the electrochemical chatter of our nerve cells… Many people deeply resent this view of things, which not only deprives us of life after death but also seems to downgrade thought to mere electrochemistry and reduces us to mere automata, to machines. Such people are profoundly mistaken, since what it really does is upgrade matter into something infinitely mysterious that we do not understand.

Henry Marsh

This of course is the perspective of a practical man – one who is emphatically working at the coal face of neurology, and far more familiar with the actual material of brain tissue than armchair speculators like me. While I was reading his book, although deeply impressed by this man’s humanity and integrity, what disrespectfully came to mind was a piece of irreverent humour once told to me by a director of a small company I used to work for, which was closely connected to the medical industry. It was a sort of handy cut-out-and-keep guide to the different types of medical practitioner:

Surgeons do everything and know nothing. Physicians know everything and do nothing. Psychiatrists know nothing and do nothing. Pathologists know everything and do everything – but the patient’s dead, so it’s too late.

Grossly unfair to all of them, of course, but nonetheless funny, and perhaps containing a certain grain of truth. Marsh, belonging to the first category, perhaps embodies some of the aversion to dry theory that this caricature hints at: what matters to him ultimately, as a surgeon, is the sheer down-to-earth physicality of his work, guided by the gut instincts of his humanity. We hear from him about some members of his profession who seem aloof from the enormity of the dangers it embodies, and seem able to proceed calmly and objectively with what he sees almost as the detachment of the psychopath.

Common ground

What Marsh and Gardner seem to have in common is the instinct that dry, objective reasoning only takes you so far. Both trust the power of their own emotions, and their sense of awe. Both, I feel, are attempting to articulate the same insight, but from widely differing standpoints.

Two passages, one from each book, seem to crystallize both the similarities and the differences between the respective approaches of the two men, both of whom seem to me admirably sane and perceptive, if radically divergent in many respects. First Gardner, emphasising in a Wittgensteinian way that describing how things appear to be is perhaps a more useful activity than attempting to pursue any ultimate reasons:

There is a road that joins the empirical knowledge of science with the formal knowledge of logic and mathematics. No road connects rational knowledge with the affirmations of the heart. On this point fideists are in complete agreement. It is one of the reasons why a fideist, Christian or otherwise, can admire the writings of logical empiricists more than the writings of philosophers who struggle to defend spurious metaphysical arguments.

And now Marsh – mystified, as we have seen, as to how the brain-stuff he manipulates daily can be the seat of all experience – having a go at reading a little philosophy in the spare time between sessions in the operating theatre:

As a practical brain surgeon I have always found the philosophy of the so-called ‘Mind-Brain Problem’ confusing and ultimately a waste of time. It has never seemed a problem to me, only a source of awe, amazement and profound surprise that my consciousness, my very sense of self, the self which feels as free as air, which was trying to read the book but instead was watching the clouds through the high windows, the self which is now writing these words, is in fact the electrochemical chatter of one hundred billion nerve cells. The author of the book appeared equally amazed by the ‘Mind-Brain Problem’, but as I started to read his list of theories – functionalism, epiphenomenalism, emergent materialism, dualistic interactionism or was it interactionistic dualism? – I quickly drifted off to sleep, waiting for the nurse to come and wake me, telling me it was time to return to the theatre and start operating on the old man’s brain.

I couldn’t help noticing that these two men – one unconventionally religious and the other not religious at all – seem between them to embody those twin traditional pillars of the religious life: faith and works.

A Singular Notion

Commuting days until retirement: 168

I’ve been reading about the future. Well, one man’s idea of the future, anyway – and of course when it comes to the future, people’s ideas about it are really all we can have. This particular writer obviously considers his own ideas to be highly upbeat and optimistic, but others may view them with apprehension, if not downright disbelief – and I share some of their reservations.

Ray Kurzweil
(Photo: Roland Dobbins / Wikimedia Commons)

The man in question is Ray Kurzweil, and it has to be said that he is massively well informed – about the past and the present, anyway; his claims to knowledge of the future are what I want to examine. He is a Director of Engineering at Google, but has also founded any number of high-tech companies, and is credited with a big part in inventing flatbed scanners, optical character recognition, speech synthesis and speech recognition. On top of all this, he is quite a philosopher, and has carried on debates with other philosophers about the basis of his ideas; we hear about some of these debates in the book I’ve been reading.

The book is The Singularity is Near, and its length (500 dense pages, excluding notes) is partly responsible for the elapsed time since my last substantial post. Kurzweil is engagingly enthusiastic about his enormous stock of knowledge, so much so that he is unable to resist laying the exhaustive details of every topic before you. Repeatedly you find yourself a little punch drunk under the remorseless onslaught of facts – at which point he has an engaging way of saying ‘I’ll be dealing with that in more detail in the next chapter.’ You feel that perhaps quite a bit of the content would be better accommodated in endnotes – were it not for the fact that nearly half the book consists of endnotes as it is.

Density

To my mind, the argument of the book has two principal premises, the first of which I’d readily agree to, but the second of which seems to me highly dubious. The first idea is closely related to the ‘Singularity’ of the title. A singularity is a concept imported from mathematics, but is perhaps more familiar in the context of black holes and the big bang. In a black hole, enormous amounts of matter become so concentrated under their own gravitational force that they shrink to a point of, well, as far as we can tell, infinite density. (At this point I can’t help thinking of Kurzweil’s infinitely dense prose style – perhaps it is suited to his topic.) But what’s important about this for our present purposes is the fact that some sort of boundary has been crossed: things are radically different, and all the rules and guidelines that we have previously found useful in investigating how the world works no longer apply.

To understand how this applies, by analogy, to our future, we have to introduce the notion of exponential growth – that is, growth not by regular increments but by multiples. A well known illustration of the surprising power of this is the old fable of the King who has a debt of gratitude to one of his subjects, and asks what he would like as a reward. The man asks for one grain of wheat corresponding to the first square of the chess board, two for the second, four for the third, and so on up to the sixty-fourth, doubling each time. At first the King is incredulous that the man has demanded so little, but of course soon finds that the entire output of his country would fall woefully short of what is asked. (The number of grains works out at 18,446,744,073,709,551,615 – of a similar order to, say, the estimated number of grains of sand in the world.)
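The arithmetic behind the fable is easy to check for yourself; here is a minimal sketch in Python (my choice of illustration, not anything from Kurzweil's book):

```python
# The chessboard fable: one grain on the first square, then doubling
# on each of the remaining 63 squares.
total = sum(2**k for k in range(64))  # 1 + 2 + 4 + ... + 2**63

# The doubling sum has a neat closed form: 2**64 - 1.
assert total == 2**64 - 1

print(f"{total:,}")  # 18,446,744,073,709,551,615
```

The closed form shows why the King is caught out: each new square holds one more grain than all the previous squares put together.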

Such unexpected expansion is the hallmark of exponential growth – however gradually it rises at first, eventually the curve will always accelerate explosively upward. Kurzweil devotes many pages to arguing that the advance of human technical capability follows just such a trajectory. One frequently quoted example is what has become known as Moore’s law: in 1965 Gordon Moore – later a co-founder of the chip company Intel – extrapolated from what had then been achieved and asserted that the number of processing elements that could be fitted on to a chip of a given size would double every year. This was later modified to two years, but the growth has nevertheless remained exponential, and there is no reason, short of global calamity, to think it will stop in the foreseeable future. The evidence is all around us: thirty years ago, equipment with the power of a modern smartphone would have been a roomful of immobile cabinets costing thousands of pounds.
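To get a feel for how quickly the doubling compounds, here is a rough back-of-the-envelope sketch (the two-year doubling period is the commonly quoted figure; the thirty-year window is simply my illustration, matching the smartphone comparison above):

```python
# Rough Moore's-law extrapolation: capacity doubling every two years,
# compounded over a thirty-year span.
doubling_period_years = 2
span_years = 30

growth_factor = 2 ** (span_years / doubling_period_years)  # 15 doublings
print(growth_factor)  # 32768.0
```

Fifteen doublings multiply capacity by more than thirty thousand, which is why a room of cabinets shrinks to a pocketable phone.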

Accelerating returns

That’s of course just one example; taking a broader view we could look, as Kurzweil does, at the various revolutions that have transformed human life over time. The agricultural revolution – the transition from the hunter-gatherer way of life, via subsistence farming, to the systematic growth and distribution of food – took many centuries, or even millennia. The industrial revolution could be said to have been even greater in its effects over a mere century or two, while the digital revolution we are currently experiencing has made radical changes in just the last thirty years or so. Kurzweil argues that each of these steps forward provides us with the wherewithal to effect further changes even more rapidly and efficiently – hence the exponential nature of our progress. He refers to this as ‘The Law of Accelerating Returns’.

So if we are proceeding by ever-increasing steps forward, what is our destiny – what will be the nature of the exponential explosion that we must expect? This is the burden of Kurzweil’s book, and the ‘singularity’ after which nothing will be the same. His projection of our progress towards this point is based on a triumvirate of endeavours which he refers to confidently with the acronym GNR: Genetics, Nanotechnology and Robotics. Genetics will continue its progress – exponentially – in finding cures for the disorders which limit our life span, as well as disentangling many of the mysteries of how we – and our brains – develop. For Nanotechnology, Kurzweil has extensive expectations. Tiny, ultimately self-reproducing machines could be sent out into the world to restructure matter and turn innocent lumps of rock into computers with so far undreamt-of processing power. And they could journey inwards, into our bodies, ferreting out cancer cells and performing all sorts of repairs that would be difficult or impossible now. Kurzweil’s enthusiasm reaches its peak when he describes these microscopic helpers travelling round the blood vessels of our brains, scanning their surroundings and reporting back over wi-fi on what they find. This would be part of the grand project of ‘reverse engineering the brain’.

And with the knowledge gained thus, the third endeavour, Robotics, already enlisted in the development of the nanobots now navigating our brains, would come into its own. Built on many decades of computing experience, and enhanced by an understanding of how the human brain works, a race of impossibly intelligent robots, which nevertheless boast human qualities, would be born. Processing power is still of course expanding exponentially, adopting any handy lumps of rock as its substrate, and Kurzweil sees it expanding across the universe as the possibilities of our own planet are exhausted.

Cyborgs

And so what of us poor, limited humans? We don’t need to be left behind, or disposed of somehow by our vastly more capable creations, according to Kurzweil. Since the functionality of our brains, both in general and on an individual basis, can be replicated within the computing power which is all around us, he envisages us enhancing ourselves by technology. Either we develop the ability to ‘upload the patterns of an actual human into a suitable non-biological, thinking substrate’, or we simply continue the development of devices like neural implants until nanotechnology is actively extending and even replacing our biological faculties. ‘We will then be cyborgs,’ he explains, and ‘the nonbiological portion of our intelligence will expand its powers exponentially.’

If some of the above makes you feel distinctly queasy, then you’re not alone. A number of potential problems, even disasters, will have occurred to you. But Kurzweil is unfailingly upbeat; while listing a number of ways that things could go wrong, he reasons that all of them can be avoided. And in a long section at the end he lists many objections by critics and provides answers to all of them.

Meanwhile, back in the future, the singularity is under way; and perhaps the most surprising aspect of it is how soon Kurzweil sees it happening. Basing his prediction on an exhaustive analysis, he sets it at 2045. That is not a typo on my part, but a date well within the lifetime of many of us. I’ll be 97 by then, if I’m alive at all, which I don’t expect to be, exponential advances in medicine notwithstanding. It so happens that Kurzweil himself was born in the same year as me; and as you might expect, this energetic man fully expects to see the day – indeed, to be able to upload himself and continue into the future. He tells us how, once relatively unhealthy and suffering from type II diabetes, he took himself in hand ‘from my perspective as an inventor’. He immersed himself in the medical literature and, with the collaboration of a medical expert, aggressively applied a range of therapies to himself. At the time of writing the book, he proudly relates, he was taking 250 supplement pills each day and a half-dozen intravenous nutritional therapies per week. As a result he was judged to have attained a biological age of 40, although he was then 56 in calendar years.

This also brings us to the second – to my mind rather more dubious – plank upon which his vision of the future rests. As we have seen, the best prospects for humanity, he claims, lie not in the messy and unreliable biological packages which have taken us thus far, but as entities somehow (dis)embodied in the substrate of the computing power which is expanding to fill ever more of the known universe.

Dialogue

Before examining this proposition further, I’d like to mention that, while Kurzweil’s book is hard going at times, it does have some refreshing touches. One of these is the frequent dialogues introduced at the ends of chapters, where Kurzweil himself (‘Ray’) discusses the foregoing material with a variety of characters. These include, among others, a woman from the present day and her uploaded self from a hundred years hence, as well as various luminaries from the past and present: Ned Ludd (the original Luddite from the 18th century), Charles Darwin, Sigmund Freud and Bill Gates. One nicely conceived dialogue involves a couple of primordial bacteria discussing the pros and cons of clumping together and giving up some of their individuality in order to form larger organisms; we are implicitly invited to compare the reluctance of one of them to enter a world full of greater possibilities with our own apprehension about the singularity.

So in the same spirit, I have taken the opportunity here to discuss the matter with Kurzweil directly, and I suppose I am going to be the present day equivalent of the reluctant bacterium. (Most of the claims he makes below are not put into his mouth by me, but come from the book.)

DUNCOMMUTIN: Ray, thank you for taking the trouble to visit my blog.

RAY: That’s my pleasure.

DUNCOMMUTIN: In the book you provide answers to a number of objections – many of them technically based ones which address whether the developments you outline are possible at all. I’ll assume that they are, but raise questions about whether we should really want them to happen.

RAY: OK. You won’t be the first to do that – but fire away.

DUNCOMMUTIN: Well, “fire away” is an apt phrase to introduce my first point: you have some experience of working on defence projects, and this is reflected in some of the points you make in the book. At one point you remark that ‘Warfare will move toward nanobot-based weapons, as well as cyber-weapons’. With all this hyper-intelligence at the service of our brains, won’t some of it reach the conclusion that war is a pretty stupid way of conducting things?

RAY: Yes – in one respect you have a point. But look at the state of the world today. Many people think that the various terrorist organisations that are gaining ever higher profiles pose the greatest threat to our future. Their agendas are mostly based on fanaticism and religious fundamentalism. I may be an optimist, but I don’t see that threat going away any time soon. Now there are reasoned objections to the future that I’m projecting, like your own – I welcome these, and view such debate as important. But inevitably there will be those whose opposition will be unreasonable and destructive. Most people today would agree that we need armed forces to protect our democracy and, indeed, our freedom to debate the shape of our future. So it follows that, as we evolve enhanced capabilities, we should exploit them to counter those threats. But going back to your original point – yes, I have every hope that the exponentially increasing intelligence we will have access to will put aside the possibility of war between technologically advanced nations. And indeed, perhaps the very concept of a nation state might eventually disappear.

DUNCOMMUTIN: OK, that seems reasonable. But I want to look further at the notion of each of us being part of some pan-intelligent entity. There are so many potential worries here. I’ll leave aside the question of computer viruses and cyber-warfare, which you deal with in the book. But can you really see this future being adopted wholesale? Before going into some of the reservations I have, I’d want to say that many will share them.

RAY: Imagine that we have reached that time – not so far in the future. I and like-minded people will already be taking advantage of the opportunities to expand our intelligence, while, if I may say so, you and your more conservative-minded friends will not have. But expanded intelligence makes you a better debater. Who do you think will win the argument?

DUNCOMMUTIN: Now you’re really worrying me. Being a better debater isn’t the same as being right. Isn’t this just another way of saying ‘might is right’ – the philosophy of the dictator down the ages?

RAY: That’s a bit unfair – we’re not talking about coercion here, but persuasion – a democratic concept.

DUNCOMMUTIN: Maybe, but it sounds very much as if, with all this overwhelming computer power, persuasion will very easily become coercion.

RAY: Remember that it is from the most technologically advanced nations that these developments will be initiated – and they are democracies. I see democracy and the right of choice being kept as fundamental principles.

DUNCOMMUTIN: You might, Ray – but what safeguards will we have to retain freedom of choice and restrain any over-zealous technocrats? However, I won’t pursue this line further. Here’s another thing that bothers me. There’s an old saying: ‘To err is human, but it takes a computer to really foul things up.’ If you look at the history of recent large-scale IT projects, particularly in the public sector, you will come across any number of expensive flops that had to be abandoned. Now what you are proposing could be described, it seems to me, as the most ambitious IT project yet. What could happen if I commit the functioning of my own brain to a system which turns out to have serious flaws?

RAY: The problems you are referring to are associated with what we will come to see as the embryonic stage – the dark ages, if you will – of computing. It’s important to recognize that the science of computing is advancing by leaps and bounds, and that software exists which assists in the design of further software. Ultimately program design will be the preserve, not of sweaty pony-tailed characters slaving away in front of screens, but of proven self-organising software entities whose reliability is beyond doubt. Once again, as software principles are developed, proven and applied to the design of further software, we will see exponential progression in this area.

DUNCOMMUTIN: That reassures me in one way, but gives me more cause for concern in another. I am thinking of what I call the coffee machine scenario.

RAY: Coffee machine?

DUNCOMMUTIN: Yes. In the office where I work there are state-of-the-art coffee machines, fully automated. You only have to touch a few icons on a screen to order a cup of coffee, tea, or other drink just as you like it, with the right proportions of milk, sugar, and so on. The drink you specify is then delivered within seconds. The trouble is, it tastes pretty ghastly, rendering the whole enterprise effectively pointless. What I am suggesting is that, given all the supreme and unimaginably complex technical wizardry that goes into our new existence, it’s going to be impossible for us humans to keep track of where it’s all going; and the danger is that the point will be missed: the real essence of ourselves will be lost or destroyed.

RAY: OK, I think I see where you’re going. First of all, let me reassure you that nanoengineered coffee will be better than anything you’ve tasted before! But, to get to the substantial point, you seem a bit vague about what this ‘essence’ is. Remember that what I am envisaging is a full reverse engineering of the human brain, and indeed body. The computation which results would mirror everything we think and feel. How could this fail to include what you see as the ‘essence’? Our brains and bodies are – in essence – computing processes; computing underlies the foundations of everything we care about, and that won’t be changing.

DUNCOMMUTIN: Well, I could find quite a few people who would say that computing underlies everything they hate – but I accept that’s a slightly frivolous comment. To zero in on this question of essence, let’s look at one aspect of human life – sense of humour. Humour comes at least partly under the heading of ‘emotion’, and like other emotions, it involves bodily functions, most importantly in this case laughing. Everyone would agree that it’s a pleasant and therapeutic experience.

RAY: Let me jump in here to point out that while many bodily functions may no longer be essential in a virtual computation-driven world, that doesn’t mean they have to go. Physical breathing, for example, won’t be necessary, but if we find breathing itself pleasurable, we can develop virtual ways of having this sensual experience. The same goes for laughing.

DUNCOMMUTIN: But it’s not so much the laughing itself as what gives rise to it which interests me. Humour often involves the apprehension of things being wrong, or other than they should be – a gap between an aspiration and what is actually achieved. In this perfect virtual world, it seems as if such things will be eliminated. Maybe we will find ourselves still able to laugh virtually – but have nothing to virtually laugh at.

RAY: You’ll remember how I’ve said in my book that in such a world there will be limitless possibilities when it comes to entertainment and the arts. Virtual or imagined worlds in which anything can happen, and in which things can go wrong, could be summoned at will. Such worlds could be immersive, and seem utterly real. These could provide all the entertainment and humour you could ever want.

DUNCOMMUTIN: There’s still something missing, to my mind. Irony, humour, artistic portrayals, whatever – all these have the power that they do because they are rooted in gritty reality, not in something we know to have been erected as some form of electronic simulation. In the world you are portraying it seems to me that everything promises to have a thinned-out, ersatz quality – much like the coffee I mentioned a little while back.

RAY: Well if you really feel that way, you may have to consider whether it’s worth this small sacrifice for the sake of eliminating hunger, disease, and maybe death itself.

DUNCOMMUTIN: Eliminating death – that raises a whole lot more questions, and if we go into them this blog entry will never finish. I have just one more point I would like to put to you: the question of consciousness, and how that can be preserved in a new substrate or mode of existence. I have to say I was impressed to see that, unlike many commentators, you don’t dodge the difficulty of this question, but face it head-on.

RAY: Thank you. Yes, the difficulty is that, since it concerns subjective experience, this is the one matter that can’t be resolved by objective observation. It’s not a scientific question but a philosophical one – indeed, the fundamental philosophical question.

DUNCOMMUTIN: Yes – but you still evidently believe that consciousness would transfer to our virtual, disembodied life. You cross swords with John Searle, whose Chinese Room argument readers of this blog will have come across. His view that consciousness is a fundamentally biological function that could not exist in any artificial substrate is not compatible with your envisaged future.

RAY: Indeed. I think the Chinese Room argument is tautologous – a circular argument – and I don’t see any basis for his belief that consciousness is necessarily biological.

DUNCOMMUTIN: I agree with you about the supposed biological nature of consciousness – perhaps for different reasons – but not about the Chinese Room. However, there isn’t space to go into that here. What I want to know is, what makes you confident that your virtualised existence will be a conscious one – in other words, that you will actually have future experiences to look forward to?

RAY: I’m a patternist. That is, it seems to me that conscious experience is an inevitable emergent property of a certain pattern of functionality, in terms of relationships between entities and how they develop over time. Our future technology will be able to map the pattern of these relationships to any degree of detail, and, by virtue of that, consciousness will be preserved.

DUNCOMMUTIN: This seems to me to be a huge leap of faith. Is it not possible that you are mistaken, and that your transfer to the new modality will effectively bring about your death? Or worse, some form of altered, and not necessarily pleasant, experience?

RAY: On whether there will be any subjective experience at all: if the ‘pattern’ theory is not correct then I know of no other coherent one – and yes, I’m prepared to stake my future existence on that. On whether the experience will be altered in some way: as I mentioned, we will be able to model brain and body patterns to any degree of detail, so I see no reason why that future experience should not be of the same quality.

DUNCOMMUTIN: Then the big difference is that I don’t see the grounds for having the confidence that you do, and would prefer to remain as my own imperfect, mortal self. Nevertheless, I wish you the best for your virtual future – and thanks again for answering my questions.

RAY: No problem – and if you change your mind, let me know.


The book: Kurzweil, Ray: The Singularity is Near, Viking Penguin, 2005

For more recent material see: www.singularity.com and www.kurzweilai.net

Sharp Compassion

Commuting days until retirement: 260

The wounded surgeon plies the steel
That questions the distempered part;
Beneath the bleeding hands we feel
The sharp compassion of the healer’s art
Resolving the enigma of the fever chart.
   (from T. S. Eliot – East Coker)

A book review prompted this post: the book in question is Do No Harm: Stories of Life, Death and Brain Surgery, by Henry Marsh. I’ve read some more reviews since, and heard the author on the radio. I think I shall be devoting some commuting time to his book in the near future. It’s a painfully honest account of his life as a brain surgeon, a calling from which he’s about to retire. Painful, in that he is searchingly candid about failures as well as successes, and about the self-doubt which he has experienced throughout his career. In fact his first chapter begins – rather alarmingly, after a life in the profession: ‘I often have to cut into the brain and it is something I hate doing.’

Some examples: we hear what it is like to explain to a woman patient what has happened when one of the mishaps that are always a risk in brain surgery has left her paralysed down one side. He tells her he knows from experience that there is a good chance it will improve.

‘I trusted you before the operation,’ she said. ‘Why should I trust you now?’
I had no immediate reply to this and stared uncomfortably at my feet.

He describes visiting a Catholic nursing home which cares for patients with serious brain damage. He recognises some of the names on the doors, and realises them to be former patients of his own for whom things didn’t go as well as he would have hoped.

Marsh’s title Do No Harm is of course an ironic reference to the central tenet of the Hippocratic Oath; but we know that in modern medicine things aren’t as simple as that, and that every operation should only be undergone after a weighing up of the risks and likely downside against the benefits. These are never so stark as in neurosurgery, and, not surprisingly, its sheer stressfulness ranks above that of intervention in other parts of the body. Marsh rather disarmingly says, against the popular conception, that it mostly isn’t as technically complex as many other kinds of surgery – it’s just that there is so much more at stake when errors occur. He quotes an orthopaedic surgeon who attends Marsh himself for a broken leg and who, hearing what Marsh does, clearly feels much happier being in orthopaedics. “Neurosurgery,” he says, “is all doom and gloom.”

I have often imagined what it must be like to be wielding a scalpel, poised over a patient’s skin, and then to be cutting into a stomach or a leg – never mind a brain. Of course a long training, learning from the experts and eventually taking your own first steps, would give you the confidence to be able to cope with this; but it’s to Marsh’s credit that he has retained such a degree of self-doubt. Indeed, he speculates on the personalities of some of the great neurosurgeons of the past, who didn’t have the technical facilities of today. Advances could often only be made by taking enormous risks, and Marsh imagines that it would sometimes have been necessary for them to insulate themselves against concern for the patient to a point that marks them as having not just overweening self-confidence, but positively psychopathic personalities.

I’m lucky not to have been under the knife myself very often, and never, thankfully, for the brain. But what I have experienced of medicine has shown me that personalities which, with their unusual degree of arrogance, verge on the psychopathic, are all too common in the higher reaches of the profession. One who sticks in my memory wasn’t actually treating me: he had been commissioned to supply case histories for a medical computer program that I was involved in producing. This was not neurosurgery, but another area of medicine, in which I was aware that he was pre-eminent. We went through these case histories, some of them very complex, which he explained. “I diagnosed that,” he would say, visibly preening himself. On the whole he was perfectly good to work with, provided you showed a decent amount of deference. But at one point he moved beyond the subject matter, to lay down how he thought the program should work. This was my area; I could see some problems with what he was saying, and explained them politely. There was an icy pause. Although never less than polite, he proceeded to make it clear that nobody, least of all a cypher like me, was authorised to challenge an opinion of his, on anything. (Later I went ahead and did it my way in any case.) This was back in the eighties, nearly 30 years ago. Thinking about him before writing this, I googled him, and sure enough, he’s still out there, working at a major hospital and with a Harley Street practice. He must indeed have had enormous ability to have been at the top of his area for so long; but as a practitioner of the healing art he was, like many others I suspect, definitely not cuddly.

On the radio programme I heard, Henry Marsh remarked that, if you accept praise for your successes, you must accept blame for the failures, and that there are medical practitioners who don’t follow that precept. I suspect that my friend described above would have been one of them; I have never encountered a level of self-regard that was so tangibly enormous. Perhaps we should be thankful that there are those who harness supreme technical skill to such overweening confidence – but you shudder a little at the thought of their scalpel in your brain. If this were to befall me, I’d much rather the knife was wielded by someone of Marsh’s cast of mind than by an equally skilful surgeon with all the humility of a bulldozer.

The Boiling Frog

Commuting days until retirement: 370

In my post a while back about psychogeography, I mentioned a book called Tunnel Visions by Christopher Ross. Ross is the man who took a job as a station assistant on the London Underground, primarily just to observe and reflect on his fellow humans as they travelled, the very readable result being this book. I had mislaid it in the disordered book-drifts that lie about our house, but finding it again recently I re-read it. In the book Ross mentions a previous professional job which he left in order to travel and experience other cultures – he occasionally refers to these by way of contextualising some of our own peculiarities. Returning to England, he took the Underground job to further his own particular philosophical investigations. I was interested to find that, in the book’s Amazon reviews, there’s one by someone who came across him in his original job, and who mentions that he was a very capable tax lawyer. Evidently his need to engage with the world in a fresh way outweighed any material benefits or intellectual satisfaction in the job. Here’s his preferred metaphor for the fate he had escaped:

Boiling frog

Wikimedia Commons / Arthur G Cox

The boiling frog was a favourite object lesson and it was always hopping into my mind. In order to push back the boundaries of scientific discovery, scientists found out that frogs have nervous systems poorly adapted to register slow incremental change. So if you sit one in a saucepan of water and slowly heat it, the poor frog will stay where it is and boil to death.
We become chiropodists and lawn mower salesmen by a series of imperceptible ‘choices’ and by the time we realise what we’ve done, we’re boiled.

Well, I’ve no doubt that the world is replete with happy and fulfilled chiropodists and lawn mower salesmen, but you know what he means. Of course there are those who have a life plan settled in their minds when they are barely out of childhood, and proceed to stick to it – I’ve met one or two. Michael Heseltine, the former Conservative minister, famously wrote his down on the back of an envelope at the start of his career. It involved making a lot of money, entering parliament, and audaciously ended with ‘Prime Minister’. Only that last step eluded him.

And there are those like Ross himself – the free spirits – who are determined to try and make the world yield up its secrets by taking a fresher and more unconventional approach, probably giving up much comfort and prosperity in order to do so.

But most of us are much more like the frog. You start off in the career stakes with one of the jobs that happen to be there when you need one – most likely the one where your personality chances to gel with the people in the interview. And sooner or later you happen to see an ad for another job with a better salary, and which you realise you now have the experience for – and so on. And by this time you may have children who need feeding and housing, and you can’t step off the treadmill even if you want to.

I rather think, looking around the train at my fellow commuters, that some of them have the air of well-cooked amphibians. Maybe I can claim not to be thoroughly boiled, but can’t deny that I am at least lightly braised. I envy Ross his courage and independence, but I wouldn’t have given up the chance to raise children: parents can find at least as much fulfilment, and valuable life experience, as any philosophical nomad.

Maybe that’s something he still aims to fit his life around; I don’t know – he doesn’t say. I have searched the internet to find out what he’s been up to in the 12 years or so since the book was published, but haven’t found anything. But I doubt whether he’s fallen back into the evil scientist’s flask.

What about you? I’d like to hear whether you consider yourself to be gently simmering, or good, fresh and raw.

Consciousness 3 – The Adventures of a Naive Dualist

Commuting days until retirement: 408

A long gap since my last post: I can only plead lack of time and brain-space (or should I say mind-space?). Anyhow, here we go with Consciousness 3:

Coronation

A high point for English Christianity in the 50s: the Queen’s coronation. I can remember watching it on a relative’s TV at the age of 5

I think I must have been a schoolboy, perhaps just a teenager, when I was first aware that the society I had been born into supported two entirely different ways of looking at the world. Either you believed that the physical world around us, sticks, stones, fur, skin, bones – and of course brains – was all that existed; or you accepted one of the many varieties of belief which insisted that there was more to it than that. My mental world was formed within the comfortable surroundings of the good old Church of England, my mother and father being Christians by conviction and by social convention, respectively. The numinous existed in a cosy relationship with the powers-that-were, and parents confidently consigned their children’s dead pets to heaven, without there being quite such a Santa Claus feel to the assertion.

But, I discovered, it wasn’t hard to find the dissenting voices. The ‘melancholy long withdrawing roar’ of the ‘sea of faith’ which Matthew Arnold had complained about in the 19th century was still under way, if you listened out for it. Ever since Darwin, and generations of physicists from Newton onwards, the biological and physical worlds had appeared to get along fine without divine support; and even in my own limited world I was aware of plenty of instances of untimely deaths of innocent sufferers, which threw doubt on God’s reputedly infinite mercy.

John Robinson

John Robinson, Bishop of Woolwich (Church Times)

And then in the 1960s a brick was thrown into the calm pool of English Christianity by a certain John Robinson, the Bishop of Woolwich at the time. It was a book called Honest to God, which sparked a vigorous debate that is now largely forgotten. Drawing on the work of other radical theologians, and aware of the strong currents of atheism around him, Robinson argued for a new understanding of religion. He noted that our notion of God had moved on from the traditional old man in the sky to a more diffuse being who was ‘out there’, but considered that this was also unsatisfactory. Any God whom someone felt they had proved to be ‘out there’ “would merely be a further piece of existence, that might conceivably have not been there”. Rather, he says, we must approach from a different angle.

God is, by definition, ultimate reality. And one cannot argue whether ultimate reality exists.

My pencilled zig-zags in the margin of the book indicate that I felt there was something wrong with this at the time. Later, after studying some philosophy, I recognised it as a crude form of Anselm’s ontological argument for the existence of God, which is rather more elegant, but equally unsatisfactory. But, to be fair, this is perhaps missing the point a little. Robinson goes on to say that “one can only ask what ultimate reality is like – whether it… is to be described in personal or impersonal categories.” His book proceeds to develop the notion of God as in some way identical with reality, rather than as a special part of it. One might cynically characterise this as a response to atheism of the form “if you can’t beat them, join them” – hence the indignation that the book stirred in religious circles.

Teenage reality

But, leaving aside the well-worn blogging topic of the existence of God, there was the teenage me, still wondering about ‘ultimate reality’, and what on earth, for want of a better expression, that might be. Maybe the ‘personal’ nature of reality which Robinson espoused was a clue. I was a person, and being a person meant having thoughts, experiences – a self, or a subjective identity. My experiences seemed to be something quite other than the objective world described by science – which, according to the ‘materialists’ of the time, was all that there was. What I was thinking of then was the topic of my previous post, Consciousness 2 – my qualia, although I didn’t know that word at the time. So yes, there were the things around us (including our own bodies and brains), our knowledge and understanding of which had been, and was, advancing at a great rate. But it seemed to me that no amount of knowledge of the mechanics of the world could ever explain these private, subjective experiences of mine (and, I assumed, of others). I was always strongly motivated to believe that there was no limit to possible knowledge – however much we knew, there would always be more to understand. Materialism, on the other hand, seemed to embody the idea of a theoretically finite limit to what could be known – a notion which gave me a sense of claustrophobia (of which more in a future post).

So I made my way about the world, thinking of my qualia as the armour to fend off the materialist assertion that physics was the whole story. I had something that was beyond their reach: I was something of a young Cartesian, before I had learned about Descartes. It was another few years before ‘consciousness’ became a legitimate topic of debate in philosophy and science. One commentator I have read dates this change to the appearance of Nagel’s paper What Is It Like to Be a Bat? in 1974, which I referred to in Consciousness 1. Seeing the debate emerging, I was tempted to preen myself with the horribly arrogant thought that the rest of the world had caught up with me.

The default position

Philosophers and scientists are still seeking to find ways of assimilating consciousness to physics: such physicalism, although coming in a variety of forms, is often spoken of as the default, orthodox position. But although my perspective has changed quite a lot over the years, my fundamental opposition to physicalism has not. I am still at heart the same naive dualist I was then. But I am not a dogmatic dualist – my instinct is to believe that some form of monism might ultimately be true, but beyond our present understanding. This consigns me to another much-derided category of philosophers – the so-called ‘mysterians’.

But I’d retaliate by pointing out that there is also a bit of a vacuum at the heart of the physicalist project. Thoughts and feelings, say its supporters, are just physical things or events, and we know what we mean by that, don’t we? But do we? We have always had the instinctive sense of what good old, solid matter is – but you don’t have to know any physics to realise there are problems with the notion. If something were truly solid it would entail that it was infinitely dense – so the notion of atomism, starting with the ancient Greeks, steadily took hold. But even then, atoms can’t be little solid balls, as they were once imagined – otherwise we are back with the same problem. In the 20th century, atomic physics confirmed this, and quantum theory came up with a whole zoo of particles whose behaviour entirely conflicted with our intuitive ideas gained from experience; and this is as you might expect, since we are dealing with phenomena which we could not, in principle, perceive as we perceive the things around us. So the question “What are these particles really like?” has no evident meaning. And, approaching the problem from another standpoint, where psychology joins hands with physics, it has become obvious that the world with which we are perceptually familiar is an elaborate fabrication constructed by our brains. To be sure, it appears to map on to the ‘real’ world in all sorts of ways, but has qualities (qualia?) which we supply ourselves.

Truth

So what true, demonstrable statements can be made about the nature of matter? We are left with the potently true findings – true in the sense of explanatory and predictive power – of quantum physics. And, when you’ve peeled away all the imaginative analogies and metaphors, these can only be expressed mathematically. At this point, rather unexpectedly, I find myself handing the debate back to our friend John Robinson. In a 1963 article in The Observer newspaper, heralding the publication of Honest to God, he wrote:

Professor Hermann Bondi, commenting in the BBC television programme, “The Cosmologists” on Sir James Jeans’s assertion that “God is a great mathematician”, stated quite correctly that what he should have said is “Mathematics is God”. Reality, in other words, can finally be reduced to mathematical formulae.

In case this makes Robinson sound even more heretical than he in fact was, I should note that he goes on to say that Christianity adds to this “the deeper reliability of an utterly personal love”. But I was rather gratified to find the concluding thoughts of my post anticipated by the writer I quoted at the beginning.

I’m not going to speculate any further into such unknown regions, or into religious belief, which isn’t my central topic. But I’d just like to finish with the hope that I have suggested that the ‘default position’ in current thinking about the mind is anything but natural or inevitable.

W G Sebald

Commuting days until retirement: 477

It’s a rather sad experience to discover a contemporary writer who immediately engages you, only to learn that he has recently died – so that once you have exhausted his published novels there will be no more. This was my experience in the case of W G Sebald.

Sebald was a German writer, an immensely intelligent and learned man, who had learned early in life of his family’s former involvement in Nazi regime activity. He preferred to become an expatriate, studying in Manchester and taking up an academic post there. He then lived in Switzerland for a short time before returning to England to become a lecturer, and then Professor of European Literature at the University of East Anglia. He held this post at the time of his death in a car crash in Norfolk in late 2001. He had evidently suffered a heart attack at the wheel.

As far as I remember I came across him simply by picking up one of his books in a shop. At first sight I wasn’t sure what to make of it, but it seemed so different from anything else I had come across that I had to buy it. The book was Austerlitz, his last published novel. Embarking on a Sebald novel takes you into an unfamiliar but compelling experience, where you are never quite sure what is entirely fictional and what is reference to factual reality. This ambiguity is enhanced by the grainy black and white photos, engravings and other visuals which punctuate the text, appearing to depict real things but at the same time conjuring up a dream-like atmosphere. The basic narration is always first person, but it’s never quite clear to what extent the persona is Sebald’s own. This effect is further enhanced in Austerlitz, as most of the story is spoken by the eponymous character, but told to the first-person narrator who meets and talks with him in a variety of settings across Europe.

Sebald’s prose style is also unique. Sentences will generally be long, punctuated with commas, and in the course of a single sentence he will often digress from the here-and-now, maybe to refer to some historical fact, or describe the foibles of some character, before returning to the present. A Sebald novel is an other-worldly experience, both real and not-real, and makes reading him (for me, anyway) uniquely pleasurable. I don’t often re-read novels, conscious of everything out there which I have yet to read and probably never will – but Sebald is an exception to this. As is shown by the interview I have included below, the texture of his work is laden with allusions and metaphors, few of which I have been aware of on a first reading. He writes in German, and is translated, but he was generally closely involved in the translations himself, so we can be sure of their authenticity. I would recommend all four of his novels: The Emigrants, Vertigo, The Rings of Saturn, and Austerlitz.

My post was prompted by a piece in this weekend’s Guardian Review, where I was pleased to find that a new book of his essays is being published next month. (There are other books of essays and poetry that I have not yet read – I’m saving them up for myself.) One of the essays is reprinted there: if I had read just the first sentences without knowing who the author was, I would immediately have recognised him as Sebald.

He was, according to all accounts, a shy, modest and delightful man, known universally to those who knew him as ‘Max’. A couple of years ago I was lucky enough to meet someone who had been a friend of his – the Hungarian-born poet George Szirtes, a fellow member of staff at the University of East Anglia. I learned how well liked he was, and what enormous sadness friends felt when he died.

Looking around the internet, I found a YouTube interview with an American radio station, made just before his death, so I have embedded it here.

We are all Newtonians now – or are we?

Commuting days until retirement: 495

Browsing in a bookshop the other day I found a small book about Newton by Peter Ackroyd. His biographies are mostly about literary figures, and I didn’t know about this one – the prospect of Ackroyd on Isaac Newton seemed an enticing novelty. It lasted a few train journeys, and didn’t disappoint. I suppose I was familiar with the outline of Newton’s work, and knew something about his difficult personality, but this filled some of the gaps in my knowledge wonderfully.

Isaac Newton (Wikimedia Commons)

There are perhaps three central achievements of Newton’s – each one groundbreaking in itself: his elucidation of the nature of light and colour; his invention of the calculus (‘Fluxions’ in his day) as a mathematical technique, and, above all, his unification of the movement of all physical bodies, cosmic and terrestrial, in a mathematical framework bound together by his laws of motion and gravitation. It’s true that calculus was, as we know now, independently hit upon by Leibniz, although at the time there was a fierce controversy, with each suspecting the other of plagiarism. Leibniz had published first, using a more elegant notation, but Newton had certainly been working on his Fluxions for some time before. The flames of the dispute were jealously fanned by Newton, who, once crossed or criticised, rarely forgave an opponent.

Robert Hooke

What I hadn’t realised was that the notion of gravitation, and even the inverse square law governing the strength of attraction, had been discussed by others prior to Newton’s synthesis in Principia Mathematica. It was Robert Hooke – a polymath and versatile scientific investigator himself – who had published these ideas in his Micrographia, without claiming to have originated them himself, and who wrote to Newton to draw his attention to them. They had previously quarrelled over Newton’s work on light and colour, Hooke having claimed some precedence in his own work, but Hooke had conceded to Newton, accepting that he had “abilities much inferior to yours.” This was the sort of thing that was music to Newton’s ears, who wrote back in a conciliatory vein, saying, in the famous phrase, that “if I have seen farther, it is by standing on the shoulders of giants.” There is some uncertainty as to whether this was a deliberate reference to Hooke’s own short and stunted stature.

But relations with Hooke broke down entirely when he pressed his claim to an acknowledgement in the Principia for his own previous work. Newton was furious, and never forgave him. Hooke was for many years secretary of the Royal Society, a body with which Newton at first had an awkward relationship, particularly given the presence of Hooke. But after Hooke’s death, Newton became president of the Society, and the relatively modest reputation which Hooke has today is thought to be due to Newton’s attempts to bury it, once he was in a position to do so. No authentic portrait of Hooke remains, and this is probably Newton’s doing.

By contrast, Newton sat for quite a number of portraits – an indication of his vanity. But he was of course held in high regard by most of his contemporaries for his prodigious talents. Those who got on well with him mostly had the skill to negotiate their way carefully around his prickly personality. An example was Edmond Halley (he of Halley’s comet) who had the task of passing Hooke’s claim to Newton, but managed to do so without himself falling into Newton’s disfavour.

Passions

Newton was long-lived, dying aged 84 – perhaps due to his ascetic style of life and his unquenchable enthusiasm for whatever was his current preoccupation. The early part of his life was mostly spent in Cambridge where he became a fellow, and then the second Lucasian Professor of Mathematics. He lived a mostly solitary existence, and when working on some problem would often work through the night, neglect bodily needs and be deaf to distractions. His absent-mindedness was legendary. Hardly surprising, given these tendencies and his awkward personality, that he was not known ever to have had a close relationship with any individual, sexual or otherwise.  Acts of kindness were not unknown, however, and he made many charitable donations in his later, prosperous years. He did strike up one or two friendships, and was fondly protective towards his niece, who kept house for him when he lived in London in later years.

When his mathematical powers waned with age, he found a new talent for administration in his fifties, when offered the post of Warden of the Royal Mint (and later Master). His predecessors had been lazy placemen for whom the post was a sinecure, and it’s thought that, on his appointment, perhaps 95% of the currency was counterfeit. Over succeeding years Newton turned the full force of his concentration to the task, and put the nation’s currency on a sound footing. Forgers were single-mindedly pursued to the gallows, which was where you ended up in those days if convicted of counterfeiting the currency.

So the last part of Newton’s life was spent prosperously, and in the enjoyment of a vast reputation, presiding over his twin fiefdoms of the Royal Mint and the Royal Society, and doing so right up until his death. But I have not mentioned his two other major intellectual enthusiasms, beside the scientific work I have described. One was alchemy – not then distinct from what we now call chemistry. Alchemists of course are remembered mainly for their efforts to create gold, and hence fabulous wealth – but this was not Newton’s aim. The subject was full of occult knowledge and arcane secrets, and for Newton this was one route to a revelation of the universe’s true, unknown nature, and he pursued it assiduously, having a vast library and spending at least as much time on it as his work in what is for us mainstream science. It also had a practical outcome, since he developed from it a thorough knowledge of metallurgy which he put to use in his work in administering the coinage.

His third passion was his research into the history of Christianity and the church. Newton was a deeply pious man, more in a private than a public way. This was partly because Newton’s particular faith was a heretical one, and would have been dangerously so in the earlier part of his life, when England was ruled by the Catholic James II. Newton was obsessed with the now largely forgotten controversy concerning the opposed church fathers Arius and Athanasius. Arian doctrine (not to be confused with the ‘Aryan’ 19th and 20th century racial dogma) held that Christ was a subordinate entity to God, and denied the Holy Trinity taught by Athanasius, and adopted by the mainstream church. For Newton, Arianism was the true faith, whose origins, he believed, could be traced back beyond the Christian era, and was the only way to approach the reality of God.

It almost goes without saying that these three obsessions were not independent of one another in Newton’s mind. For him they all served the same purpose – to uncover the mysteries of the universe and the nature of God. Gravitation was a controversial topic at the time, in virtue of its assertion that one body could act upon another without physical contact. (Perhaps a sort of parallel with the issues we have today with the phenomenon of quantum entanglement.) For Newton, the concept was all of a piece with the mysterious action of God – a window into the nature of reality.

Of course Newton’s scientific conception of the universe has now been radically modified by the twentieth century developments of relativity and quantum theory. But there’s a more fundamental sense in which we are still Newtonians: his towering achievement was the scheme of the universe as an integrated whole, governed by mathematically described laws (with some honours also going to his predecessor Galileo). This is the framework within which all our modern scientific endeavours take place.

Estrangement

The brothers as they appear in the book (faces obscured)

So why the note of uncertainty in my title? To explain this I want to digress by describing an image which came into my mind while thinking about it. A few years back, the local people where I live produced a book about our village’s history. An appeal went out for any period photographs that might be borrowed to illustrate the book, and there was a big response. The organiser gave me the task of scanning in all these photos for the book’s publishers, and one of them sticks in my mind. It showed two brothers who were local characters during the 1930s standing in a garden, and a closer examination showed that it had been taken at a wedding. They are wearing their best suits, and are sporting buttonholes. Why is the setting not so immediately obvious? Because the photo had been crudely ripped in two down the middle, with both the brothers in the left half. We can see that one of them is the groom, and that the missing right half contained his bride – only her hand is visible, nestling in the crook of his arm.

I found this mute evidence of some anguished estrangement from the past rather moving. What had seemed like a happy union at the time now had the feminine half of it expunged by someone who was determined that she no longer deserved any place in their thoughts. Yes – you get my drift. The enterprise of science now prefers to go it alone along its own, masculine, analytical path, with any attendant mystery ripped out of the picture, leaving only the barest hint. (See my thoughts on atheism in the previous post.)

It’s worth returning to Newton’s own imagery and repeating the often-quoted passage he wrote towards the end of his life:

I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.

Although he was an arrogant man in his personal dealings (and the opening phrase here hints at how conscious he is of his reputation), I don’t believe this to be mock-modesty. He was also genuinely pious, and all too aware of the mystery surrounding what he had learned of the universe. Today, we look forward to finishing the task of enumerating the pebbles and shells, and are happy to ignore the ocean. In this sense we are more arrogant than he was, and that’s the source of my doubt as to whether we are really Newtonians now.


Ackroyd, Peter: Newton. Vintage Books, 2007