A Passage of Time

I have let rather too much time elapse since my last post. What are my excuses? Well, I mentioned in one post how helpful my commuter train journeys were in encouraging the right mood for blog-writing – and since retiring such journeys are few and far between. To get on the train every time I felt a blog-post coming on would not be a very pension-friendly approach, given current fares. My other excuse is the endless number of tasks around the house and garden that have been neglected until now. At least we are now starting to create a more attractive garden, and recently took delivery of a summerhouse: I am hoping that this could become another blog-friendly setting.

Time travel inventor (tvtropes.org)

But since a lapse of time is my starting point, I could get back in by thinking again about the nature of time. Four years back (Right Now, 23/3/13) I speculated on the issue of why we experience a subjective ‘now’ which doesn’t seem to have a place in objective physical science. Since then I’ve come across various ruminations on the existence or non-existence of time as a real, out-there, component of the world’s fabric. I might have more to say about this in the, er, future – but what appeals to me right now is the notion of time travel. Mainly because I would welcome a chance to deal with my guilty feelings by going back in time and plugging the long gap in this blog over the past months.

I recently heard about an actual time travel experiment, carried out by no less a person than Stephen Hawking. In 2009, he held a party for time travellers. What marked it out as such was that he sent out the invitations after the party took place. I don’t know exactly who was invited; but, needless to say, the canapés remained uneaten and the champagne undrunk. I can’t help feeling that if I’d tried this, and no one had turned up, my disappointment would rather destroy any motivation to send out the invitations afterwards. Should I have sent them anyway, finally to prove time travel impossible? I can’t help feeling that I’d want to sit on my hands, and leave the possibility open for others to explore. But the converse of this is the thought that, if the time travellers had turned up, I would be honour-bound to despatch those invites; the alternative would be some kind of universe-warping paradox. In that case I’d be tempted to try it and see what happened.

Elsewhere, in the same vein, Hawking has remarked that the impossibility of time travel into the past is demonstrated by the fact that we are not invaded by hordes of tourists from the future. But there is one rather more chilling explanation for their absence: namely that time travel is theoretically possible, but that we have no future in which to invent it. Since last year that unfortunately looks a little more likely, given the current occupant of the White House. That such a president is possible makes me wonder whether the universe is already a bit warped.

Should you wish to escape this troublesome present into a different future, whatever it might hold, it can of course be done. As Einstein famously showed in 1905, it’s just a matter of inventing for yourself a spaceship that can accelerate to nearly the speed of light and taking a round trip in it. And of course this isn’t entirely science fiction: astronauts and satellites – even airliners – regularly take trips of microseconds or so into the future; and indeed our now familiar satnav devices wouldn’t work if this effect weren’t taken into account.
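
Out of curiosity, the size of this everyday time travel is easy to estimate from the standard first-order formulas for relativistic time dilation. Here is a minimal back-of-envelope sketch in Python – my own illustration, not anything from the post’s sources – using textbook constants and a GPS-style orbit:

```python
import math

C = 299_792_458.0     # speed of light, m/s
GM = 3.986004e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

r = R_EARTH + 2.02e7  # GPS orbital radius (~20,200 km altitude), m
v = math.sqrt(GM / r) # circular orbital speed, ~3.9 km/s

SECONDS_PER_DAY = 86_400

# Special relativity: a moving clock runs slow by ~v^2/(2c^2) per unit time
sr_slow = v**2 / (2 * C**2) * SECONDS_PER_DAY

# General relativity: a clock higher in the gravity well runs fast,
# by ~(GM/c^2) * (1/R_earth - 1/r) per unit time
gr_fast = GM / C**2 * (1 / R_EARTH - 1 / r) * SECONDS_PER_DAY

print(f"SR slowdown: {sr_slow * 1e6:5.1f} microseconds per day")
print(f"GR speedup:  {gr_fast * 1e6:5.1f} microseconds per day")
print(f"Net drift:   {(gr_fast - sr_slow) * 1e6:5.1f} microseconds per day fast")
# The net comes out at around +38 microseconds per day, which is why satnav
# receivers must correct for relativity to work at all.
```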

But the problem of course arises if you find the future you’ve travelled to isn’t one you like. (Trump, or some nepotic Trumpling, as world president? Nuclear disaster? Both of these?) Whether you can get back again by travelling backward in time is not a question that’s really been settled. Indeed, it’s the ability to get at the past that raises all the paradoxes – most famously what would happen if you killed your grandparents or stopped them getting together.

Marty McFly with his teenage mother

This is a furrow well-ploughed in science fiction, of course. You may remember the Marty McFly character in the film Back to the Future, who embarks on a visit to the past enabled by his mad scientist friend. It’s one way of escaping from his dysfunctional, feckless parents, but having travelled back a generation in time he finds himself being romantically approached by his teenage mother. He manages eventually to redirect her towards his young father, but on returning to the present finds his parents transformed into an impossibly hip and successful couple.

Then there’s Ray Bradbury’s story A Sound of Thunder, where tourists can return to hunt dinosaurs – but only those which were established to have been about to die in any case, and any bullets must then be removed from their bodies. As a further precaution, the would-be hunters are kept away from the ground by a levitating path, to prevent any other paradox-inducing changes to the past. One bolshie traveller breaks the rules however, reaches the ground, and ends up with a crushed butterfly on the sole of his boot. On returning to the present he finds that the language is subtly different, and that the man who had been the defeated fascist candidate for president has now won the election. (So, thinking of my earlier remarks, could some prehistoric butterfly crusher, yet to embark on his journey, be responsible for the current world order?)

My favourite paradox is the one portrayed in a story called The Muse by Anthony Burgess, in which – to leave a lot out – a time travelling literary researcher manages to meet William Shakespeare and question him on his work. Shakespeare’s eye alights on the traveller’s copy of the complete works, which he peruses and makes off with, intending to mine it for ideas. This seems like the ideal solution for struggling blog-writers like me, given that, having travelled forward in time and copied what I’d written on to a flash drive, I could return to the present and paste it in here. Much easier.

But these thoughts put me in mind of a more philosophical issue with time which has always fascinated me – namely whether it’s reversible. We know how to travel forward in time; when it comes to travelling backward, however, there are various theories as to how it might be done, but no-one is very sure. Does this indicate a fundamental asymmetry in the way time works? Of course this is a question that has been examined in much greater detail in another context: the second law of thermodynamics, we are told, says it all.

Let’s just review those ideas. Think of running a film in reverse. Might it show anything that could never happen in real, forward time? If it were some sort of space film which showed planets orbiting the sun, or a satellite circling the earth, then of course either direction is possible. But, back on earth, think of all those people you’d see walking backwards. On the face of it, people can walk backwards, so what’s the problem? Well, here’s one of many that I could think of: imagine that one such person is enjoying a backward country walk on a path through the woods. As she approaches a protruding branch on a sapling beside the path, the branch suddenly whips sideways towards her as if to anticipate her passing, lays itself against her body, unbends as she backward-walks by, and has returned to its rest position as she recedes. Possible? Obviously not. But is it?

I’m going to argue that there is no fundamental ‘arrow of time’: that despite the evident truth of the laws of thermodynamics and the irresistible tendency we observe toward increasing disorder, or entropy, there’s nothing ultimately irreversible about physical processes. I’ve deliberately chosen an example which seems to make my case harder to maintain, to see if I can explain my way out of it. You will have had the experience of walking by a branch which projects across your path, noticing how your body bends it forwards as you pass, and seeing it spring back to its former position as you continue on. Could we envisage a sequence of events in the real world where all this happened in reverse?

Before answering that I’m going to look at a more familiar type of example. I remember being impressed many years ago by an example of the type of film I mentioned, illustrating the idea of entropy. It showed someone holding a cup of tea, and then letting go of it, with the expected results. Then the film was reversed. The mess of spilt tea and broken china on the floor drew itself together, and as the china pieces reassembled themselves into a complete cup and saucer, the tea obediently gathered itself back into the cup. As this process completed the whole assembly launched itself from the floor and back into the hands of its owner.

Obviously, that second part of the sequence would never happen in the real world. It’s an example of how, left to itself, the physical world will always progress to a state of greater disorder, or entropy. We can even express the degree of entropy mathematically, using information theory. Case closed, then – apart, perhaps, from biological evolution? And even then it can be shown that if some process – like the evolution of a more sophisticated organism – decreases entropy, it will always be balanced by a greater increase elsewhere; and so the universe’s total amount of entropy increases. The same applies to our own attempts to impose order on the world.
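
For the record, the standard statistical formulation makes the link between entropy and counting explicit. In Boltzmann’s version, if W is the number of microscopic arrangements consistent with what we observe macroscopically, then

$$S = k_B \ln W$$

while the information-theoretic counterpart, Shannon’s entropy of a probability distribution, is

$$H = -\sum_i p_i \log_2 p_i$$

Both say the same thing in different dialects: ‘disordered’ states are simply those compatible with vastly more underlying arrangements.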

So how could I possibly plead for reversibility of time? Well, I tend to think that this apparent ‘arrow’ is a function of our point of view as limited creatures, and our very partial perception of the universe. I would ask you to imagine, for a moment, some far more privileged being – some sort of god, if you like – who is able to track all the universe’s individual particles and fields, and what they are up to. Should this prodigy turn her attention to our humble cup of tea, what she saw would, I think, be very different from the scene as experienced through our own senses. From her perspective, the clean lines of the china cup which we see would become less defined – lost in a turmoil of vibrating molecules, themselves constantly undergoing physical and chemical change. The distinction between the shattered cup on the floor and the unbroken one in the drinker’s hands would be less clear.

Colour blindness test

What I’m getting at is the fact that what we think of as ‘order’ in our world is an arrangement that seems significant only from one particular point of view, determined by the scale and functionality of our senses: the arrangement we think of as ‘order’ floats like an unstable mirage in a sea of chaos. As a very rough analogy, think of those patterns of coloured dots used to detect colour blindness. You can see the number in the one I’ve included only if your retinal cells function in a certain way; otherwise all you’d see would be random dots.

And, in addition to all this, think of the many arrangements which (to us) might appear to have ‘order’ – all the possible drinks in the cup, all the possible cup designs – etc, etc. But compared to all the ‘disordered’ arrangements of smashed china, splattered liquid and so forth, the number of potential configurations which would appeal to us as being ‘ordered’ is truly infinitesimal. So it follows that the likelihood of moving from a state we regard as ‘disordered’ to one of ‘order’ is unimaginably slim; but not, in principle, impossible.
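
To put a toy number on ‘truly infinitesimal’, here is a minimal sketch in Python – my own example, molecules in a box rather than teacups – showing how fast the odds collapse when we demand even the simplest kind of ‘order’:

```python
import math

# Each of N air molecules is independently in the left or right half of a
# box. The simplest 'ordered' state - all N in the left half - is one
# arrangement out of 2^N equally likely ones.
for n in (10, 100, 1_000):
    log10_p = -n * math.log10(2)  # log10 of the probability 2^-n
    print(f"N = {n:5d}: P(all in left half) ~ 10^{log10_p:.0f}")

# Even for N = 1,000 the odds are about 1 in 10^301, and a barely visible
# puff of air contains more like 10^20 molecules. 'Ordered' states aren't
# forbidden - just unimaginably outnumbered.
```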

So let’s imagine that one of these one-in-a-squillion chances actually comes off. There’s the smashed cup and mess of tea on the floor. It’s embedded in a maze of vibrating molecules making up the floor, the surrounding air, and so on. And in this case it so happens that the molecular impacts between the elements of the cup and the tea and their surroundings combine so as to nudge them all back into their ‘ordered’ configuration, and boost them back off the floor and into the hands of the somewhat mystified drinker.

Yes, the energy is there to make that happen – it just has to come together in exactly the correct, fantastically unlikely way. I don’t know how to calculate the improbability of this, but I should imagine that to see it happen we would need to do continual trials for a time period which is some vast multiple of the age of the universe. (Think monkeys typing the works of Shakespeare, and then multiply by some large number.) In other words, of course, it just doesn’t happen in practice.
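
The monkeys alone give a feel for the scale: randomly typing just the eighteen characters of ‘to be or not to be’ on a 27-key keyboard (26 letters plus a space) succeeds with probability

$$27^{-18} \approx 2 \times 10^{-26}$$

so at a million random attempts per second the expected wait is already some hundred times the age of the universe – and that is one short line, not a reassembling teacup.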

But, looked at another way, such unlikely things do happen. Think of when we originally dropped the cup, and ended up with some sort of mess on the floor – that is, out of the myriad of other possible messes that could have been created, had the cup been dropped at a slightly different angle, the floor had been dirtier, the weather had been different – and so on. How likely is that exact, particular configuration of mess that we ended up with? Fantastically unlikely, of course – but it happened. We’d never in practice be able to produce it a second time.

So of all these innumerable configurations of matter – whether or not they are one of the tiny minority that seem to us ‘ordered’ – one of them happens with each of the sorts of event we’ve been considering. The universe at large is indifferent to our notion of ‘order’, and at each juncture throws up some random selection of the unthinkably large number of possibilities. It’s just that these ordered states are so few in number compared to the disordered ones that they never in practice come about spontaneously, but only when we deliberately foster them into being, by doing such things as manufacturing teacups, or making tea.

Let’s return, then, to the branch that our walker brushes past on the woodland footpath, and give that a similar treatment. It’s a bit simpler, if anything: we just require the astounding coincidence that, as the backwards walker approaches the branch, the random Brownian motion of an unimaginably large number of air molecules just happens to combine to give the branch a series of rhythmic, increasing nudges. The branch appears to oscillate with increasing amplitude until one final nudge lays it against the walker’s body just as she passes. Not convinced? Well, this is just one of the truly countless possible histories of the movement of a vast number of air molecules – one which has a consequence we can see.

Remember that the original Robert Brown, of Brownian motion fame, did see random movements of pollen grains in water; and since it didn’t occur to him that the water molecules were responsible for this, he thought it was a property of the pollen grains themselves. Should we happen to witness such an astronomically unlikely movement of the tree, we would suspect some mysterious bewitchment of the tree itself, rather than one specific and improbable combination of air molecule movements.

You’ll remember that I was earlier reflecting that we know how to travel forwards in time, but that backward time travel is more problematic. So doesn’t this indicate another asymmetry – more evidence of an arrow of time? Well, I think the right way of thinking about this emerges when we are reminded that this very possibility of time travel was a consequence of a theory called ‘relativity’. So think relative. We know how to move forward in time relative to other objects in our vicinity. Equally, we know how they could move forward in time relative to us. Which of course means that we’d be moving backward relative to them. No asymmetry there.

Maybe the one asymmetry in time which can’t be analysed away is our own subjective experience of moving constantly from a ‘past’ into a ‘future’ – as defined by our subjective ‘now’. But, as I was pointing out three years ago, this seems to be more a property of ourselves as experiencing creatures than of the objective universe ‘out there’.

I’ll leave you with one more apparent asymmetry. If processes are reversible in time, why do we only have records of the past, and not records of the future? Well, I’ve gone on long enough, so in the best tradition of lazy writers, I will leave that as an exercise for the reader.

The Mathematician and the Surgeon

Commuting days until retirement: 108

After my last post, which, among other things, compared differing attitudes to death and its aftermath (or absence of one) on the part of Arthur Koestler and George Orwell, here’s another fruitful comparison. It seemed to arise by chance from my next two commuting books, and each of the two people I’m comparing, as before, has his own characteristic perspective on that matter. Unlike my previous pair both could loosely be called scientists, and in each case the attitude expressed has a specific and revealing relationship with the writer’s work and interests.

The Mathematician

The first writer, whose book I came across by chance, has been known chiefly for mathematical puzzles and games. Martin Gardner was born in Oklahoma, USA in 1914; his father was an oil geologist, and it was a conventionally Christian household. Although not trained as a mathematician – his career was in journalism and writing – Gardner developed a fascination with mathematical problems and puzzles which informed that career; hence the justification for his half of my title.

Martin Gardner

Gardner as a young man (Wikimedia)

This interest continued to feed the constant books and articles he wrote, and he was eventually asked to write the Scientific American column Mathematical Games, which ran from 1956 until the mid 1980s, and for which he became best known; his enthusiasm and sense of fun shines through the writing of these columns. At the same time he was increasingly concerned with the many types of fringe beliefs that had no scientific foundation, and was a founder member of CSICOP, the organisation dedicated to the exposing and debunking of pseudoscience. Back in February last year I mentioned one of its other well-known members, the flamboyant and self-publicising James Randi. By contrast, Gardner was mild-mannered and shy, averse to public speaking and never courting publicity. He died in 2010, leaving behind him many admirers and a two-yearly convention – the ‘Gathering for Gardner‘.

Before learning more about him recently, and reading one of his books, I had known his name from the Mathematical Games column, and heard of his rigid rejection of things unscientific. I imagined some sort of skinflint atheist, probably with a hard-nosed contempt for any fanciful or imaginative leanings – however sane and unexceptionable they might be – towards what might be thought of as things of the soul.

How wrong I was. His book that I’ve recently read, The Whys of a Philosophical Scrivener, consists of a series of chapters with titles of the form ‘Why I am not a…’ and he starts by dismissing solipsism (who wouldn’t?) and various forms of relativism; it’s a little more unexpected that determinism also gets short shrift. But in fact by this stage he has already declared that

I myself am a theist (as some readers may be surprised to learn).

I was surprised, and also intrigued. Things were going in an interesting direction. But before getting to the meat of his theism he spends a good deal of time dealing with various political and economic creeds. The book was written in the mid 80s, not long before the collapse of communism, which he seems to be anticipating (Why I am not a Marxist). But equally he has little time for Reagan or Thatcher, laying bare the vacuity of their over-simplistic political nostrums (Why I am not a Smithian).

Soon after this, however, he is striding into the longer grass of religious belief: Why I am not a Polytheist; Why I am not a Pantheist – so what is he? The next chapter heading is a significant one: Why I do not Believe the Existence of God can be Demonstrated. This is the key, it seems to me, to Gardner’s attitude – one to which I find myself sympathetic. Near the beginning of the book we find:

My own view is that emotions are the only grounds for metaphysical leaps.

I was intrigued by the appearance of the emotions in this context: here is a man whose day job is bound up with his fascination for the powers of reason, but who is nevertheless acutely conscious of the limits of reason. He refers to himself as a ‘fideist’ – one who believes in a god purely on the basis of faith, rather than any form of demonstration, either empirical or through abstract logic. And if those won’t provide a basis for faith, what else is there but our feelings? This puts Gardner nicely at odds with the modish atheists of today, like Dawkins, who never tires of telling us that he too could believe if only the evidence were there.

But at the same time he is squarely in a religious tradition which holds that ultimate things are beyond the instruments of observation and logic that are so vital to the secular, scientific world of today. I can remember my own mother – unlike Gardner a conventional Christian believer – being very definite on that point. And it reminds me of some of the writings of Wittgenstein; Gardner does in fact refer to him, in the context of the free-will question. I’ll let him explain:

A famous section at the close of Ludwig Wittgenstein’s Tractatus Logico-Philosophicus asserts that when an answer cannot be put into words, neither can the question; that if a question can be framed at all, it is possible to answer it; and that what we cannot speak about we should consign to silence. The thesis of this chapter, although extremely simple and therefore annoying to most contemporary thinkers, is that the free-will problem cannot be solved because we do not know exactly how to put the question.

This mirrors some of my own thoughts about that particular philosophical problem – a far more slippery one than those on either side of it often claim, in my opinion (I think that may be a topic for a future post). I can add that Gardner was also on the unfashionable side of the question which came up in my previous post – that of an afterlife; and again he holds this out as a matter of faith rather than reason. He explores the philosophy of personal identity and continuity in some detail, always concluding with the sentiment ‘I do not know. Do not ask me.’ His underlying instinct seems to be that there has to be something more than our bodily existence, given that our inner lives are so inexplicable from the objective point of view – so much more than our physical existence. ‘By faith, I hope and believe that you and I will not disappear for ever when we die.’ By contrast, Arthur Koestler, you may remember, wrote in his suicide note of ‘tentative hopes for a depersonalised afterlife’ – but, as it turned out, these hopes were based partly on the sort of parapsychological evidence which was anathema to Gardner.

And of course Gardner was acutely aware of another related mystery – that of consciousness, which he finds inseparable from the issue of free will:

For me, free will and consciousness are two names for the same thing. I cannot conceive of myself being self-aware without having some degree of free will… Nor can I imagine myself having free will without being conscious.

He expresses utter dissatisfaction with the approach of arch-physicalists such as Daniel Dennett, who, as he says, ‘explains consciousness by denying that it exists’. (I attempted to puncture this particular balloon in an earlier post.)

Martin Gardner

Gardner in later life (Konrad Jacobs / Wikimedia)

Gardner places himself squarely within the ranks of the ‘mysterians’ – a deliberately derisive label applied by their opponents to those thinkers who conclude that these matters are mysteries which are probably beyond our capacity to solve. Among their ranks is Noam Chomsky: Gardner cites a 1983 interview in which the grand old man of linguistics expresses his attitude to the free-will problem.

The Surgeon

And so to the surgeon of my title, and if you’ve read one of my other blog posts you will already have met him – he’s a neurosurgeon named Henry Marsh, and I wrote a post based on a review of his book Do No Harm. Well, now I’ve read the book, and found it as impressive and moving as the review suggested. Unlike many in his profession, Marsh is a deeply humble man who is disarmingly honest in his account about the emotional impact of the work he does. He is simultaneously compelled towards, and fearful of, the enormous power of the neurosurgeon both to save and to destroy. His narrative swings between tragedy and elation, by way of high farce when he describes some of the more ill-conceived management ‘initiatives’ at his hospital.

A neurosurgical operation

A neurosurgical operation (Mainz University Medical Centre)

The interesting point of comparison with Gardner is that Marsh – a man who daily manipulates what we might call physical mind-stuff – the brain itself – is also awed and mystified by its powers:

There are one hundred billion nerve cells in our brains. Does each one have a fragment of consciousness within it? How many nerve cells do we require to be conscious or to feel pain? Or does consciousness and thought reside in the electrochemical impulses that join these billions of cells together? Is a snail aware? Does it feel pain when you crush it underfoot? Nobody knows.

The same sense of mystery and wonder as Gardner’s; but approached from a different perspective:

Neuroscience tells us that it is highly improbable that we have souls, as everything we think and feel is no more or no less than the electrochemical chatter of our nerve cells… Many people deeply resent this view of things, which not only deprives us of life after death but also seems to downgrade thought to mere electrochemistry and reduces us to mere automata, to machines. Such people are profoundly mistaken, since what it really does is upgrade matter into something infinitely mysterious that we do not understand.

Henry Marsh

This of course is the perspective of a practical man – one who is emphatically working at the coal face of neurology, and far more familiar with the actual material of brain tissue than armchair speculators like me. While I was reading his book, although deeply impressed by this man’s humanity and integrity, what disrespectfully came to mind was a piece of irreverent humour once told to me by a director of a small company, closely connected to the medical industry, that I used to work for. It was a sort of handy cut-out-and-keep guide to the different types of medical practitioner:

Surgeons do everything and know nothing. Physicians know everything and do nothing. Psychiatrists know nothing and do nothing. Pathologists know everything and do everything – but the patient’s dead, so it’s too late.

Grossly unfair to all of them, of course, but nonetheless funny, and perhaps containing a certain grain of truth. Marsh, belonging to the first category, perhaps embodies some of the aversion to dry theory that this caricature hints at: what matters to him ultimately, as a surgeon, is the sheer down-to-earth physicality of his work, guided by the gut instincts of his humanity. We hear from him about some members of his profession who seem aloof from the enormity of the dangers it embodies, and seem able to proceed calmly and objectively with what he sees almost as the detachment of the psychopath.

Common ground

What Marsh and Gardner seem to have in common is the instinct that dry, objective reasoning only takes you so far. Both trust the power of their own emotions, and their sense of awe. Both, I feel, are attempting to articulate the same insight, but from widely differing standpoints.

Two passages, one from each book, seem to crystallize both the similarities and differences between the respective approaches of the two men, both of whom seem to me admirably sane and perceptive, if radically divergent in many respects. First Gardner, emphasising, in a Wittgensteinian way, that describing how things appear to be is perhaps a more useful activity than attempting to pursue any ultimate reasons:

There is a road that joins the empirical knowledge of science with the formal knowledge of logic and mathematics. No road connects rational knowledge with the affirmations of the heart. On this point fideists are in complete agreement. It is one of the reasons why a fideist, Christian or otherwise, can admire the writings of logical empiricists more than the writings of philosophers who struggle to defend spurious metaphysical arguments.

And now Marsh – mystified, as we have seen, as to how the brain-stuff he manipulates daily can be the seat of all experience – having a go at reading a little philosophy in the spare time between sessions in the operating theatre:

As a practical brain surgeon I have always found the philosophy of the so-called ‘Mind-Brain Problem’ confusing and ultimately a waste of time. It has never seemed a problem to me, only a source of awe, amazement and profound surprise that my consciousness, my very sense of self, the self which feels as free as air, which was trying to read the book but instead was watching the clouds through the high windows, the self which is now writing these words, is in fact the electrochemical chatter of one hundred billion nerve cells. The author of the book appeared equally amazed by the ‘Mind-Brain Problem’, but as I started to read his list of theories – functionalism, epiphenomenalism, emergent materialism, dualistic interactionism or was it interactionistic dualism? – I quickly drifted off to sleep, waiting for the nurse to come and wake me, telling me it was time to return to the theatre and start operating on the old man’s brain.

I couldn’t help noticing that these two men – one unconventionally religious and the other not religious at all – seem between them to embody those twin traditional pillars of the religious life: faith and works.

The Vault of Heaven

Commuting days until retirement: 250

Exeter Cathedral roof

The roof of Exeter Cathedral (Wanner-Laufer, Wikimedia Commons)

Thoughts are sometimes generated out of random conjunctions in time between otherwise unrelated events. Last week we were on holiday in Dorset, and depressing weather for the first couple of days drove us into the nearest city – Exeter, where we visited the cathedral. I had never seen it before and was more struck than I had expected to be. Stone and wood carvings created over the past 600 years decorate thrones, choir stalls and tombs, the latter bearing epitaphs ranging in tone from the stern to the whimsical. All this lies beneath the marvellous fifteenth century vaulted roof – the most extensive known of the period, I learnt. Looking at this, and the cathedral’s astronomical clock dating from the same century, I imagined myself seeing them as a contemporary member of the congregation would have, and tried to share the medieval conception of the universe above that roof, reflected in the dial of the clock.

Astronomical Clock

The Astronomical Clock at Exeter Cathedral (Wikimedia Commons)

The other source of these thoughts was the book I happened to have finished that day: Max Tegmark’s Our Mathematical Universe*. He’s an MIT physics professor who puts forward the view (previously also hinted at in this blog) that reality is at bottom simply a mathematical object. He admits that it’s a minority view, scoffed at by many of his colleagues – but I have long felt a strong affinity for the idea. I have reservations about some aspects of the Tegmark view of reality, but not one of its central planks – the belief that we live in one universe among a host of others. Probably to most people the thought is just a piece of science fiction fantasy – and has certainly been exploited for all it’s worth by fiction authors in recent years. But in fact it is steadily gaining traction among professional scientists and philosophers as a true description of the universe – or rather multiverse, as it’s usually called in this context.

Nowadays there is a whole raft of differing notions of a multiverse, each deriving from separate theoretical considerations. Tegmark combines four different ones in the synthesis he presents in the book. But I think I am right in saying that the first time such an idea appeared in anything like a mainstream scientific context was the PhD thesis of a 1950s student at Princeton in the USA – Hugh Everett.

The thesis appeared in 1957; its purpose was to present an alternative treatment of the quantum phenomenon known as the collapse of the wave function. Theoretical and experimental results had come together to suggest that subatomic particles (or waves – the duality was a central idea here) existed as a cloud of possibilities until interacted with, or observed. The position of an electron, for example, could be defined with a mathematical function – the wave function of Schrödinger – which assigned only a probability to each putative location. If, however, we were to put this to the test – to measure its location in practice – we would have to do this by means of some interaction, and the answer that would come back would be one specific position among the cloud of possibilities. But by carrying out such procedures repeatedly, it was shown that the probability of any specific result was given by the wave function. The approach to these results which became most widely accepted was the so-called ‘Copenhagen interpretation’ of Bohr and others, which held that all the possible locations co-existed in ‘superposition’ until the measurement was made and the wave function ‘collapsed’. Hence some of the more famous statements about the quantum world: Einstein’s dissatisfaction with the idea that ‘God plays dice’; and Schrödinger’s well-known thought experiment designed to test the Copenhagen interpretation to destruction – the cat which is presumed to be simultaneously dead and alive until its containing box is opened and the result determined.
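
In modern notation this is the Born rule: for a particle whose state is described by the wave function ψ, the probability of finding it near position x on measurement is

$$P(x)\,dx = |\psi(x)|^2\,dx$$

It is this probabilistic rule, and what becomes of it if nothing ever ‘collapses’, that Everett’s thesis set out to address.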

Everett proposed that there was no such thing as the collapse of the wave function. Rather, each of the possible outcomes was represented in one real universe; it was as if the universe ‘branched’ into a number of equally real versions, and you, the observer, found yourself in just one of them. Of course, it followed that many copies of you each found themselves in slightly different circumstances, unlike the unfortunate cat which presumably only experienced those universes in which it lived. Needless to say, although Everett’s ideas were encouraged at the time by a handful of colleagues (Bryce DeWitt, John Wheeler) they were regarded for many years as a scientific curiosity and not taken further. Everett himself moved away from theoretical physics, and involved himself in practical technology, later developing an enjoyment of programming. He smoked and drank heavily and became obese, dying at the age of 51. Tegmark implies that this was at least partly a result of his neglect by the theoretical physics community – but there’s also evidence that his choices of career path and lifestyle derived from his natural inclinations.

During the last two decades of the 20th century, however, the multiverse idea began to be taken more seriously, and had some enthusiastic proponents such as the British theorist David Deutsch and indeed Tegmark himself. In his book, Tegmark cites a couple of straw polls he took among theoretical physicists attending talks he gave, in 1997 and again in 2010. In the first case, out of a response of 48, 13 endorse the Copenhagen interpretation, and 8 the multiverse idea. (The remainder are mostly undecided, with a few endorsing alternative approaches.) In 2010 there are 35 respondents, of whom none at all go for Copenhagen, and 16 for the multiverse. (Undecideds remain about the same – at 16, down from 18.) This seems to show a decisive rise in support for multiple universes; although I do wonder whether it also reflects which physicists were prepared to attend Tegmark’s talks, his views having become better known by 2010. It so happens that the drop in the respondent numbers – 13 – is the same as the disappearing support for the Copenhagen interpretation.

Nevertheless, it’s fair to say that the notion of a multiple universe as a reality has now entered the mainstream of theoretical science in a way that it had not done half a century ago. There’s an argument, I thought as I looked at that cathedral roof, that cosmology has been transformed even more radically in my lifetime than it had been in the preceding 500 years. The skill of the medieval stonemasons as they constructed the multiple rib vaults, and the wonder of the medieval congregation as they marvelled at the completed roof, were consciously directed to the higher vault of heaven that overarched the world of their time. Today those repeated radiating patterns might be seen as a metaphor for the multiple worlds that we are, perhaps, beginning dimly to discern.


*Tegmark, Max, Our Mathematical Universe: My Quest for the Ultimate Nature of Reality. Allen Lane/Penguin, 2014

We are all Newtonians now – or are we?

Commuting days until retirement: 495

Browsing in a bookshop the other day I found a small book about Newton by Peter Ackroyd. His biographies are mostly about literary figures, and I didn’t know about this one – the prospect of Ackroyd on Isaac Newton seemed an enticing novelty. It lasted a few train journeys, and didn’t disappoint. I suppose I was familiar with the outline of Newton’s work, and knew something about his difficult personality, but this filled some of the gaps in my knowledge wonderfully.

Isaac Newton (Wikimedia Commons)

There are perhaps three central achievements of Newton’s – each one groundbreaking in itself: his elucidation of the nature of light and colour; his invention of the calculus (‘Fluxions’ in his day) as a mathematical technique, and, above all, his unification of the movement of all physical bodies, cosmic and terrestrial, in a mathematical framework bound together by his laws of motion and gravitation. It’s true that calculus was, as we know now, independently hit upon by Leibniz, although at the time there was a fierce controversy, with each suspecting the other of plagiarism. Leibniz had published first, using a more elegant notation, but Newton had certainly been working on his Fluxions for some time before. The flames of the dispute were jealously fanned by Newton, who, once crossed or criticised, rarely forgave an opponent.

Robert Hooke

What I hadn’t realised was that the notion of gravitation, and even the inverse square law governing the strength of attraction, had been discussed by others prior to Newton’s synthesis in Principia Mathematica. It was Robert Hooke – a polymath and versatile scientific investigator himself – who had published these ideas in his Micrographia, without claiming to have originated them himself, and who wrote to Newton to draw his attention to them. They had previously quarrelled over Newton’s work on light and colour, Hooke having claimed some precedence in his own work, but Hooke had conceded to Newton, accepting that he had “abilities much inferior to yours.” This was the sort of thing that was music to Newton’s ears, and he wrote back in a conciliatory vein, saying, in the famous phrase, that “if I have seen farther, it is by standing on the shoulders of giants.” There is some uncertainty as to whether this was a deliberate reference to Hooke’s own short and stunted stature.
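
For reference, in its modern form the law in question states that two bodies of masses m1 and m2, a distance r apart, attract each other with a force

$$F = \frac{G m_1 m_2}{r^2}$$

Hooke and others had guessed at the inverse-square dependence; what Newton supplied was the mathematical demonstration that such a law yields Kepler’s elliptical planetary orbits and governs falling bodies on Earth alike.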

But relations with Hooke broke down entirely when he pressed his claim to an acknowledgement in the Principia for his own previous work. Newton was furious, and never forgave him. Hooke was for many years secretary of the Royal Society, a body with which Newton had, to start with, an awkward relationship, particularly given the presence of Hooke. But after Hooke’s death, Newton became president of the Society, and the relatively modest reputation which Hooke has today is thought to be due to Newton’s attempts to bury it, once he was in a position to do so. No authentic portrait of Hooke remains, and this is probably Newton’s doing.

By contrast, Newton sat for quite a number of portraits – an indication of his vanity. But he was of course held in high regard by most of his contemporaries for his prodigious talents. Those who got on well with him mostly had the skill to negotiate their way carefully around his prickly personality. An example was Edmond Halley (he of Halley’s comet) who had the task of passing Hooke’s claim to Newton, but managed to do so without himself falling into Newton’s disfavour.

Passions

Newton was long-lived, dying aged 84 – perhaps due to his ascetic style of life and his unquenchable enthusiasm for whatever was his current preoccupation. The early part of his life was mostly spent in Cambridge where he became a fellow, and then the second Lucasian Professor of Mathematics. He lived a mostly solitary existence, and when working on some problem would often work through the night, neglect bodily needs and be deaf to distractions. His absent-mindedness was legendary. Hardly surprising, given these tendencies and his awkward personality, that he was not known ever to have had a close relationship with any individual, sexual or otherwise. Acts of kindness were not unknown, however, and he made many charitable donations in his later, prosperous years. He did strike up one or two friendships, and was fondly protective towards his niece, who kept house for him when he lived in London in later years.

When his mathematical powers waned with age, he found a new talent for administration in his fifties, when offered the post of Warden of the Royal Mint (and later Master). His predecessors had been lazy placemen for whom the post was a sinecure, and it’s thought that, on his appointment, perhaps 95% of the currency was counterfeit. Over succeeding years Newton turned the full force of his concentration to the task, and put the nation’s currency on a sound footing. Forgers were single-mindedly pursued to the gallows, which was where you ended up in those days if convicted of counterfeiting the currency.

So the last part of Newton’s life was spent prosperously, and in the enjoyment of a vast reputation, presiding over his twin fiefdoms of the Royal Mint and the Royal Society, and doing so right up until his death. But I have not mentioned his two other major intellectual enthusiasms, beside the scientific work I have described. One was alchemy – not then distinct from what we now call chemistry. Alchemists of course are remembered mainly for their efforts to create gold, and hence fabulous wealth – but this was not Newton’s aim. The subject was full of occult knowledge and arcane secrets, and for Newton this was one route to a revelation of the universe’s true, unknown nature; he pursued it assiduously, building a vast library and spending at least as much time on it as on his work in what is for us mainstream science. It also had a practical outcome, since he developed from it a thorough knowledge of metallurgy which he put to use in his work in administering the coinage.

His third passion was his research into the history of Christianity and the church. Newton was a deeply pious man, more in a private than a public way. This was partly because Newton’s particular faith was a heretical one, and would have been dangerously so in the earlier part of his life, when England was ruled by the Catholic James II. Newton was obsessed with the now largely forgotten controversy concerning the opposed church fathers Arius and Athanasius. Arian doctrine (not to be confused with the ‘Aryan’ 19th and 20th century racial dogma) held that Christ was a subordinate entity to God, and denied the Holy Trinity taught by Athanasius and adopted by the mainstream church. For Newton, Arianism was the true faith, whose origins, he believed, could be traced back beyond the Christian era, and was the only way to approach the reality of God.

It almost goes without saying that these three obsessions were not independent of one another in Newton’s mind. For him they all served the same purpose – to uncover the mysteries of the universe and the nature of God. Gravitation was a controversial topic at the time, in virtue of its assertion that one body could act upon another without physical contact. (Perhaps a sort of parallel with the issues we have today with the phenomenon of quantum entanglement.) For Newton, the concept was all of a piece with the mysterious action of God – a window into the nature of reality.

Of course Newton’s scientific conception of the universe has now been radically modified by the twentieth century developments of relativity and quantum theory. But there’s a more fundamental sense in which we are still Newtonians: his towering achievement was the scheme of the universe as an integrated whole, governed by mathematically described laws (with some honours also going to his predecessor Galileo). This is the framework within which all our modern scientific endeavours take place.

Estrangement

The brothers as they appear in the book (faces obscured)

So why the note of uncertainty in my title? To explain this I want to digress by describing an image which came into my mind while thinking about it. A few years back, the local people where I live produced a book about our village’s history. An appeal went out for any period photographs that might be borrowed to illustrate the book, and there was a big response. The organiser gave me the task of scanning in all these photos for the book’s publishers, and one of them sticks in my mind. It showed two brothers who were local characters during the 1930s standing in a garden, and a closer examination showed that it had been taken at a wedding. They are wearing their best suits, and are sporting buttonholes. Why is the setting not so immediately obvious? Because the photo had been crudely ripped in two down the middle, with both the brothers in the left half. We can see that one of them is the groom, and that the missing right half contained his bride – only her hand is visible, nestling in the crook of his arm.

I found this mute evidence of some anguished estrangement from the past rather moving. What had seemed like a happy union at the time now had the feminine half of it expunged by someone who was determined that she no longer deserved any place in their thoughts. Yes – you get my drift. The enterprise of science now prefers to go it alone along its own, masculine, analytical path, with any attendant mystery ripped out of the picture, leaving only the barest hint. (See my thoughts on atheism in the previous post.)

It’s worth returning to Newton’s own imagery and repeating the often-quoted passage he wrote towards the end of his life:

I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.

Although he was an arrogant man in his personal dealings (and the opening phrase here hints at how conscious he was of his reputation), I don’t believe this to be mock-modesty. He was also genuinely pious, and all too aware of the mystery surrounding what he had learned of the universe. Today, we look forward to finishing the task of enumerating the pebbles and shells, and are happy to ignore the ocean. In this sense we are more arrogant than he was, and that’s the source of my doubt as to whether we are really Newtonians now.


Ackroyd, Peter: Newton. Vintage Books, 2007

Science and Emotion

Commuting days until retirement: 507

Following my comments three posts ago, my reading on the train over the last week has been The Heretics: Adventures with the Enemies of Science, by Will Storr. Despite booming bishops and other distractions, I found it intensely readable, and I think it pretty much fulfilled my expectations.

The subtitle about ‘the Enemies of Science’ is perhaps a little misleading: this is not primarily an exercise in bashing such people – you’ll find plenty of other books that do that, and any number of internet forums (and of course I had a go at it myself, in the post I mentioned). It’s an attempt to understand them, and to investigate why they believe what they do. Storr does not treat the enemies of science as being necessarily his personal enemies, and it emerges at the same time that the friends of science are not always particularly – well, friendly.

I was going to say that it’s a strength of the book that he maintains this approach without compromising his own adherence to rationality, but that’s not strictly true. Because another strength is that he doesn’t attempt to adopt a wholly godlike, objective view. Rather, he presents himself, warts and all, as a creature like the rest of us, who has had to fight his own emotional demons in order to arrive at some sort of objectivity. And he does admit to experiencing a kind of bewitchment when talking to people with far-out beliefs. ‘They are magic-makers. And, beneath all that a private undercurrent: I feel a kind of kinship with them. I am drawn to the wrong.’

It’s a subtle approach, and a difficult path to tread, which invites misunderstanding. And one critic who I believe misunderstands is Mark Henderson in the Guardian, who, while admiring aspects of Storr’s work, finds the book ‘disappointing and infuriating…. He is like the child who still wants to believe in Father Christmas, but who is just old enough to know better. Life would be more magical, more fun, if the story were true.’ Well, here I think Henderson is unwittingly stating the central thesis of Storr’s book: that as humans we are above all myth makers – we have a need to see ourselves as the hero of our own preferred narrative.

This idea appeals to me in particular because it chimes in with ideas that I have come across in an evening course I am currently following in the philosophy of emotion. Writers on emotion quite naturally classify it into positive (happiness, love, hope, excitement) and negative (sadness, hatred, anger, fear, etc). The naive assumption is that we welcome the former and avoid the latter if we can. But of course the reality is far more nuanced, and more interesting, than that. First there is the compulsion many feel to take part in dangerous and frightening pursuits, and then of course the undoubted delights of sado-masochism. But much closer to home is the fact that we flock to horror films and emotionally harrowing dramas – we love to be vicariously frightened or distressed. Narrative is our stock in trade, and (as the increasingly popular creative writing courses preach) unless there’s plenty of conflict and resolution, nobody will be interested.

Mythology

We all have our own unspoken narratives about our own place in the world, and in most cases these probably cast us in a more heroic light than the accounts others might give of us. They help to maintain our emotional equilibrium, and in cases where they are nullified by external circumstances, depression, despair and even suicide may result. And of course with the internet, bloggers like me can start to inflict our own narratives on a bigger potential audience than ever (I wish). And earlier theories of the world are of course entirely myth- and narrative-laden, from ancient Greek culture to the Bible and the Koran. Our cultural heritage fits around our instinctive nature. (As I tap this passage into my phone on the train, the man immediately next to me is engrossed in a Bible.) How difficult, then, for us to depart from our myths, and embrace a new, evidence-based, and no longer human-centred, story of creation.

Storr encounters one of those for whom this difficulty is too great. John Mackay is a genial, avuncular Australian (there’s plenty of footage on YouTube) who has been proselytising worldwide on behalf of the creationist story for some years. In the gospel according to Mackay, everything that seems evil about nature stems from the fall of Adam and Eve in the Garden of Eden. Before this there were no thorns on plants, men lived happily with dinosaurs and nobody ever got eaten – all animals were vegetarians. A favourite of mine among his pronouncements is where Storr asks him why, if Tyrannosaurus Rex was a vegetarian, it had such big, sharp teeth. The answer (of course) is – water melons.

On YouTube I have just watched Mackay demonstrating from fossil evidence that there are plants and animals which haven’t evolved at all. This is a fundamental misunderstanding of Darwin: if organisms aren’t required by external forces to adapt, they won’t. But of course on Mackay’s time scale (the Earth being six thousand years old) there wouldn’t have been enough time for fossils to form, let alone for anything to evolve very much. The confusion here is manifold. For his part, Storr admits to having started out knowing little about evolution theory or the descent of man, and to having taken the scientific account on trust, as indeed most of us do. But his later discussions with a mainstream scientist demonstrate to him how incomparably more elegant and cogent the accepted scientific narrative is.

How objective can we be?

Henderson charges Storr with not giving sufficient recognition to the importance of the scientific method, and how it has developed as a defence of objective knowledge against humanity’s partial and capricious tendencies. But Storr seems to me to be well aware of this, and alert to those investigators whose partiality throws doubt on their conclusions. ‘Confirmation bias’ is a phrase that runs through the book: people tend to notice evidence which supports a belief they are loyal to, and neglect anything that throws doubt on it. A great example comes from a passage in the book where he joins a tour of a former Nazi concentration camp with David Irving, the extreme right-wing historian who has devoted his life to a curious crusade to minimise the significance of the Holocaust, and exculpate Hitler. Storr is good on the man’s bizarre and contradictory character, as well as the motley group of admirers touring with him. At one point Irving is seeking to deny the authenticity of a gas chamber they are viewing, and triumphantly points out a handle on the inside of the open door. He doesn’t bother to look at the other side of the door, but Storr does afterwards, and discovers a very sturdy bolt. You are led to imagine the effect of a modus operandi like this on the countless years of painstaking research that Irving has pursued.

But should we assume that the model of disinterested, peer-reviewed academic research we have developed has broken free of our myth-making tendencies? Those who are the most determined to support it are of course themselves human beings, with their own hero narratives. Storr attends a convention of ‘Skeptics’ (he uses the American spelling, as they themselves do) where beliefs in such things as creationism or belief in psychic phenomena are held up to ridicule. He brings out well the slightly obsessional, anorak-ish atmosphere of the event. It does, after all, seem a little perverse to devote very much time to debunking the beliefs of others, rather than developing one’s own. It’s as if the hero narrative ‘I don’t believe in mumbo-jumbo’ is being exploited for purposes of mutual and self-congratulation. The man who is effectively Skeptic-in-Chief, the American former magician James Randi, is later interviewed by Storr, and comes across as arrogant and overbearing, admitting to sometimes departing from truthfulness in pursuit of his aims.

If scientists, being human, are not free of personal mythology, could this work against the objectivity of the enterprise? I think it can, and has. Some examples come to mind for me. The first is Ignaz Semmelweis, a Hungarian physician in the early to mid 19th century. In the days before Pasteur and the germ theory of infection, he was concerned by the number of maternal deaths in his hospital from what was called ‘puerperal fever’. This seemed to be worse in births attended by doctors, rather than midwives. In a series of well-executed investigations, he linked this to doctors who had come to the maternity ward after performing post-mortems, and further established that hand-washing reduced the incidence of the disease. But the notion that doctors themselves were the cause did not meet with approval: an obvious clash with the hero narrative. Semmelweis’s findings were contemptuously rejected, and he later suffered a breakdown and died in an asylum. A similar example is the English Victorian physician John Snow, who, in a famous investigation into cholera in Soho, conclusively showed it to be spread via a water-pump, in contradiction with the accepted ‘miasma’ theory of airborne infection. He further linked it to pollution of the water supply by sewage – but something so distasteful was a conclusion too far for the Victorian narratives of human pride and decency, and the miasma theory continued to hold sway.

Both these examples of course come from medicine, where conclusive and repeatable results are harder to come by, and easier to contest. So let’s go to the other extreme – mathematics. You would think that a mathematical theorem would be incontrovertible – hardly the sort of thing to offend anyone’s personal sensibilities. But around the turn of the 20th century Georg Cantor’s work on set theory led him to results concerning the nature of infinity. The consequent attacks on him by some other mathematicians, often of the most ad hominem kind, calling him a charlatan and worse, showed that someone’s personal myths were threatened. Was it their religious beliefs, or the foundations of mathematics on which their reputations depended? I don’t know – but Cantor’s discoveries are nowadays part of mainstream mathematics.
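
The result at the centre of the storm can be stated compactly in modern notation: Cantor’s diagonal argument shows that the real numbers cannot be paired off one-to-one with the natural numbers, so that there are strictly different sizes of infinity:

$$\aleph_0 = |\mathbb{N}| < 2^{\aleph_0} = |\mathbb{R}|$$

and since the argument applies to any set whatever, the ladder of infinities never ends.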

Modern heresy

My examples are from the past, of course: I wanted to look at investigators who were derided in their time, but whose discoveries have since been vindicated. If there are any such people around today, their vindication lies in the future. And there is no shortage of heterodox doctrines, as Storr shows. Are any of them remotely plausible? One interesting case is that of Rupert Sheldrake, to whom Storr gives some space. He has an unimpeachable background of education in biology and is a former Cambridge fellow. But his theory of morphic fields – mysterious intangible influences on biological processes – puts him beyond the pale as far as most mainstream scientists are concerned. Sheldrake, however, is adamant that his theory makes testable predictions, and he claims to have verified some of these using approved, objective methods. Some of them concern phenomena known to popular folk-lore: the ability to sense when you are being stared at, and animals who show correct anticipation of their absent owners returning home. I can remember playing games with the former at school – and it seemed to work. And I have read Sheldrake’s book on the latter, in which he is quite convincing.

But I have no idea whether these ideas are truly valid. Storr tells of a few cases where regular scientists have been prepared to try to repeat Sheldrake’s results with these phenomena, but most attempts degenerate into arcane wrangling over the details of experimental method, and no clear conclusions emerge. What is clear to me is that most orthodox scientists will not even consider, publicly, such matters, since doing so is seen as career suicide. Is this lack of open-mindedness also therefore a lack of objectivity? Lewis Wolpert is quoted in Storr’s book: ‘An open mind is a very bad thing – everything falls out’, a jibe repeated by Henderson. You could retort that the trouble with a closed mind is that nothing gets in. There is a difficult balance to find here: of course a successful scientific establishment must be on its guard against destructive incursions by gullibility and nonsense. On the other hand, as we have seen, this vigilance becomes part of the hero narrative of its practitioners, and may be guarded so jealously that it turns in some cases into an obstacle to advances.

Sheldrake tells Storr that his theories in no way destroy or undermine established knowledge, but add to it. I think this is a little disingenuous of him. If we have missed something so fundamental, it would imply that there is something fundamentally wrong about our world-view. Well, of course it would be arrogant to deny that there is anything at all wrong with our world-view (and I think there is plenty – something to return to in a later post). But Storr’s critic Henderson is surely right in holding that, in the context of a systematically developed body of knowledge, there is a greater burden of proof on the proponent of the unorthodox belief than there is on the opponent. Nevertheless, I agree with Storr that the freedom to promote heterodox positions is essential, even if most of them are inevitably barmy. It’s not just that, as Storr asserts near the end of the book, ‘wrongness is a human right’. Occasionally – just very occasionally – it is right.