
The Secret Language of Creativity

It has been estimated that the vast majority of human communication is not only non-verbal but also non-conscious.

We often think about non-verbal communication simply in terms of body language, but in fact it plays a major role in all forms of human communication, particularly in areas such as the visual arts, cinema, theatre, music and dance.

Research suggests that the unconscious mind processes information far more quickly and efficiently than the conscious mind.

Indeed, it has been shown that the conscious mind is in fact a low-bandwidth data processor, which can only handle around 40 bits of information per second; whilst the unconscious mind is, effectively, a massive-bandwidth data processor, capable of processing over 13 million bits of information per second.

And since it processes information at this rate, our unconscious mind experiences much that is beyond the comprehension of the conscious mind.

Creative ideas come in all shapes and sizes, and we often refer to those ideas that have the greatest cultural impact as ‘Big Ideas’. All creative ideas are processed in the unconscious before they surface in the conscious mind, but ‘Big Ideas’ tend to require a proportionally larger amount of unconscious processing.

One of the first to recognise the importance of the unconscious in the creative process was the French mathematician and theoretical physicist Henri Poincaré (1854–1912). Through a process of trial and error, Poincaré found that consciously considering a complex problem would only get him so far, and that it was only after putting the task aside for a while that the answer would come to him. Saying ‘it is by logic that we prove, but by intuition that we discover’, Poincaré developed a daily routine based on this insight, consciously working for just two hours in the morning and two hours in the evening, and giving over the remainder of the day for the unconscious to mull things over.

Using this technique, and by harnessing the processing power of the unconscious mind, Poincaré was able to imagine mathematical structures of the most extraordinary complexity.

Indeed, his ideas went on to form the basis of complexity theory, a branch of mathematics that could only begin to be visualised in the late 1970s, when computers with enough processing power finally arrived to bring Poincaré’s ideas to life.

By harnessing the power of the unconscious mind, Poincaré was capable of imagining mathematical structures that could only be visualised more than half a century after his death, with the arrival of computers with significant processing power in the late 1970s. Pictured is one such computer-generated image, known as a ‘Mandelbrot set’.

Poincaré’s ideas about creativity were to influence an entire generation, including such important Modernist figures as Pablo Picasso and Albert Einstein, but one person who was particularly influenced by Poincaré’s understanding of the role of the unconscious in the creative process was an English social psychologist called Graham Wallas (1858–1932).

In 1926, Wallas published a book called The Art of Thought, in which he presented a model of the creative process, based on Poincaré’s method, consisting of four consecutive stages:

The Wallas Four-Stage Creative Process:
1) Preparation: the stage at which the problem is consciously ‘investigated in all directions’
2) Incubation: the stage at which the problem is put to one side in favour of unconscious processing
3) Illumination: the stage at which the creative idea surfaces into conscious awareness
4) Verification: the stage at which the idea is consciously verified, and then applied

With minor modifications, this four-stage process of PREPARATION, INCUBATION, ILLUMINATION and VERIFICATION has since been widely recognised as the standard model for the creative process.

As Wallas’s four-stage process indicates, creative ideas appear after a conscious period of PREPARATION, followed by an unconscious period of INCUBATION; and the amount of time and effort that has been devoted to the problem at these two critical stages is usually reflected in the scale, complexity and, ultimately, value of the result.

When evaluating creative ideas, I believe, therefore, that it is useful to place them on a spectrum, with Low-Bandwidth ideas at one end of the scale and High-Bandwidth ideas at the other, depending on how much non-conscious processing power has been brought to bear on the problem.

An extremely Low-Bandwidth idea is one that has been quickly produced by the 40-bits-per-second conscious mind whilst, at the other end of the scale, an extremely High-Bandwidth idea is one produced over time by the 13-million-bits-per-second non-conscious mind.


In some instances the creative process may represent the investment of the odd hour here or there; in others it may represent weeks or even months of work; whilst, in truly exceptional circumstances, the realisation of a creative idea might represent the work of a lifetime.

So, for example, the kind of work that does not demand much unconscious processing – such as creativity within a pre-determined template, like a ‘Painting by Numbers’ kit or a simple tweak to an existing recipe – can be characterised as ‘Low-Bandwidth Creativity’.

Whereas the kind of work that draws on the wider resources of the unconscious – such as a novel, a screenplay, a great painting, or a symphony – can be characterised as ‘High-Bandwidth Creativity’.


Thus, when we experience High-Bandwidth Creativity in the form of a great movie or a profoundly moving piece of music, it is largely the subliminal elements, the ones that speak to us ‘below the radar’ of consciousness, that determine how we actually experience the work.

In this way, ‘Big Ideas’ not only require a greater amount of unconscious processing power, they also communicate with the unconscious mind of the audience on a much broader bandwidth, and therefore have a greater cultural impact.

At their best, ‘Big Ideas’ are those which communicate across the full spectrum of consciousness, addressing not just the tip of the iceberg that is the conscious mind, but also the vast submerged mass that is the unconscious.

It is only in this manner that we can begin to communicate with the whole of our being, and to hope to express what it is to be fully human.


A portrait of the artist

Like so many creative ideas, it all began by chance.

It was 1971. Two photographs had fallen on to the floor of his studio, and David Hockney was staring down at them, transfixed by the fact that they had landed in a way that suggested an intriguing relationship between them…

The first of these two photographs featured a figure – Hockney’s ex-lover Peter Schlesinger – with his head tilted slightly, looking down at something on the ground. The second featured a figure swimming in a pool beneath the surface of the water. And whilst the first was a literal photographic representation, in the second the body of the submerged swimmer was fragmented into a set of beautiful abstract shapes by the ripples on the surface of the water.

And the way in which the photographs had fallen seemed to suggest that the first figure was peering over the edge of the pool at the swimmer in the water below.

This composition was to become the basis of one of the most widely recognised and well-loved of David Hockney’s paintings: ‘Portrait of an Artist (Pool with Two Figures)’. When this painting went to auction at Christie’s in New York on 15 November 2018, it sold for $90.3 million, making it the most expensive work of art by a living artist ever sold at auction.

Britain’s first openly gay artist, Hockney was born in 1937 in the bleak post-industrial landscape of Northern England, and raised during a time of rationing, sexual repression and rampant homophobia.

Unsurprisingly, California proved an irresistible attraction for the young artist. As he put it later: ‘I came to Los Angeles for two reasons. The first was a photo by Julius Shulman of Case Study House #21, and the other was AMG’s Physique Pictorial.’

Shulman’s photo featured a mid-century modernist building with a swimming pool that served to exaggerate the graphic quality of the steel and glass structure, whilst Physique Pictorial was America’s first and foremost gay magazine featuring the extravagantly homoerotic photography of Bob Mizer.

Photo by Julius Shulman of Case Study House #21
Physique Pictorial, generally considered to be America’s first gay magazine, featured the openly homoerotic photography of Bob Mizer

Together, these two visual elements – the male body, and the domestic swimming pools of Los Angeles – would define David Hockney’s work for many years to come.

Hockney had moved to California in 1963, and flying into Los Angeles, he had been struck by the sheer number of these domestic swimming pools. As he later told Diane Hanson in 2009: ‘I looked down to see blue swimming pools all over, and I realised that a swimming pool in England would have been a luxury, whereas here they are not.’

The move was to have a profound effect on Hockney – in much the same way that moving to Provence had had such a profound effect on the painting of Vincent Van Gogh a century earlier – the swimming pools of Los Angeles being Hockney’s equivalent of the Dutch artist’s sunflowers.

And little wonder.

These sparkling and shimmering waters, stretching out beneath the immaculate blue of cloudless Californian skies, seemed to symbolise everything that the boy who had grown up in the black-and-white bleakness of post-war Bradford had longed for.

For the first time, David Hockney found himself, quite literally, in his element.

This element was not, however, without its technical challenges.

As Hockney says: ‘It is an interesting formal problem; it is a formal problem to represent water, to describe water, because it can be anything. It can be any colour and it has no set visual description’… ‘[The pool paintings] were about the surface of the water, the very thin film, the shimmering two-dimensionality.’

But this was also a time of massive technical innovation for Hockney, for it was at this point that he switched from traditional oil paints to the recently developed acrylics, which offered much faster drying times. At the same time he also began to use photography as part of his practice, and this would gradually lend his paintings greater and greater representational realism. Thus, in order to try to capture ‘the shimmering two-dimensionality’ of the surface of these LA pools, Hockney began to experiment with various media, including acrylics, watercolours, crayons and lithographs, as well as his later technique of pressing dyed, wet paper pulp into sheets of paper.

Hockney’s earliest California works from 1964 depict water as inky splashes of blue and grey, but by 1966 – with paintings like Peter Getting Out of Nick’s Pool and Sunbather – he had begun to develop a more stylised visual language, in which he uses interlocking tubular shapes in a ‘Pop Art’ style to suggest the shimmering surface of the water.

However, perhaps the most famous of these experiments came the following year, with ‘A Bigger Splash’, painted in 1967. Here a pool that reflects a sky made up of a large flat field of cobalt blue is splashed with a Jackson Pollock-style action-painting plume of pure white paint.

It was around this time that David Hockney began teaching at UCLA, and it was here that he first met Peter Schlesinger, who was a student there at the time. They began a long-lasting relationship, and Schlesinger became not only his lover but also his inspiration and favourite model, appearing in countless paintings over the next few years. Later, Schlesinger relocated with Hockney to London, where he went on to study at the Slade School of Art.

By the early 1970s, however, the relationship was falling apart.

Jack Hazan’s 1974 documentary film A Bigger Splash therefore covers not only the long, lingering breakup with Peter Schlesinger, from 1970 to 1973, but also the creation of ‘Portrait of an Artist (Pool with Two Figures)’ between 1971 and 1972.

A Bigger Splash (1974). A Film by Jack Hazan. Trailer.

As Hazan’s film shows, David Hockney actually made two separate attempts at the painting that was to become known as ‘Portrait of an Artist’.

The first he started in 1971, having been inspired by the chance juxtaposition of two photographs that had fallen onto the studio floor: ‘One was of a figure swimming underwater and therefore quite distorted… the other was a boy gazing at something on the ground,’ Hockney said later. ‘The idea of painting two figures in different styles appealed so much that I began the painting immediately.’

However, the two photographs did not sit well together, and after months of working and reworking the image, Hockney was still not happy with the composition, and eventually destroyed the painting.

In April 1972 Hockney needed new work for a one-man show at the André Emmerich Gallery in New York, and although the show was due to open in just four weeks’ time, he resolved to return to the theme of ‘Portrait of an Artist’.

However, Hockney now realised that he needed to shoot new reference material for the composition that had been suggested by the two photographs he had found on his studio floor. He would solve the technical problems presented by his previous attempt at ‘comping’ together two different photographs by shooting the whole scene from one point of view, on one camera, with the same lens.

In order to do this, he decided to return to a pool location he knew well. Significantly, this was not in Los Angeles but in the Massif des Maures, the mountains that overlook Saint-Tropez in the South of France.

Hockney remembered that there was an idyllic pool looking out over the mountains at a celebrity hideaway called Le Nid du Duc. Against this beautiful setting, he now staged a series of reference photographs, using an assistant and a friend as models, that he could use as source material for the final painting.

Preparatory photograph for Portrait of an Artist (Pool with Two Figures), at Le Nid-du-Duc, 1972 © David Hockney

Returning to London he then did a second photographic shoot in Kensington Gardens, reversing the angle of the morning light, but matching the POV and using the same camera lens to capture the figure of ‘the artist’.

For this he asked Peter Schlesinger to model for him one last time – wearing the same pink jacket his assistant had worn in the South of France.

Hockney taking photographs of his ex-partner, Peter Schlesinger, in Kensington Gardens, from which he created the pink-jacketed figure standing at the pool’s edge. Film still from A Bigger Splash, 1974. Photo: Jack Hazan

Using these elements for reference, Hockney then set about the physical act of painting. He worked a full 18 hours a day for two weeks, until he eventually finished the artwork the night before it was due to be shipped to New York. (One of the benefits of working in acrylics is, of course, their rapid drying time).

Hockney worked 18 hours a day non-stop for two weeks to finish his painting, finally completing it the night before it was due to be shipped to New York. Film still from A Bigger Splash, 1974. Photo: Jack Hazan. Artwork: © David Hockney

‘I must admit I loved working on that picture,’ he would later recall of that fortnight, ‘working with such intensity; it was marvellous doing it, really thrilling.’

Given the extraordinary circumstances under which the painting was created, and the way in which so much of this was documented in Jack Hazan’s film, there have been many who see the work only in the context of Hockney’s failing relationship with Peter Schlesinger.

However I think there is more to this painting than that.

Indeed, in an interview in 2018 Schlesinger told the Observer: “It’s an amazing picture, and it contains his two most iconic genres in one picture, swimming pools and double-portraiture, but I can’t speak to its emotional element because I don’t think it is emotional. There’s the figure of me standing and the figure in the water differently painted. It was a conceptual problem. I don’t even think it’s a portrait of me, really.”

What had initially excited Hockney about the project suggested by the two separate photographs was the opportunity they afforded him to explore two different representations of reality – one highly literal, the other highly distorted – both within the same painting.

This reflected the wider tension in 20th century painting – as well as in his own work – between figurative art and more abstract forms of expression.

In the painting, we see – as promised in the title – an artist, looking over the edge of an outdoor swimming pool, considering two of Hockney’s favourite subjects – the male body and water.

As Hockney said in an interview with Fran Morrison in 1980:

‘Water, the idea of drawing water, is always appealing to me. If it’s clear water anyway, transparent water. You can look on it, through it, into it, see it as volume, see it as surface… the idea of representing it has always rather fascinated me and I keep going back to it.’

In ‘Portrait of an Artist’, Hockney avoids the highly specific domestic architecture of mid-century Los Angeles – something that we associate with so many of his pool paintings – replacing it here with a more natural landscape that creates a more mythic, universal setting for the two figures.

The composition of this landscape has an exquisite rhythmic feel to it, formed as it is from a series of interlocking triangles. These triangles subliminally echo the main triangle in the composition, created by the vertical figure of the artist and the horizontal figure of the swimmer.

This main triangle is designed to emphasise the central drama of the painting – the relationship between the artist and the mysterious figure swimming toward him.

Equally important to note is that the two figures exist in two different and distinct realities.

The artist stands vertical at the edge of the pool; he is clothed and cannot enter the water. The swimmer is a horizontal figure, fully submerged in the water. The artist is clearly conscious of the swimmer. The swimmer, we feel, is not necessarily conscious of the artist.

And perhaps this is the key to understanding the real emotional power of this painting. Because, at some level, the figure of the artist seems to represent the conscious mind, and the swimmer the unconscious.

Whatever ideas you bring to this painting, however, the obvious question it seems to pose is: what happens next? The submerged swimmer is clearly a breath away from reaching the end of the pool; at which point, will he break the surface and come face to face with the artist?

And, for me at least, this is the central drama of Portrait of an Artist (Pool with Two Figures); because whatever else it is, it always feels like an articulation of that extraordinary moment just before something new – something that has been slowly emerging out of the depths of the artist’s unconscious – breaks the surface of the water and emerges into the clear air of consciousness.

David Hockney (b. 1937), Portrait of an Artist (Pool with Two Figures), painted in 1972.
Acrylic on canvas. 84 x 120 in (213.5 x 305 cm).


Death by PowerPoint


Originally designed as a sales-presentation program, the rigid way in which PowerPoint forces information to be structured means that it can be a dangerous liability when used in complex situations where comprehension can mean the difference between life and death.

Indeed, it is widely thought that the rigid structure of PowerPoint was – at least in part – responsible for the Columbia Space Shuttle disaster on February 1, 2003.

When Marshall McLuhan declared that “the medium is the message”, he surreptitiously slipped a remarkable and revolutionary new idea into late 20th century popular culture.

This seemingly innocent idea signalled the realisation that the very act of reading a text is often of greater significance than the information contained within the text itself.

And it is this insight that can help us understand the problem with Microsoft PowerPoint: the presentation programme which has become notorious as much for its characteristics as a medium as for its ability to communicate a message.

PowerPoint was invented in the mid-1980s by Robert Gaskins, a visionary entrepreneur who spotted the opportunity for dedicated sales-presentation software offered by the emergence of graphics-based personal computers, and the vast – but largely unrecognised – market for preparing business slides.

As a sales tool, Gaskins designed PowerPoint to give the presenter the ‘power’ to make their ‘point’ in a structured, hierarchical fashion.

However, PowerPoint presentations are often linked less by internal logic or narrative coherence than by the illusion of hierarchical structure created by the program.

And it is this combination of the lack of a coherent narrative, and an unrelenting hail of bullet points, that has made PowerPoint notorious for its unique ability to reduce audiences to catatonic states of mind-numbing boredom.

But the real problem these days is that PowerPoint is no longer restricted to use in sales pitches. Indeed, you will now find PowerPoint presentations anywhere that one person needs to address others in a formal setting.

This includes organisations like NASA and the US military, where PowerPoint’s inability to deal with complexity, and its rigid hierarchy, have meant that ‘Death by PowerPoint’ is no longer simply a metaphor.


Whether it be for a high-level Joint Staff meeting in Washington or for a platoon leader’s pre-mission combat briefing of his men in a remote valley in Afghanistan, the amount of time spent creating PowerPoint presentations has made the program a running joke among many members of the US military.

But apart from the time wasted, and behind all the jokes about ‘PowerPoint Rangers’, there have been some very serious concerns among senior staff that the program itself stifles discussion, critical thinking and thoughtful decision-making.

In an article in the New York Times entitled “We Have Met the Enemy and He Is PowerPoint”, journalist Elisabeth Bumiller describes how the program has become deeply embedded in a military culture that has come to rely on PowerPoint’s hierarchical ordering to bring the illusion of order to a confused world.

“PowerPoint makes us stupid,” Gen. James N. Mattis of the Marine Corps, the Joint Forces commander, said in a speech at a military conference in North Carolina. (The speech was delivered without PowerPoint.)

Many senior US military figures are concerned that daily PowerPoint presentations are curbing discussion, critical thinking and informed decision-making.
Photo credit: Bryan Denton for The New York Times

And Brig. Gen. H. R. McMaster, who banned PowerPoint presentations when he led the successful effort to secure the northern Iraqi city of Tal Afar in 2005, followed up at the same conference by likening PowerPoint to an internal threat. “It’s dangerous because it can create the illusion of understanding and the illusion of control,” General McMaster said…“Some problems in the world are not bullet-izable.”


But this is to miss the point. The real strength of PowerPoint is as a propaganda tool. As journalist Elisabeth Bumiller puts it, quoting one senior US military officer, PowerPoint is “handy when the goal is not imparting information, but the opposite, as in briefings for reporters…. The news media sessions often last 25 minutes, with 5 minutes left at the end for questions from anyone still awake… Those types of PowerPoint presentations are known as ‘hypnotizing chickens’.”

Perhaps the most infamous example of the US military “hypnotizing chickens” came when General Colin Powell, then Secretary of State for the Bush administration, made his pitch to the United Nations Security Council for war in the Middle East (“Go up there and sell it,” Vice President Dick Cheney is reported to have said to him beforehand), with a highly imaginative presentation of the “evidence” for the existence of weapons of mass destruction in Iraq.

Edward Tufte is the world’s leading thinker in the visual display of information. Since the publication of “The Visual Display of Quantitative Information” in 1982, he has consistently championed the importance of good information design.

In the wake of the tragic 2003 Columbia Space Shuttle disaster, Tufte published an article in Wired magazine entitled “PowerPoint Is Evil”, and followed this up with a more scholarly, less dramatically titled pamphlet, “The Cognitive Style of PowerPoint”.

Here Tufte argued that the programme encourages “faux-analytical” thinking that favours the slickly produced “sales pitch” over the sober exchange of information.

“Imagine a widely used and expensive prescription drug that promised to make us beautiful but didn’t. Instead the drug had frequent, serious side effects: It induced stupidity, turned everyone into bores, wasted time, and degraded the quality and credibility of communication. These side effects would rightly lead to a worldwide product recall.

Yet slideware – computer programs for presentations – is everywhere: in corporate America, in government bureaucracies, even in our schools. Several hundred million copies of Microsoft PowerPoint are churning out trillions of slides each year. Slideware may help speakers outline their talks, but convenience for the speaker can be punishing to both content and audience. The standard PowerPoint presentation elevates format over content, betraying an attitude of commercialism that turns everything into a sales pitch.”

In “The Cognitive Style of PowerPoint”, Tufte’s analysis reveals that PowerPoint has a number of specific design faults which render it incapable of communicating certain types of information.

Firstly, Tufte says, it is designed to guide and reassure the presenter, rather than to communicate with the audience. Secondly, the outliner function causes ideas to be arranged in an unnecessarily deep hierarchy, itself subverted by the need to restate the hierarchy on every slide. (What’s more, the audience is forced to follow the presenter’s thinking in lockstep, linear progression through this hierarchy, whereas with handouts, readers could browse and relate items at the same time.) And thirdly, and perhaps most importantly, it has a natural tendency to oversimplify thinking, with ideas being squashed into bulleted lists, and stories with a beginning, middle and end being turned into a collection of disparate bullet points.

A central part of this analysis was the Columbia Space Shuttle disaster, which, Tufte demonstrates, was caused in part by the use of PowerPoint.

Death by PowerPoint? The seven crew members who died aboard the Columbia, following an over-reliance on PowerPoint for communicating critical information. They were: Rick Husband, Commander; William C. McCool, Pilot; Michael P. Anderson, Payload Commander; David M. Brown, Mission Specialist 1; Kalpana Chawla, Mission Specialist 2; Laurel Clark, Mission Specialist 4; and Ilan Ramon, Payload Specialist 1.

The Columbia Space Shuttle was destroyed on February 1, 2003, while re-entering the atmosphere after a 16-day scientific mission. A hole had been punched in the carbon-composite leading edge of one of Columbia’s wings when a piece of insulating foam from the external fuel tank peeled off during the launch and struck the shuttle. During the intense heat of re-entry, hot gases penetrated the interior of the wing, destroying the support structure and causing the rest of the shuttle to break apart.

Exhibit A in Tufte’s analysis is a PowerPoint slide, which you can see below, and which had been presented to NASA senior managers in January 2003, while the space shuttle Columbia was in space and they were trying to understand the risk posed by the damage the piece of insulating foam had caused to the shuttle’s wing.

Unfortunately, as you can see from Tufte’s analysis, the critical piece of information – the piece of information that could have prevented the disaster – is buried so deeply in the rigid PowerPoint format that it is effectively useless.

“It is easy to understand how a senior manager might read this PowerPoint slide and not realize that it addresses a life-threatening situation,” the Columbia Accident Investigation Board concluded, citing Tufte’s work, and strongly criticized the culture within NASA in which, it said, “the endemic use of PowerPoint” had been substituted for rigorous technical analysis.

It is now more than twenty-five years since PowerPoint was first launched as a sales-presentation aid. It is installed on over one billion computers around the world, and its use is no longer confined to the sales pitch: it is used everywhere, from church services to military briefings, from schools to hospitals.

And you don’t have to be a NASA rocket scientist to see what a dangerous idea this is…


The Hubble images – deep space or deep fake?

Spectacular images with names like ‘Mystic Mountain’ and the ‘Twin Pillars of Creation’, produced by the Hubble Heritage Project, have redefined how we imagine the universe around us.


‘We are all in the gutter, but some of us are looking at the stars,’ as Lord Darlington says in Lady Windermere’s Fan.

And if you look up into the night sky for long enough, you may catch sight of a mysterious glinting object sailing silently across the face of those stars.

This is probably a satellite: one of the more than a thousand such objects which currently orbit our planet.

Most satellites face back down towards the surface of the Earth, in order to perform such mundane tasks as providing weather reports, maintaining phone networks or beaming television channels into people’s living rooms.

One, however, is orientated in accordance with Lord Darlington’s epigram, and faces in the opposite direction: away from the Earth, up into the stars.

This satellite is the Hubble Space Telescope, which, from its vantage point, high above the Earth’s atmosphere, has fundamentally changed the way that we think about the universe.

In 1995, the telescope was used to take a 10-day exposure of a seemingly empty area of deep space. Many doubted the value of this operation, but the resulting image, known as the “Hubble Deep Field”, revealed an abundance of new galaxies, beyond the Milky Way. This exercise was repeated in 2004 with an experiment called “Ultra Deep Field,” in which more than 5,000 new galaxies were discovered, some as far as 13.2 billion light-years away. Since the light from these galaxies has taken this long to reach our solar system, these images offer a window onto what the universe looked like only a short time after the Big Bang, some 13.7 billion years ago.

As well as affording such extraordinary insights into the origins of the universe, the Hubble Space Telescope has also been the source of some of the most spectacular images of the cosmos that our world has ever seen; and these images have profoundly altered the way that we now imagine outer space.

These images are actually created here on Earth by an organisation called the Hubble Heritage Project, a dedicated group of astronomers and image processing specialists, set up by NASA to promote the benefits of its work to a wider public.

Since its formation in 1997, this unit has produced a series of spectacular images of nebulae, galaxies, and other celestial phenomena that have become the standard visual representation of the universe in anything from school textbooks to science fiction movies.

And yet, on closer examination, it would seem that the way in which these images are constructed says more about the culture that created them than it does about the phenomena that they appear to portray.

What most people do not realise is that many of the dramatic views that are published by the Hubble Heritage Project bear little resemblance to anything that can ever be seen in reality, anywhere in the universe.

This is because these images are in fact a product of the creative imagination, made up, on the one hand, of data that is not visible to the naked eye and, on the other, of the combination of multiple sets of visible data.

For a start, the Hubble’s onboard digital cameras only record grey-scale pixels, and the actual ‘pictures’ that the telescope captures are rather dull, blurred, black-and-white affairs.

The Hubble Heritage Project takes this rather unassuming visual data – as well as many elements which are not visible to the human eye – and makes its own visual interpretation of this information. First, they decide which way is ‘up’ in the picture (in space there is, of course, no ‘up’ or ‘down’); then invisible elements are given physical form, and colour and lighting effects are introduced to create highly dramatic compositions. In this way, otherwise formless shapes are purposely arranged to look like a fantastical, yet strangely familiar, physical landscape.

The resulting images feature a highly distinctive visual language.

Elizabeth Kessler, in her book ‘Picturing the Cosmos: Hubble Space Telescope Images and the Astronomical Sublime’, has shown that this distinctive visual language in fact bears a striking resemblance to the work of 19th century frontier artists of the American West, like Thomas Moran and Albert Bierstadt.

‘Reference to the familiar visual iconography of 19th century landscapes of the American West threads through the Hubble Heritage Project’s efforts to reach a broad audience. The comparison of Hubble images and Romantic landscapes begins with their shared features, similarities in appearance that link two sets of images made more than a century apart: their color palettes, a focus on small regions within larger objects, dramatic backlighting, towers and pillars, a sense of overwhelming size and scale.’

Kessler, who has spent a lot of time studying this imagery and talking with members of the Hubble Heritage Project, goes on to say that:

‘Rather than creating something entirely new, astronomers… extended an existing mode of visualization and representation—one associated with exploration—to a new phase of discovery. The mythos of the American frontier functioned as the framework through which a new frontier was seen.’

These days, few would deny the enormous contribution the Hubble telescope has made to the sum total of scientific knowledge. However, to understand why the Hubble Heritage Project – and presumably NASA itself – might want to evoke the spirit of 19th century American Frontier Art in its portrayal of the cosmos, it is important to remember that this was not always the case.

With an original estimated cost to the American taxpayer of around US$400 million, NASA had hoped to launch the Hubble Space Telescope in 1983, but the schedule had slipped as deadline after deadline was missed.

When the Telescope was finally made ready in 1986, NASA was to suffer one of the greatest blows in its history. On 28 January that year, the space shuttle Challenger disintegrated over the Atlantic Ocean, killing its crew of seven. The disaster grounded the shuttle fleet for the best part of three years. Without the shuttle, Hubble could not be launched into orbit.

When the telescope was finally launched, in April 1990, some seven years late and billions of dollars over budget, NASA scientists were devastated to discover that its main mirror had been polished to the wrong specifications.

The fault was minuscule, yet it was enough to render the telescope incapable of performing most of the tasks for which it had been designed.

In the months that followed, the telescope, and NASA itself, became the butt of many jokes, and the project was universally regarded as a white elephant. In July 1990, a Newsweek magazine cover branded it ‘NASA’S $1.5 BILLION BLUNDER’, and the following year the comedy film The Naked Gun 2½ depicted the Hubble telescope alongside the Titanic and the Hindenburg. The name Hubble had become a byword for misjudgment on a spectacular scale.

The laughter continued well into December 1993, when NASA sent up a Space Shuttle servicing mission to attempt to correct the Hubble’s poor vision with what many gleefully referred to as ‘space glasses’.

And then, suddenly, the laughter stopped.

One after another, awe-inspiring images of the furthermost corners of the cosmos now began to emerge from the Hubble Heritage Project team. These spectacular images, with names like ‘Mystic Mountain’ and the ‘Twin Pillars of Creation’, began to redefine how we imagined the universe around us.

With these images, the Hubble Heritage Project was attempting to portray the world beyond the frontiers of normal human experience, so it is perhaps no surprise that they took their visual language cues from the American frontier artists who, in the 19th century, had similarly portrayed the American West as a vast uninhabited, unspoiled Eden for the benefit of the population back East.

The story of the expansion to the West is a cornerstone of the United States’ national identity. In 1991, an art exhibition held at the Smithsonian American Art Museum in Washington DC, called The West as America: Reinterpreting Images of the Frontier, 1820–1920, ignited a nationwide controversy about American Frontier art.

In the exhibition catalogue, one of the show’s main curators, William H. Truettner, had written:

‘…images from Christopher Columbus to Kit Carson show the discovery and settlement of the West as a heroic undertaking. Many nineteenth-century artists and the public believed that these images represented a faithful account of civilization moving westward. A more recent approach argues that these images are carefully staged fiction and that their role was to justify the hardship and conflict of nation building. Western scenes extolled progress but rarely noted damaging social and environmental change.’

And yet it is too simplistic to see artists like Bierstadt and Moran as purveyors of propaganda for the westward expansion of the United States.

Of course, the images they created clearly had the effect of propaganda, inspiring many people to migrate West, but this was not necessarily a conscious decision on the part of the artists involved. As Elizabeth Kessler explains, these artists were primarily driven by a Romantic vision of ‘The Sublime’ – as advocated by Immanuel Kant and Edmund Burke – and the majestic landscapes of the 19th century American West provided the perfect opportunity to express this vision. The fact that their work suited the geopolitical agenda was, perhaps, more coincidental than causal.

Similarly, to suggest that the Hubble Heritage Project images were consciously deployed in a bid to repair NASA’s corporate reputation – by adopting the visual language of the art of Manifest Destiny – is possibly a little unfair, even though it was undeniably useful.

It is perhaps best to think that they had their minds on higher matters. Or, as Lord Darlington put it: ‘We are all in the gutter, but some of us are looking at the stars.’

 


The army of toy soldiers that did battle with the Duke of Wellington

According to popular British history, the Battle of Waterloo in 1815 was a glorious victory for both Britain and the Duke of Wellington.

In reality, however, victory at Waterloo had been achieved only with the decisive intervention of Prussian Army troops commanded by Field Marshal Gebhard Leberecht von Blücher.

Only a year earlier, a vast alliance of European kingdoms had, after enormous effort, defeated Napoleon Bonaparte and sent him off into exile. But he had escaped, and now, having gathered together his old army, was threatening to do his worst.

The great alliance had to be revived: Bonaparte was declared an outlaw, and armies were reassembled all over the continent. By late April, there was little doubt that the first battle of the new war would be fought on France’s northern border with Belgium.

And it was here that the French army under the command of Napoleon Bonaparte was defeated by two of the armies of the Seventh Coalition: an Anglo-led Allied army under the command of the Duke of Wellington, and a Prussian army under the command of Gebhard Leberecht von Blücher.

Information graphics, or ‘infographics’ as they are more often called, owe much of their popularity to the fact that they can make even the most complex subject accessible, by translating large amounts of complex data into more readily understandable visual media.

In the early 19th century, information graphics began to be used as a means of giving complex quantitative information a clear visual narrative. An early example of this approach is Charles Minard’s graphic representation of Napoleon Bonaparte’s disastrous march on Moscow in 1812.

In his “The Visual Display of Quantitative Information”, Edward Tufte declares that it “may well be the best statistical graphic ever drawn”, because, at a glance, the viewer can see the real, data-based story of the campaign.

Charles Minard’s graphic representation of Napoleon Bonaparte’s advance on, and retreat from Moscow in 1812

Because they make complex data readily accessible to the casual viewer, visual displays of quantitative information of this sort are also extraordinarily powerful storytelling devices.

And, because they are more readily understood than the data that they represent, these stories also tend to be believed more readily.

It was this simple fact that eventually led to the downfall of a contemporary of Charles Minard, an Englishman called William Siborne.

William Siborne had set about the task of creating a vast visual representation of the Duke of Wellington’s great victory over Napoleon at the Battle of Waterloo.

Unfortunately for Siborne, his visual representation directly questioned the Duke of Wellington’s – and the British Establishment’s – account of how the battle had been won… and indeed of who had actually won it.

William Siborne came upon his explosive discovery quite by accident.

In 1829, the British army proposed the creation of a United Services Museum and wanted a scale model of the Battle of Waterloo as its central exhibit. Lieutenant Siborne, a brilliant topographer and military surveyor, then aged 32, received the commission.

Siborne was a meticulous researcher. He spent eight months surveying every inch of the battlefield, and a further eight years identifying every single soldier’s position at every stage of the battle. He compared accounts from official dispatches and printed memoirs, and, by means of pre-printed questionnaires, conducted an unprecedented research programme with surviving veterans from all sides – English, French, and Prussian.

It was the first time in history that a battle had been so carefully researched, and at the end of the process, Siborne had an unrivalled understanding of the position of almost every soldier on the field of battle at every point in the day.

This almost god-like, universal knowledge of the battle, gathered from so many sources, was translated into the scale model that Siborne built.

He conceived this model on a mammoth scale: covering almost 400 square feet, it would perfectly depict every tree, road, and contour of the landscape. Some 75,000 tin models would represent the deployment of the various forces at the moment of the “Crisis of the Battle” at 7 pm on June 18, 1815 – when events turned decisively against the French Emperor.

By this point, the 68,000 troops with which the Duke of Wellington had started the battle were severely reduced, and his allies – 40,000 Prussians under Field Marshal Blücher – were staging their third major attack on the French positions in the village of Plancenoit.

At a time before aerial photography, (or indeed any type of photography at all) this diorama provided the viewer with an unprecedented vision of the battle – one that could be viewed from any angle.

After innumerable practical problems, the model eventually went on show in London, where it was greeted by rapturous reviews and visited by 100,000 people in its first year.

A detail of “The Large Model” 1838, Siborne’s 3D visualisation of the quantitative information that he had gathered from surviving eyewitnesses.

One person, however, failed to share the popular enthusiasm for the model. While Wellington had initially approved of the project as a monument to his military genius, he had become cooler, and eventually downright obstructive, as Siborne’s researches had progressed.

The key point at issue was the role of the Prussians in changing the course of the battle.

After comparing the records of the Prussian General Staff with those of Wellington’s own officers, Siborne had discovered serious inconsistencies in Wellington’s own account of the battle, which had become the official version of events.

Whilst Wellington had always insisted that the Prussians had arrived later in the day, when the battle was already won, Siborne could now prove, beyond any shadow of a doubt, that they had actually joined the battle several hours earlier than Wellington claimed, and consequently had played a far greater part in the victory than was credited to them by Wellington.

However, in questioning Wellington’s version of events, Siborne was not only challenging a pillar of the British Establishment, he was also questioning what had become a central element of national mythology: the conviction that Britain alone – and the genius of the Iron Duke – saved Europe from the tyranny of Napoleon.

Wellington responded by insisting that Siborne was “mistaken” and demanding that most of the Prussian troops displayed on the model be removed.

Siborne, knowing that he was correct in his assessment, now raised the stakes even further, publishing irrefutable evidence of the decisive Prussian contribution to the victory at Waterloo in two lavish octavo volumes, backed up by an accompanying folio volume of highly detailed maps.

A detail from the folio volume of maps that accompanied the two octavo volumes of Siborne’s “History of the War in France and Belgium in 1815”

Siborne paid heavily for this act of defiance.

Having sunk all of his personal wealth into the project, Siborne was shocked to learn that Wellington’s colleagues at the War Office now declined to purchase the model they had commissioned.

And a proposal for the display of the model at the newly opened National Gallery in Trafalgar Square was quietly shelved.

At the same time, whilst his own rather academic account of the war had sold relatively poorly, he had to watch as the information it contained was plagiarised by the Reverend George Gleig, a Wellington-backed rival, for a best-selling book about the campaign that shamelessly corroborated the official version.

Wellington had good reason to campaign so aggressively to discredit both Siborne’s model and his reliability as an historian.

In the early 19th century, Britain’s military reputation was still badly tarnished by its humiliating defeat in the Americas. And in 1815, Britain’s relations with Prussia were acutely strained by the suspicion that its ally was intent on territorial expansionism. As Peter Hofschröer points out in “The Duke, the Model Maker and the Secret of Waterloo”:

“had the Duke given the Prussians full credit for their role in the battle, it would probably have led to them making even louder demands for further territorial aggrandisement, upsetting the balance of power so laboriously established at the Congress of Vienna”.

So the Duke had little alternative but to seek to discredit Siborne, if Britain’s military reputation were not to be revealed as being founded on a falsehood.

The Duke won his battle in the short term. And Siborne died in poverty and obscurity in 1851.

However, for once, history was not to be written by the victor.

And most historians today would accept Siborne’s evidence that, far from being the sole victor at Waterloo, Wellington should, at the very least, share this honour with Blücher, without whose help he could quite possibly have been roundly defeated.

What’s more, after almost 200 years, Siborne’s Large Model of Waterloo is once again on public display – in the National Army Museum in London – ironically, right beside the Chelsea Hospital, where Siborne ended his days in poverty.

The 4,000 Prussian soldiers that Siborne had so carefully hand-painted are still missing. But the model remains an extraordinary testament to a time when an army of model soldiers struck fear into the hearts of the British Establishment.

And to the fact that, in a contest between two competing narratives, the one with the most persuasive visual representation – especially one based on solid quantitative data – will always be the one that prevails.


The truth and the detective


Human tribes are bound together by the stories they tell. And these stories reveal much about the hopes and desires, the fears and general preoccupations of those peoples to whom these stories belong.

In the West, for the past century or so, the most popular of these stories has been that of the detective and their search for the truth.

The first great detective of our time was, of course, Sherlock Holmes.

However, Sherlock Holmes is not just the most famous detective of all time, he also represents one of the most potent and compelling ideas in modern popular culture.

With his virtuoso use of empirical observation and deductive logic, he is in fact the physical embodiment of ‘The Scientific Method’.

The Scientific Method has dominated the pursuit of knowledge in the West since the dawn of the Enlightenment, and is based on a process of systematic observation, measurement and experiment, and the formulation, testing and constant modification of hypotheses.

And there is a curious symmetry to be found between popular regard for the promise that science offers and the evolution of detective fiction over the past hundred years or so.

Science and the end of mystery

An illustration from ‘Kunstformen der Natur’ (Art Forms of Nature) by Ernst Heinrich Haeckel (1834–1919), a scientist and follower of Darwin who discovered, described and named thousands of new species, and also created a grand genealogical tree relating all life forms to one another.

Middle- and upper-class Victorians seemed to be at the very pinnacle of human development.

Dominant as they were over a vast working-class majority at home, and over millions of “uncivilized” and “lesser” races abroad, it seemed that human history itself had been nothing if not an inevitable journey away from the dark origins of superstition, savagery and ignorance, towards the triumph of scientific rationalism that was the glory of contemporary Victorian society.

The entire surface of the planet had been mapped – as well as the entirety of the heavens – and its contents were now being duly itemized, classified and catalogued.

And Darwin’s theory of evolution – the very pinnacle of nineteenth-century scientific thought – seemed to support the inexorable advance of Western culture, legitimizing the great Victorian project of imperial domination over the ‘lesser races’ of the world.

It was thought that there was an underlying order to the universe which could be unlocked through mathematics and scientific enquiry. Indeed the triumph of science was such that it now seemed that everything in the universe was ultimately knowable, and comprehensible to the human mind, and that there was no mystery in the world that could not be penetrated by the persistent and systematic use of the scientific method.

Victorian science, however, was nothing if not practical, and the nineteenth century was also an age of unprecedented technological innovation.

Ernst Haeckel and fellow scientist von Miclucho-Maclay, 1866

London, at the start of the 20th century, was the largest city on the face of the planet and the capital of the greatest empire the world had ever seen. It is estimated that in this period alone, around 10,000,000 square miles of territory and roughly 400 million people were added to the British Empire. At its height the Empire incorporated a quarter of the Earth’s habitable lands, and contained a fifth of its people. And this vast enterprise was made possible by technological innovations like the telegraph, the steam turbine, and the Gatling gun.

A most remarkable invention

It was in Britain in the 1860s that an exciting new form of literature began to appear. Often referred to as the ‘sensation novel’, or the ‘novel with a secret’, these narratives featured a radical new storytelling technique, in which the author would deliberately withhold information from the reader, forcing them to use their own powers of observation, logic and deduction to solve a puzzle – a crime, shrouded in mystery, at the heart of the narrative.

These novels were called ‘sensational’ partly because of their content – usually a murder, or better still, a murder combined with a sexual transgression – which allowed the reader to experience the dark, mysterious, criminal underbelly of Victorian society from the safety of their favourite armchair.

Scientific rationalism had become a lens through which the darkest, most primitive, and savage actions of less civilized peoples could be viewed, and rectified, by polite society. Clue by clue, discovery by discovery, through the rigorous application of logic and scientific thinking, even the most sinister shadowy mystery could be penetrated and order restored.

In short, the modern detective story, in which the author shares the task of solving the crime with the reader, had been invented.

All detective stories are, by their very nature, about the unequal distribution of knowledge. Not just between the characters within the story, but also between the author and the reader.

Indeed it is only through the careful deployment of false clues, unreliable testimonies, and even barefaced lies that the author gradually allows the truth to emerge and the reader to ultimately share the clarity of their omniscient understanding of the narrative.

And as Patrick Brantlinger succinctly describes it in his essay ‘What Is “Sensational” About the “Sensation Novel”?’, from this point on ‘the forthright declarative statements of realistic fiction are, in a sense, now punctuated by question marks.’

Murder most modern

Despite the fact that it is often dismissed as a lesser form of literature, the detective story – or murder mystery as it is also called – has dominated the cultural landscape for over one hundred and fifty years.


Indeed, it is hard to overestimate the impact that this radical new narrative technique was to have on modern popular culture – paving the way, as it did, for the vast array of modern detective stories that surround us today: in books, of course, but also TV shows, movies, stage plays, children’s stories and even board games and computer games. From the child-friendly adventures of Nancy Drew or Enid Blyton’s mystery books, to the dark brooding landscapes of Scandinavian dramas like The Bridge or The Killing, from the reassuring idealised world of Midsomer Murders to the poisoned industrial wasteland of True Detective, and from the elegant and desirable Venetian lifestyle of police commissioner Guido Brunetti to the small town desperation of Ystad’s Inspector Kurt Wallander, the detective story is clearly the dominant dramatic narrative form of the modern age.

The most popular television drama series in the world is CSI: Crime Scene Investigation – the murder mystery franchise produced by the Hollywood producer Jerry Bruckheimer. CSI first appeared on TV screens back in 2000, and its audience has grown ever since; by 2009, its worldwide audience was estimated at more than seventy million viewers.

Agatha Christie’s And Then There Were None, with over a hundred million sales to date, and still climbing, is one of the best-selling books of all time and, according to Publications International, the 7th best-selling book in the world (outstripped only by such essential publications as The Bible, Quotations from Chairman Mao Zedong, The Qur’an, Xinhua Zidian (the Chinese dictionary), The Book of Mormon, and, of course, Harry Potter).

The longest running play in the world is the murder mystery The Mousetrap – also written by Agatha Christie – which first opened in the West End of London in 1952, and has been running continuously ever since, with its 25,000th performance taking place on 18 November 2012. A feat all the more remarkable given that the audience are asked not to reveal the solution to the mystery after leaving the theatre.

The longest running play in the world is London’s West End production of The Mousetrap by Agatha Christie.

Scientific certainty: the answer to life, the universe and everything

And yet, despite the fact that its legacy dominates so much of contemporary culture, the sensation novel was also very much a product of the certainties of its own time and cultural context.

The Victorian faith in scientific progress was strengthened by the assumption that everything in the universe was ultimately knowable, and that scientific progress would eventually lead to a perfect knowledge of nature’s fundamental physical laws. Indeed, for many late Victorians, the completion of this vast god-like understanding of the universe was only a matter of time.

In a speech at the University of Chicago in 1894, Albert A. Michelson, the first American physicist to win the Nobel Prize, declared that:

‘The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote…’

At around the same time, Lord Kelvin, the mathematical physicist, is reputed to have said that:

‘There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.’

And in 1900, the German mathematician David Hilbert, one of the most influential mathematicians of the 19th and early 20th centuries, put forward a list of the 23 remaining unsolved problems within mathematics at the International Congress of Mathematicians in Paris. According to Hilbert, once answered, these outstanding questions would complete our mathematical understanding of the universe. Indeed, by tidying up these last remaining anomalies, we would finally arrive at a universal theory of everything.

What could be simpler? The nature of reality itself, just like the murder mystery novel, appeared to be a puzzle which could be solved, clue by clue, discovery by discovery, through the rigorous application of logic and scientific thinking.

The murder mystery detective, just like his real-life counterpart the explorer-scientist, was to become the fictional hero of this great age of scientific discovery. And the greatest of these was none other than the most celebrated fictional detective of all time… Sherlock Holmes.

Sherlock Holmes: the embodiment of the scientific method

Every age has its hero: a figure who, more than any other, embodies the deepest yearnings and greatest aspirations of the time. And for the modern age that figure is none other than the detective.

Sherlock Holmes is not just the most famous detective of all time, but he also represents one of the most potent and compelling ideas in modern popular culture.

With his dazzling use of empirical observation and deductive logic he is the absolute physical embodiment of the ‘scientific method’.

‘Sherlock Holmes Experimenting’ by Sidney Paget, 1893.

The scientific method has characterized the pursuit of knowledge in the West for hundreds, if not thousands, of years, and consists of a process of systematic observation, measurement and experiment, and the formulation, testing and constant modification of theories.

The widespread adoption of the scientific method in the seventeenth century, with its pragmatic, evidence-based approach, stands in sharp contrast to the shadowy world of received wisdom, superstition and taboo that is supposed to have characterized most of pre-Enlightenment Europe.

Implicit within the idea of the scientific method is the rather optimistic belief that there is no phenomenon that cannot ultimately be understood… you simply have to carry out enough observations, measurements and experiments to reveal the truth.

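Seen this way, the method is essentially a loop. The toy sketch below (illustrative Python only: the linear ‘law’ being discovered, the data and the error threshold are all invented for the example) shows that cycle of observation, prediction and modification in miniature.

```python
# A minimal, illustrative sketch of the scientific method as a loop:
# observe, hypothesise, test, and modify the hypothesis when it fails.
# The 'law' being discovered (a simple linear relation) is a toy example.

observations = [(1, 2.1), (2, 4.0), (3, 6.2), (4, 7.9)]  # (x, measured y)

def hypothesis(x, slope):
    return slope * x  # the current theory: y = slope * x

slope = 1.0  # initial guess
for x, y in observations:             # systematic observation
    predicted = hypothesis(x, slope)  # prediction from the current theory
    error = y - predicted
    if abs(error) > 0.5:              # experiment contradicts the theory...
        slope += error / x * 0.5      # ...so the theory is modified
print(f"Refined theory: y ≈ {slope:.2f} * x")
```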

It is in just this way, then, that the famous “consulting detective” from 221B Baker Street is allowed to work away, unconstrained by tradition or petty social norms, his intellect working like a powerful torchlight that can penetrate the darkest secrets and deepest mysteries of a Victorian London permanently shrouded in fog and darkness.

And even though the last Holmes story appeared as late as 1927 – by which time, in reality, the streets of London were filled with motor cars rather than hansom cabs, and ablaze with electric lights rather than the faint flickering of gas lamps – our hero continued to inhabit his mysterious Victorian world of fogbound streets where, as Vincent Starrett puts it in The Private Life of Sherlock Holmes, ‘it is always 1895’.

Of course, the stories of Sherlock Holmes benefit enormously from the fact that it is ‘always 1895’. By creating a character that quite literally embodies the scientific method and placing this character in a setting that represents the complete opposite – a dark, fog-bound shadowy world of prejudice, ignorance and assumption – Conan Doyle created one of the most famous fictional characters the world has ever seen.

In 2002, Sherlock Holmes became the first fictional character to receive an Honorary Fellowship from the Royal Society of Chemistry, for ‘using science, courage and crystal clear thought processes to achieve his goals.’

A tale of two stories: the unique structure of detective fiction

The classic detective story has dominated the popular culture of the English-speaking world for more than a hundred and fifty years now. So is there some dark impulse at the heart of the modern age that creates such an insatiable thirst for savagery and bloodshed? Or is it more about the sense of the civilized world restoring order, clue by clue, revelation by revelation and the satisfaction, the reward, of that wonderful moment, at the end of these stories when all is illuminated, and we finally understand the truth?

The answer is probably a bit of both.

Murder mystery stories tend to share a specific, defining structural characteristic, and to understand this we need to digress for a moment into the world of literary theory. The terms ‘Fabula’ and ‘Sujet’ were originally coined in the 1920s by the Russian Formalist literary theorists Vladimir Propp and Viktor Shklovsky, to describe the difference between a story and its plot.

So, for example, if you were to take the movie Citizen Kane, which starts with the death of the main character, and then proceeds to tell his life story through a series of flashbacks that are intercut with a journalist’s investigation of Kane’s life in the present day, the ‘Fabula’ of the film is the story of Kane’s life – the way it happened in chronological order – while the ‘Sujet’ is the way that this story is actually told, reconstructed for greatest dramatic effect through flashbacks.

An even more extreme example of this separation of ‘Fabula’ from ‘Sujet’ is to be found in the structure of the film Memento, where the story is presented as two different sequences of scenes: a series in black-and-white that is shown chronologically, and a series of color sequences shown in reverse order. These two sequences “meet” at the end of the film, producing one common story.

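The distinction is concrete enough to express in a few lines of code. The sketch below (illustrative Python; the scene labels are invented) rebuilds a Memento-style ‘Sujet’ from a chronological ‘Fabula’ by running one strand forwards and one backwards, interleaving them until they meet.

```python
# A minimal sketch of the Fabula/Sujet distinction, using the Memento
# structure described above. The scene labels are purely illustrative.

fabula = ["A", "B", "C", "D", "E", "F"]  # events in chronological order

# Memento-style Sujet: one strand runs forwards (the black-and-white
# scenes), the other runs backwards (the colour scenes), and the two
# are interleaved so that they finally meet at the end of the film.
bw = fabula[:3]                # ["A", "B", "C"], shown chronologically
colour = fabula[3:][::-1]      # ["F", "E", "D"], shown in reverse

sujet = [scene for pair in zip(bw, colour) for scene in pair]
print(sujet)  # ['A', 'F', 'B', 'E', 'C', 'D'] – the strands meet at C/D
```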

In The Poetics of Prose, the philosopher and literary theorist Tzvetan Todorov explains how this division between ‘Fabula’ and ‘Sujet’ is uniquely exploited in the narrative structure of the murder mystery story, in that it too:

‘contains not one but two stories: the story of the crime and the story of the investigation. In their purest form, these two stories have no point in common . . .

The first story, that of the crime, ends before the second begins. But what happens to the second? Not much. The characters of the second story, the story of the investigation, do not act, they learn.

…The hundred and fifty pages which separate the discovery of the crime from the revelation of the killer are devoted to a slow apprenticeship: we examine clue after clue, lead after lead…

This second story, the story of the investigation, . . . is often told by a friend of the detective, who explicitly acknowledges that he is writing a book; the second story consists, in fact, in explaining how this very book came to be written . . .

The first — the story of the crime — tells ‘what really happened,’ whereas the second — the story of the investigation — explains ‘how the reader (or the narrator) has come to know about it.’

The first, that of the crime, is in fact the story of an absence: its characteristic is that it cannot be immediately present in the book…The status of the second story…(is) a story which has no importance in itself, which serves only as a mediator between the reader and the story of the crime…

We are concerned then in the whodunit with two stories of which one is absent but real, the other present but insignificant.’

As Todorov says, ‘The characters of the second story, the story of the investigation, do not act, they learn’, and of course the most important character in the story of the investigation, the one who does the ‘learning’, is the central figure of the murder mystery genre, the character through whom we the audience learn everything… the detective.

It is through the detective figure that we discover the central mystery – the puzzle that needs solving. And it is through the detective figure that we too speculate and create hypotheses as, bit by bit, parts of the solution to the mystery are gradually revealed.

Until that wonderful moment when all becomes clear, and, together with the detective, we experience, for the moment at least, a complete, god-like understanding of everything.

Significantly, the ‘Fabula’ and ‘Sujet’ of a story tend to have very different characteristics.

In classic murder mystery stories, the ‘Fabula’ is primarily about the crime, whereas the ‘Sujet’ is primarily about the investigation. So whilst the ‘Fabula’ tends to be about the destruction brought about by the forces of chaos, darkness and barbarism, the ‘Sujet’, on the other hand, tends to be about the restoration of order and the all-round benefits of civilization. And whilst the ‘Fabula’ is primarily about animal passions, such as love, jealousy, greed and fury, the ‘Sujet’ is primarily about detached observation, intellect and rational scientific thought.

This binary conflict between chaos and order, animal passions and rational scientific thought is also to be found at the very heart of Victorian culture in the second half of the nineteenth century… precisely at the time when the modern detective story first began to appear.

The origins of the scientific detective

The detective story as a distinct genre was, however, a product of Victorian culture, and only a tiny proportion of the detective fiction produced at the time is still available today – even less is still read for pleasure or studied by academics. There are therefore many theories about the precise genealogy of this form of narrative.

The prototype for the scientific detective – later made famous by Sherlock Holmes – was certainly the Chevalier Dupin, a character created by that master of tales of mystery and the macabre, Edgar Allan Poe. Dupin appears in three stories: The Murders in the Rue Morgue (1841); The Mystery of Marie Rogêt (1842); and The Purloined Letter (1845).

As T.J. Binyon puts it in “Detective in Fiction from Poe to the Present”:

“In Dupin, Poe created the prototype of the great detective, the eccentric genius with stupendous reasoning powers, whose brilliance is given added refulgence by the fact that he is always accompanied, and his investigative tours de force always set down, by a loyal admiring, but uncomprehending and imperceptive friend and assistant.”

Poe himself was aware of how innovative these stories were – “These tales of ratiocination,” he explained in 1846, “owe most of their popularity to being something in a new key” – and even in outline, the modern reader will recognize many of the features of the classic detective story. In The Murders in the Rue Morgue, Dupin comes across, in the newspaper, the case of the gruesome murder of Madame L’Espanaye and her daughter in their apparently locked lodgings in the Rue Morgue. Using his powers of logical deduction to unravel the seemingly insoluble mystery – methodically sifting through the various accounts and considering all the possible hypotheses – he exposes the narrow-mindedness of the local prefect of police. Poe had given the form its initial shape, and created its first great detective, complete with companion/narrator.

It was almost half a century later, in 1887, with a short novel called A Study in Scarlet, that Sir Arthur Conan Doyle introduced his own version of this intrepid pair – the eccentric genius with extraordinary deductive powers, accompanied by his trusty, but rather imperceptive, assistant – in the shape of Sherlock Holmes and Dr. John H. Watson. The pair were, of course, to become perhaps the most famous duo in literary history, going on to feature in four full-length novels and 56 short stories, all but four of which are narrated by Holmes’s assistant, Dr. Watson.


However, the author responsible for creating so many of the elements that would later become the model for the classic detective stories of the early twentieth century was a writer of ‘sensation’ novels called Wilkie Collins, with the publication in 1868 of what many consider the first, and greatest, example of the genre: The Moonstone.

The renowned crime writer Dorothy L. Sayers once described The Moonstone as ‘probably the very finest detective story ever written’, whilst the distinguished modernist poet and literary critic T.S. Eliot called it “the first, the longest, and the best of modern English detective novels in a genre invented by Collins and not by Poe.”

Whoever was ultimately responsible for creating this new literary genre – and in reality there were probably many writers who should take the credit for their own individual contributions – it was the publication of The Moonstone in 1868 that clearly laid down the principles for all the ‘Golden Age’ murder mystery stories that were to follow. These principles can be summarized in the following manner:

  1. The story must form a puzzle, the solution to which is only revealed at the very end.
  2. The plot must be driven by an investigator with unusual forensic and deductive skills who is seeking to establish ‘the truth’.
  3. This investigator is placed in direct contrast to the conventional police, who are largely incompetent.
  4. The reader must discover clues only as the investigator does.
  5. There should be a large array of false suspects, and false clues, to confuse and mislead the reader.
  6. The guilty party should always be the last person you might suspect.
  7. The crime to be solved should be a ‘locked room mystery’, in which a murder has been committed under apparently impossible circumstances.

The Golden Age of Detective Fiction

In the early part of the twentieth century, this genre exploded into mainstream popularity – especially amongst the British middle classes. According to Carole Kismaric and Marvin Heiferman in The Mysterious Case of Nancy Drew & The Hardy Boys:

‘The golden age of detective fiction began with high-class amateur detectives sniffing out murderers lurking in rose gardens, down country lanes, and in picturesque villages. Many conventions of the detective-fiction genre evolved in this era, as numerous writers — from populist entertainers to respected poets — tried their hands at mystery stories.’

And Professor William D. Rubinstein, in his essay A Very British Crime Wave: How Detective Stories Captured the Imaginations of the British Middle Classes in the 20th Century, gives some sense of the scale of this phenomenon:

‘Between around 1910 and 1950 Britain was in the grip of a genteel crime wave; a seemingly endless output of murder mysteries… Such works formed a major component of middle-class culture in Britain at the time: for every person who read T.S. Eliot, D.H. Lawrence, or Virginia Woolf, probably 50 more read Agatha Christie and double that number Conan Doyle.’


It was during this time that some of the greatest works of the English murder mystery genre were created. The Grande Dame of the genre, Agatha Christie, was publishing classics such as Murder on the Orient Express (1934), Death on the Nile (1937), and And Then There Were None (1939), as well as introducing her two most famous fictional detectives, Hercule Poirot and Miss Marple. At the same time, Dorothy L. Sayers was busy creating her archetypal British gentleman detective, Lord Peter Wimsey, in equally classic murder mysteries like Murder Must Advertise (1933), The Nine Tailors (1934) and Gaudy Night (1935).

But whilst this was a time of great creativity, it was also the period during which the conventions of the genre came to be set in stone, and in 1929 the British clergyman and amateur detective storywriter Ronald Knox set out his ‘Decalogue’ of rules for detective fiction.

These rules, which ensure that a detective story “must have as its main interest the unraveling of a mystery; a mystery whose elements are clearly presented to the reader at an early stage in the proceedings, and whose nature is such as to arouse curiosity, a curiosity which is gratified at the end”, were perhaps written with slightly more of a wry smile than is generally thought – Father Knox was also known for his love of pranks and practical jokes – but they are rules to which most writers within the genre adhere.

Father Knox’s Decalogue: The Ten Rules of Detective Fiction

  1. The criminal must be someone mentioned in the early part of the story, but must not be anyone whose thoughts the reader has been allowed to follow.
  2. All supernatural or preternatural agencies are ruled out as a matter of course.
  3. Not more than one secret room or passage is allowable.
  4. No hitherto undiscovered poisons may be used, nor any appliance which will need a long scientific explanation at the end.
  5. No Chinaman must figure in the story.
  6. No accident must ever help the detective, nor must he ever have an unaccountable intuition which proves to be right.
  7. The detective must not himself commit the crime.
  8. The detective must not light on any clues which are not instantly produced for the inspection of the reader.
  9. The stupid friend of the detective, the Watson, must not conceal any thoughts which pass through his mind; his intelligence must be slightly, but very slightly, below that of the average reader.
  10. Twin brothers, and doubles generally, must not appear unless we have been duly prepared for them.

In her wonderfully insightful and informative Twentieth-Century Crime Fiction, Lee Horsley explains why so unashamedly artificial a genre has had such enduring popular appeal:

‘One answer might be that this reassuring object (a well-known kind of text) is also an invitation to playful readers to participate, challenging them to put a fictional world in order by the act of being, simply, a ‘good reader’. Such a person will judge the writer as ‘good’ partly because he or she manages to delay an appreciative audience’s recognition of the ‘true’ narrative. Seen in this way, the classic detective story combines the comforting familiarity of a repeated pattern with the surprising turns of a well-played game.’

Lee Horsley expands upon this idea by comparing the rise in popularity of golden age crime fiction with the increasing popularity of the cryptic crossword:

‘In the inter-war period, the flourishing of golden age crime fiction, epitomized by Christie and Sayers, coincided with the emergence of another favourite British game, the cryptic crossword. The cryptic crossword, it should be emphasized, differs from the variety of word puzzle which is solved by filling in famous names, general knowledge, or words that fit dictionary definitions. Rather, it consists of complicated wordplay (puns, anagrams, etc.), with completion of the puzzle involving a battle of wits between clue-setter and solver. Although the early cryptic crossword was somewhat anarchic, it soon became established that all good setters must abide by the fundamental principle of fair play. One of the best-known setters, Afrit, offered the dictum, ‘I need not mean what I say, but I must say what I mean.’ This would, I think, serve equally for any good detective storywriter. Both games mirror the nature of civilized discourse in their careful ironies, their nuances and clever evasions, and their attentiveness to the exact meanings of words (a particular skill, for example, of Christie’s Hercule Poirot). The correct answer should be accessible to the solver, but must be cleverly hidden, in such a manner that, once enlightened, he or she will ‘see that the solution had, in a sense, been staring him in the face’.’

Colonel Mustard and the Murder at Tudor Hall

In the 1930s, the game-like quality of English Golden Age detective fiction had inspired a craze for ‘Murder Mystery’ games, played out in hotels across the length and breadth of the country. These games would involve both actors and hotel guests playing the parts of characters in a murder mystery drama – one that centred around the ‘murder’ of one of the guests. The hotel, with its large number of sprawling rooms, took on the role of the country mansion, and when the ‘body’ was found, each of the guests would fall under suspicion. By piecing together the clues provided, the hotel guests would then have to solve the mystery over the course of the evening.

Anthony E. Pratt was a musician who made a living playing piano in the country hotels where these murder games were staged. Pratt himself was a huge fan of murder mystery stories – and in particular of Agatha Christie novels like The Body in the Library – and as he watched these ‘Murder Mystery’ games being played out in front of him night after night, he began to have a rather brilliant idea…

Anthony and Elva Pratt in their back garden in the 1940s, around the time they were designing the game of Cluedo.

Anthony Pratt realized that he could translate these murder mystery games into a board game, and by 1943, with some help from his wife, Elva, he had designed it. The game was called “Murder”, and Elva designed the artwork for the board. The object of the game was for each player to move around the board – which featured the floor plan of an English country house, known as Tudor Hall – in the guise of one of the game’s six characters, collecting clues until they were able to announce who had committed the murder, in which room, and with which weapon. So, for example, the winner might utter the immortal words: ‘I suggest it was Colonel Mustard, in the Library, with the Lead Pipe.’
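
The deductive heart of the game is simple elimination. The sketch below (illustrative Python, using a reduced set of the classic card names; the helper logic is invented for the example) boils it down: every clue removes a card from suspicion, and the accusation becomes possible only when one suspect, one weapon and one room remain.

```python
# A minimal sketch of the elimination logic at the heart of Cluedo.
# Card names follow the classic set; everything else is illustrative.
import random

suspects = ["Colonel Mustard", "Miss Scarlett", "Professor Plum"]
weapons = ["Lead Pipe", "Rope", "Revolver"]
rooms = ["Library", "Kitchen", "Study"]

# One card of each type is hidden in the envelope: the solution.
solution = (random.choice(suspects), random.choice(weapons), random.choice(rooms))

seen = set()  # every clue a player collects eliminates a card from suspicion

def accuse():
    """In this sketch, accuse only when one candidate of each type remains."""
    s = [c for c in suspects if c not in seen]
    w = [c for c in weapons if c not in seen]
    r = [c for c in rooms if c not in seen]
    if len(s) == len(w) == len(r) == 1:
        return f"I suggest it was {s[0]}, in the {r[0]}, with the {w[0]}."
    return None

# Collect clues (every card that is *not* in the envelope) until certain.
for card in suspects + weapons + rooms:
    if card not in solution:
        seen.add(card)
print(accuse())
```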

Anthony Pratt filed his original patent application on 1 December 1944, and in February 1945 showed the game to Waddington’s, the largest British board games manufacturer of the time. Waddington’s immediately saw the potential and, with a few very minor modifications (it was Waddington’s who renamed the game Cluedo – a combination of “Clue” and “Ludo”, the Latin for “I play”), decided to go ahead and manufacture the game.

Of course, Cluedo – or Clue as it became known in the United States – went on to become a massive worldwide success. It is fitting, therefore, that one of the most enduring expressions of the English Golden Age of detective fiction is not a book, but a board game, and one with a very specific social setting. Because – in the same way that for Sherlock Holmes it is ‘always 1895’ – for players of Cluedo, it will always be 1926 at a country mansion somewhere in Hampshire.

The Cluedo board, featuring the floor plan of the English country house Tudor Hall, designed by Elva Pratt in 1943.

The end of the idea of ‘Human Progress’

Over the course of its first hundred years in Britain, detective fiction had developed many conventions that were in danger of fossilising this otherwise vibrant new art form and lending the genre a somewhat artificial quality.

This was all the more noticeable given the social and cultural changes that had taken place in Britain in the meantime.

During the intervening years, the old Victorian attitudes to progress had begun to change. For many writers, artists and poets it had been the horrors of the First World War that gave the lie to the simplistic nature of the late Victorian worldview. For others it was the increasing familiarity with other cultures and worldviews that occurred as Imperial power declined and new forms of media proliferated.


Whatever the precise cause, we can see that some kind of tipping point in the understanding of the nature of progress was reached in 1931, when the British historian Herbert Butterfield published a short but highly influential book called The Whig Interpretation of History. The title referred to the eighteenth-century conflict between ‘Whigs’ and ‘Tories’, in which the ‘Whigs’ – who were political liberals – believed in the concept of progress, and the ‘Tories’ – who were political conservatives – distrusted anything new and clung to the institutions of the past.

Butterfield’s little book demonstrated the major flaw in the Victorian representation of history: imagining, as it did, the past as an inevitable progression towards ever greater liberty and enlightenment, culminating in the finished forms of liberal democracy and constitutional monarchy that we have today. Butterfield’s thinking was rapidly and widely adopted in academic circles. Indeed, after this, anyone offering a view of the world based on the Victorian notion of progress was in danger of being dismissed as ‘Whiggish’ in their approach.

And yet, Whiggish interpretations of history continued to influence the popular imagination, and to this very day can be found throughout popular culture in the form of films, television documentaries, and even history textbooks. This can be most dramatically seen in the story of the famous scientific illustration, commonly referred to as The March of Progress.

The March of Progress is not only one of the most famous scientific illustrations ever created, it is also one of the most misunderstood.

The illustration was originally commissioned for Time-Life Books in 1965 by the anthropologist F. Clark Howell, and painted by the natural history painter Rudolph Zallinger. It was designed as a visual summary of 25 million years of human evolution, lining up a series of figures from our evolutionary history marching in line, from left to right.

It was never the authors’ intention to imply a linear ancestor-descendant parade, but as its popularity grew, the image became known as The March of Progress, reinforcing, as it did so, the old discredited Victorian idea of progress: that early human evolution had developed in a linear, sequential fashion along a predetermined path towards our current ‘finished’ form as human beings.

Shocked by the scale of the popular misreading of the image, Howell later remarked that ‘the artist didn’t intend to reduce the evolution of man to a linear sequence, but it was read that way by viewers… The graphic overwhelmed the text. It was so powerful and emotional’.

The end of the dream of human omniscience

However, it wasn’t just in the realm of popular culture that the Victorian concept of progress persisted. Watching these Victorian ideas – the idea of progress, and the dream of an all-embracing ‘Theory of Everything’ – gradually evaporate during the twentieth century is a bit like watching the tide go out on a very shallow shelving beach: in some areas they disappeared very quickly, whilst in others they lingered in large but increasingly isolated pools.

Ironically perhaps, the last remaining areas left behind by the receding tide were in the field of science. Twentieth-century science needed the idea of progress, since the idea that science is a body of knowledge passed down from one generation to the next was, for many scientists, a fundamental article of faith… a prerequisite to the idea of science itself.

But as science developed in the twentieth century, it began to become clear that far from steadily unraveling the nature of the universe, many of these new discoveries were, in fact, simply opening up more questions.

As late as 1930, David Hilbert (the mathematician who had issued the original challenge to solve the 23 remaining unsolved problems of mathematics at the International Congress of Mathematicians in Paris in 1900) was still attacking the idea that there were any limits to scientific knowledge – a position exemplified by the phrase ignoramus et ignorabimus, meaning “we do not know and will not know”. In a celebrated address to the Society of German Scientists and Physicians in Königsberg, he once again stated:

We must not believe those, who today, with philosophical bearing and deliberative tone, prophesy the fall of culture and accept the ignorabimus. For us there is no ignorabimus, and in my opinion none whatever in natural science. In opposition to the foolish ignorabimus our slogan shall be: Wir müssen wissen — wir werden wissen! (‘We must know — we will know!’)

Less than a year later, in 1931, a 25-year-old mathematician called Kurt Gödel, who had actually been at this lecture, demonstrated that Hilbert’s ambitious grand plan to tidy up the remaining questions and anomalies within mathematics was, in fact, impossible. In what came to be known as Gödel’s Incompleteness Theorems, the young mathematician showed that any formal system rich enough to contain arithmetic could be either consistent or complete – but never both.
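
In modern textbook form, the result is usually stated something like the following (a standard paraphrase set in LaTeX, not Gödel’s original 1931 wording; the theorem environment and the amssymb package are assumed to be set up in the preamble).

```latex
% A standard modern paraphrase of Gödel's theorems (not his 1931 wording).
% Assumes \usepackage{amsmath,amssymb,amsthm} and \newtheorem{theorem}{Theorem}.
\begin{theorem}[First Incompleteness Theorem]
Let $F$ be a consistent, effectively axiomatised formal system strong
enough to express elementary arithmetic. Then there is a sentence $G_F$
in the language of $F$ such that
\[
  F \nvdash G_F \qquad\text{and}\qquad F \nvdash \lnot G_F ,
\]
so $F$ is incomplete.
\end{theorem}

\begin{theorem}[Second Incompleteness Theorem]
For any such $F$, if $F$ is consistent, then
\[
  F \nvdash \mathrm{Con}(F) ,
\]
that is, $F$ cannot prove its own consistency.
\end{theorem}
```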

Hilbert is said never to have acknowledged Gödel’s work. The words ‘Wir müssen wissen. Wir werden wissen’ are the only words on Hilbert’s gravestone.

Hilbert’s grave, with the simple words: Wir müssen wissen. Wir werden wissen.

But perhaps the greatest barrier to the discovery of a grand, all-embracing ‘Theory of Everything’ was the fact that the two great achievements of twentieth-century physics – the Theory of General Relativity and the Theory of Quantum Mechanics – had been shown to be incompatible.

After years of research and experimentation, physicists in the 1950’s had confirmed virtually every prediction made by the theory of the very large – the Theory of General Relativity, which focuses on how gravity affects the way the universe behaves in terms of large-scale, high-mass objects like stars and galaxies – and by the theory of the very small – the Theory of Quantum Mechanics, which focuses on the way the universe behaves in terms of objects of both small scale and low mass: sub-atomic particles, atoms, molecules and so on – within their own domains.

The only problem was that these physicists had also shown that General Relativity and Quantum Mechanics, as currently formulated, are mutually incompatible. In short, two of the greatest scientific breakthroughs of the twentieth century cannot both be right.

One answer to these seemingly intractable questions came in 1962, with the publication of The Structure of Scientific Revolutions by the physicist and historian of science Thomas Kuhn.

The Structure of Scientific Revolutions changed the way that many scientists think about science, and triggered an ongoing reassessment of what scientific progress really means… the effects of which are still being felt to this very day.

According to The Stanford Encyclopedia of Philosophy:

Thomas Samuel Kuhn (1922–1996) is one of the most influential philosophers of science of the twentieth century, perhaps the most influential. His 1962 book The Structure of Scientific Revolutions is one of the most cited academic books of all time. Kuhn’s contribution to the philosophy of science marked not only a break with several key positivist doctrines, but also inaugurated a new style of philosophy of science that brought it closer to the history of science.

Even though the majority of people have never heard of The Structure of Scientific Revolutions – or its author – their thinking has still been profoundly influenced by his ideas. Indeed, the term ‘paradigm shift’, first coined by Kuhn to define one of the central ideas of this ground-breaking work, has become one of the most used, and abused, phrases in modern English.

Kuhn’s great achievement was, at a stroke, to change the way we think about mankind’s attempt to understand the world through science.

As The Stanford Encyclopedia of Philosophy puts it, before Kuhn, our view of science had been dominated by a narrative of scientific progress as ‘the addition of new truths to the stock of old truths, or the increasing approximation of theories to the truth, and in the odd case, the correction of past errors’. In other words, we had seen science as providing an inevitable and heroic progression towards the ultimate “truth”: a progression in which each successive generation of scientists built on the discoveries and knowledge of its predecessors, ‘standing on the shoulders’ of previous generations.

However, rather than science being this steady, cumulative “progress”, Kuhn saw the history of science as a series of revolutions within which conflicting paradigms overthrew one another. A paradigm is never overthrown until a replacement paradigm is waiting in the wings, and crucially this new paradigm is not necessarily any more ‘truthful’ than the one that it replaces.

According to Kuhn, one of the aims of science is to find models that will account for as many observations as possible within a coherent framework. So, for example, taken together, Galileo’s re-evaluation of the nature of motion and Keplerian cosmology represented a coherent framework that was capable of rivaling and replacing the Aristotelian/Ptolemaic framework.

Once a paradigm shift like this has taken place, the textbooks are rewritten. (And often, at this stage, the history of science is also rewritten, and presented as an inevitable process leading up to the newly established framework of thought). At this point in the establishment of a paradigm, there is a widely held belief that all hitherto-unexplained phenomena will, in due course, be accounted for in terms of this newly established framework of knowledge.

Controversially, Kuhn suggested that those scientists, who choose to operate within an established paradigm, spend their lives in the process of mere ‘puzzle-solving’, since the initial successes created by the established paradigm tend to generate the belief that the paradigm both predicts and guarantees that solutions to these puzzles exist.

Kuhn calls this ‘puzzle-solving’ process ‘Normal Science’. However, this ‘Normal Science’ begins to have problems as the paradigm is stretched to its limits, and anomalies — i.e. failures of the current paradigm to take into account newly observed phenomena — begin to accumulate. Some of these anomalies may be dismissed as errors in observation, whilst others may require no more than a few minor adjustments to the prevailing paradigm. But no matter how many anomalies are found, the scientific community as a whole will not lose faith in the established paradigm until a credible alternative is available.

And yet, Kuhn maintained that in any community of scientists there would always be some individuals who would embrace these anomalies, and who would begin to practice what he calls ‘Revolutionary Science’… exploring alternatives to the long-held assumptions of the prevailing paradigm. Occasionally this ‘Revolutionary Science’ will create a new paradigm, a rival to the established framework of thought. And in time, if the majority of the scientific community adopts this challenger paradigm, it will completely replace the old paradigm, and a ‘paradigm shift’ will have occurred.

In this way, Kuhn argued that competing paradigms are “incommensurable”: that is to say, there exists no objective way of assessing their relative merits.

In other words, there is no one single, objective ‘Theory of Everything’ waiting to be discovered by modern science: merely the process of ‘puzzle solving’ within the prevailing paradigm – the current shared belief system of the cultural community of scientists – until a new ‘paradigm shift’ takes place.

Obviously many scientists were, and still are, scandalized by the suggestion that modern science is a culturally constructed narrative, rather than the progression towards some kind of universal truth.

But perhaps more importantly, even though the majority of people have never heard of either Thomas S. Kuhn or The Structure of Scientific Revolutions, we have all unconsciously adopted his thinking… including the ideas of ‘paradigms’ and ‘paradigm shifts’. And, unlike our Victorian forebears, we are now willing to believe that reality can be viewed from a number of cultural perspectives, each of which is equally valid.

The American Detective

British Golden Age crime fiction is often referred to as ‘Cozy’ crime fiction, since it tends to be set in an idealized version of middle or upper class England… a reassuringly ordered world that – although temporarily disturbed by the nuisance of an unsolved crime – is always restored to its natural state of peace and harmony in the end.

The act of solving the crime in a ‘Cozy’ is therefore doubly satisfying, since it represents both an intellectual accomplishment and an act which restores order and balance to the world. Furthermore, it is one in which the reader is an active participant in the ‘sujet’ of the narrative – the scientific search for the truth – and a detached observer of the ‘fabula’ – the dark story of the crime itself.

The ‘Cozy’ was the product of a different age, an age of scientific and social certainties. As the twentieth century developed, through world war and economic depression, many of these certainties began to unravel, and this form of murder mystery began to look increasingly anachronistic and unrealistic.

And nowhere in the world did Golden Age British detective fiction look more artificial and anachronistic than in the United States during the Great Depression.


Taking the essential structure of the detective story, writers like Dashiell Hammett and Raymond Chandler were to fashion a new, more contemporary kind of fiction that came to be known as ‘Hardboiled’.

‘Hardboiled’ detective fiction developed directly out of the American world of pulp detective magazines like True Detective, the pioneering American crime magazine that specialized in dramatizing real-life crime stories.

These pulp detective magazines had reached the peak of their popularity in the 1920s and 1930s, at a time when Prohibition was turning ordinary citizens into criminals and ordinary criminals into celebrities. At this time, magazines like True Detective had become so popular – some would sell up to one million copies per issue – that real-life cops and robbers vied to see themselves on their pages. Even FBI boss J. Edgar Hoover himself found time to write regularly for the pulps.

The ‘Hardboiled’ world lacks the comforting certainty of the British ‘Cozy’, embedded as it is in the reality of crime and violence. And in contrast to the ‘Cozy’ tradition – where deeds of a sexual and violent nature often feature as part of the fabric of the ‘fabula’, but rarely as part of the ‘sujet’ – the ‘hardboiled’ world is hostile, dangerous and morally ambiguous in both the fabula and the sujet. Furthermore, unlike their “Cozy” British counterparts, these detectives solve mysteries by moving freely within the world of those who commit the crimes, and not just by observing them scientifically from a distance.

The first and, perhaps, the most famous example of the ‘hardboiled’ detective is a character created by Dashiell Hammett called Sam Spade. As Hammett himself later described him:

Spade has no original. He is a dream man in the sense that he is what most of the private detectives I worked with would like to have been and in their cockier moments thought they approached. For your private detective does not — or did not ten years ago when he was my colleague — want to be an erudite solver of riddles in the Sherlock Holmes manner; he wants to be a hard and shifty fellow, able to take care of himself in any situation, able to get the best of anybody he comes in contact with, whether criminal, innocent by-stander or client.

A composite of many of Hammett’s previous detectives, Sam Spade was to become the prototype for a vast number of cynical, world-weary hard-boiled detectives. His is a fictional character that casts a very long shadow. His influence can be felt in characters as diverse as the retired police officer, Rick Deckard, in the movie Blade Runner, Raymond Chandler’s Philip Marlowe, Henning Mankell’s Kurt Wallander or private investigator J.J. “Jake” Gittes in the movie Chinatown.


Although Spade first appeared in 1930, in a story called The Maltese Falcon, serialized in a pulp magazine called The Black Mask, it was not the American pulp magazines that were destined to make the hard-boiled detective famous. It was the movies.

And although the 1931 movie version of The Maltese Falcon had been a modest commercial and critical success, it was the great 1941 remake – directed by a young first-timer called John Huston, and featuring a then little-known 42-year-old actor called Humphrey Bogart as Sam Spade – that created the archetype of the modern hard-boiled detective, in a new, highly expressive film style that was to become known as ‘Film Noir’. Indeed, John Huston’s Maltese Falcon is widely regarded as the greatest detective movie of all time.

Into the darkness with Film Noir

Ever since the Wall Street Crash in 1929, America had been in the grip of the Great Depression. But for the big five Hollywood studios (MGM, Paramount, Fox, RKO, and Warner Bros.), the Great Depression was a veritable boom time. Going to the movies was one way for people to escape from it all – at least for a while – and by 1939 the number of movie theaters in the United States had grown to over fifteen thousand.

The 1930s had seen amazing technical advances in both Technicolor and sound, as evidenced by epic movies like The Wizard of Oz and Gone with the Wind. But these films were unbelievably expensive to make, because the color technology they employed was still in its infancy, and the three-strip color process used in their production required massive amounts of lighting and time.

To maximize their investment in these expensive blockbuster spectacles, the studios used a system called “Block Booking”. This meant that for cinemas to get the rights to show the big A-list movies, they would have to buy blocks of films which included an assortment of less desirable B-list movies. At the height of the studio era, these blocks could include up to a hundred films a year, purchased blindly, in advance, by the theaters – often before the films had even gone into production.

Because of this need for large volumes of low-cost B-list movies, there was a massive demand for stories, and many of these were found in the pulp fiction of the time, which featured westerns, sci-fi, horror stories and, of course, the new ‘Hardboiled’ detective stories.

Because these were low-budget movies whose financial success was relatively assured, a certain amount of experimentation was allowed in how their stories were told. Directors like the German immigrant Fritz Lang – who had been at the forefront of German Expressionist cinema, with its highly stylized set design, unusual camera angles and dramatic lighting – were quick to seize the opportunity to create movies in a more expressive style… a style that came to be known as ‘film noir’.

The primary literary influence on film noir was the Hardboiled school of writers such as Dashiell Hammett and James M. Cain – both of whom had written for The Black Mask. The classic film noirs The Maltese Falcon and The Glass Key (1942) were based on novels by Hammett, whilst Cain’s novels provided the basis for Double Indemnity (1944), Mildred Pierce (1945), The Postman Always Rings Twice (1946), and Slightly Scarlet (1956). However, it was Raymond Chandler – another Black Mask writer – who soon became the most famous author of the hardboiled school. Not only were Chandler’s novels turned into major noirs – Murder, My Sweet (1944), The Big Sleep (1946), and Lady in the Lake (1947) – but Chandler was also to become an important screenwriter in the genre, producing the scripts for Double Indemnity, The Blue Dahlia (1946), and Strangers on a Train (1951).


But it is the visual expression of ‘Noir’ movies that is perhaps their most striking feature. It was the French film critic Nino Frank who first coined the phrase ‘film noir’, in 1946. And, as the name suggests, this is a form of cinema where darkness dominates the screen.

And within these enormous mysterious shadow areas we sense the realm of the unknown. Because in film noir, it is not what you can see, but what you cannot see, that sets the tone of the drama.

It is this deep and unmistakably modern truth that the American painter Edward Hopper articulates so beautifully in Nighthawks, a painting that was heavily influenced by film lighting; indeed, it could very easily be a still from just such a movie. In turn, Nighthawks has gone on to be used as a reference for the lighting design of countless film noir movies.

ABOVE: “Study for Nighthawks” by Edward Hopper, 1941 or 1942, fabricated chalk and charcoal on paper, 28.3 × 38.1 cm (11 1/8 × 15 in.). BELOW: “Nighthawks” by Edward Hopper, 1942, oil on canvas, 84.1 cm × 152.4 cm (33 1⁄8 in × 60 in).

Much has been made of the relationships between the three characters seated at the bar. Earlier in his career, Hopper had earned his living creating cover art for pulp detective magazines, and in the process he had taught himself to condense the suggestion of a large, complex narrative into a single image. According to his biographer, Gail Levin, the painting was inspired by Ernest Hemingway’s story The Killers, in which two hit men arrive at a diner to murder a burnt-out prizefighter for some undisclosed offence. And yet we do not really need any backstory to read this painting: whatever the relationship between the characters inside the diner, the real drama lies in the relationship between the light of the interior and the darkness of the exterior. Indeed, the darkness outside the diner is the real point of Nighthawks.

Inside it is safe, and there is certainty. But step outside, and nothing is certain… for who knows what danger lurks in those shadows? And we know it is not what you can see, but the sheer enormity of what you cannot see, that will ultimately decide the fate of the characters taking temporary refuge in the illuminated interior.

Forget it, Jake. It’s Chinatown.

It has been said that the detective novel ‘brings light into dark places, and, in doing so, for a brief period at any rate, it washes the world clean’. And yet, with the advent of American film noir, it is clear that we have journeyed a great distance from the ‘Cozy’ world of Lord Peter Wimsey and Hercule Poirot – from stories set in an idealized upper-class England… a reassuringly ordered world that, although temporarily discommoded by the presence of an unsolved crime, is always restored to its natural state of peace and harmony in the end.

And this is partly because the world had changed. We no longer believe in one ‘authorised’ version of the truth, we are less inclined to believe that we can know everything, and we no longer have a simple blind faith in the Victorian notion of progress.

Indeed, the ‘Hardboiled’ detective is like one of Thomas S. Kuhn’s renegade scientists who, finding anomalies in the prevailing paradigm, is forced to construct his own belief system outside of conventional society.

And although the essential structure of the ‘Sensation Novel’ is still present – in terms of the audience going on a journey of discovery with the detective – the tone and manner have changed considerably. Here sex and violence feature as part of the fabric of the ‘fabula’ as well as the ‘sujet’; indeed, the ‘Hardboiled’ protagonist is often as dangerous and morally ambiguous as the world in which he now thoroughly immerses himself. And perhaps more importantly, when the mystery is revealed and the crime solved, order is no longer necessarily restored, and evil is not necessarily removed from the world.

Nowhere is this more true than with Roman Polanski’s 1974 movie, Chinatown.

With the exception of John Huston’s The Maltese Falcon, Roman Polanski’s Chinatown (1974) is often considered to be the greatest detective movie ever made. The original screenplay was written by Robert Towne in the style of the classic 1930s and 1940s film noirs of Dashiell Hammett and Raymond Chandler, and features a protagonist clearly based on the likes of Sam Spade and Philip Marlowe… a private eye called J.J. “Jake” Gittes, a part that Towne had written specifically for the actor Jack Nicholson.

Robert Towne’s screenplay for the film has become a model for other writers and filmmakers, and is often cited as one of the finest examples of the craft. However, it was the director, Roman Polanski, who decided on the film’s fatal final scene, overruling Towne’s idea of a happy ending and thus transforming what might have been a good period detective story into one of the greatest movies of all time. “I knew that if Chinatown was to be special,” Polanski later said, “not just another thriller where the good guys triumph in the final reel, Evelyn had to die.”

Towne had worked on many high-profile movies, such as Bonnie and Clyde (1967), The Godfather (1972) and The Last Detail (1973). In 1971, the producer Robert Evans had offered him $175,000 to write a screenplay for The Great Gatsby (1974), but Towne felt he could not improve on the F. Scott Fitzgerald novel and instead asked Evans for a mere $25,000 to write the screenplay for his own, original story: Chinatown.

“Forget it, Jake. It’s Chinatown.” Jack Nicholson as J.J. “Jake” Gittes in the movie Chinatown.

Set in 1937, Chinatown describes the manipulation of a critical natural resource – water – by a shadowy cadre of city oligarchs. It was to be the first part of a planned trilogy featuring J.J. Gittes as he investigates the suppression of public interest by private greed through the manipulation of natural resources (the second and third parts were to deal with the oligarchs’ appropriation of oil and land). And although the story is set in 1937, Chinatown is based on real-life events in Los Angeles that became known as the Owens River Valley Scandal, which actually took place around 1908.

J.J. “Jake” Gittes, a low-rent divorce detective, is hired by the wife of the LA water commissioner to follow her husband, who, she claims, is cheating on her – at a time when Los Angeles is suffering from severe water shortages. We follow Gittes as he gradually uncovers secrets that reveal layer upon layer of corruption and deception. The wickedness revealed is staggering, as we are slowly exposed to the corruption of politics, money, sex, innocence and even the land itself.

And although the central crime in the story is institutionalised patriarchal rape, the film, as Margaret Leslie Davis says in her 1993 book Rivers in the Desert: William Mulholland and the Inventing of Los Angeles, is also a sexually charged metaphor for the “rape” of the Owens Valley.

When Jake finally solves the mystery, he is incapable of righting the wrongs he has discovered. In the end, there is just a sense of futility and powerlessness in the face of such absolutely corrupt and unassailable power.

As his partner says: ‘Forget it, Jake. It’s Chinatown.’

Dark Matter


As the twentieth century drew to a close, Western science became more and more aware of just how much it did not know. Indeed, in a complete reversal of Victorian thinking – which had seen everything as knowable, with a complete Theory of Everything just over the horizon – scientists now began to realise that the vast majority of the universe is made up of stuff that we cannot see, detect or even comprehend.

The first inkling scientists had that there might be more mass in the universe than was previously realised came in the 1970s, when Vera Rubin, a young astronomer at the Department of Terrestrial Magnetism at the Carnegie Institution of Washington, began observing the speeds of stars at various positions in their galaxies. Traditional Newtonian physics predicts that stars on the outskirts of a galaxy should orbit more slowly than stars nearer the center, and yet Rubin’s observations found that all the stars in a galaxy seem to circle its center at roughly the same speed. Research by other astronomers confirmed the anomalies that Rubin had found and, eventually, based on observations and computer models – and in true Kuhnian fashion – a paradigm shift began to take place, as scientists concluded that there must be much more matter in galaxies than that which is visible or detectable. They called this material dark matter, and estimated that it accounted for around 23% of the universe.
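
The anomaly is easy to make concrete with a little Newtonian arithmetic. In the sketch below (illustrative Python; the galaxy mass and the ‘flat’ speed are round numbers of roughly the right order, not real survey data), the predicted orbital speed falls with radius, while a flat observed curve implies more and more enclosed mass.

```python
# A minimal sketch of the rotation-curve anomaly (illustrative values).
# Outside most of a galaxy's visible mass, Newtonian gravity predicts
# orbital speeds falling as v = sqrt(G * M / r); Rubin's measurements
# instead showed roughly constant speeds at large radii.
import math

G = 6.674e-11       # gravitational constant (m^3 kg^-1 s^-2)
M_visible = 2e41    # illustrative visible mass of a galaxy (kg)

def keplerian_speed(r):
    """Predicted orbital speed if only the visible mass is present."""
    return math.sqrt(G * M_visible / r)

def implied_mass(r, v_observed):
    """Mass enclosed within radius r implied by an observed speed."""
    return v_observed**2 * r / G

v_flat = 220e3      # a typical observed 'flat' orbital speed (m/s)
for kpc in (5, 10, 20, 40):
    r = kpc * 3.086e19   # kiloparsecs to metres
    print(f"r = {kpc:>2} kpc: predicted {keplerian_speed(r)/1e3:5.0f} km/s, "
          f"observed ~{v_flat/1e3:.0f} km/s, "
          f"implied mass {implied_mass(r, v_flat)/M_visible:4.1f}x visible")
```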


In the forty years that have followed, scientists still haven’t been able to establish what dark matter is actually made of. But dark energy, a more recent discovery, is possibly even more mysterious. In the mid-1990s, two teams of researchers were studying distant supernovae to determine how fast the universe was expanding at various points in its lifetime. Based on the prevailing paradigm, astronomers had predicted two possibilities: either the universe has been expanding at roughly the same rate throughout time, or its expansion has been slowing down as it gets older. Shockingly, the researchers observed neither. Instead, the expansion of the universe appeared to be accelerating.

If the Big Bang theory is true, all the gravity of all the mass in the cosmos should have been pulling the universe back inward, just as gravity pulls a ball back to Earth after it has been thrown into the air. There was clearly some other force operating on a cosmic scale that was counteracting the force of gravity. This force has been called dark energy, and it is estimated to account for 72% of the mass-energy of the universe.

Together, then, dark energy and dark matter account for an extraordinary 95% of the mass-energy of the universe. That which we can see, detect and attempt to comprehend amounts to a mere 5%.
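
For the sake of clarity, the arithmetic behind these figures can be laid out in a few lines. This is a quick sketch using the article’s round numbers; the exact percentages vary from survey to survey:

```python
# A quick check of the cosmic budget quoted above.
# These are the article's round numbers; published estimates vary by survey.
dark_matter = 0.23   # fraction attributed to dark matter
dark_energy = 0.72   # fraction attributed to dark energy

dark_total = dark_matter + dark_energy
visible = 1.0 - dark_total

print(f"Dark matter + dark energy: {dark_total:.0%}")   # 95%
print(f"Observable remainder:      {visible:.0%}")      # 5%
```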

Knowledge vs Ignorance


With the screening of True Detective in 2014, the medium of television finally confirmed its position as the creative equal – some would say the creative superior – of its older sibling, the cinema.

Traditionally, cinema has looked down on television as a lesser medium, with actors, writers and studio executives regarding movement from the small screen to the large as career advancement, and movement from the large to the small as, well, not a good idea…

But that was before the extraordinary commercial and critical success of television series like The Sopranos, The Wire, and Breaking Bad. Not to mention the great Scandinavian mystery sagas like The Bridge or The Killing.

These television series have taken the core strength of the medium – the opportunity to tell stories, develop characters and expand upon themes over a much longer time span than the mere couple of hours that a conventional movie affords – and dramatically exploited this advantage to create an exciting new kind of narrative.

So much so that critics like Brett Martin, in ‘Difficult Men: Behind the Scenes of a Creative Revolution’, believe that these television drama series have “become the significant American art form of the first decade of the 21st century, the equivalent of what the films of Scorsese, Altman, Coppola, and others had been to the 1970s…”

But it was really only when Hollywood A-listers Matthew McConaughey and Woody Harrelson made a play for their own small-screen franchise, with True Detective, that it became clear that some kind of tipping point had been reached.

In the same year that McConaughey won his Oscar for Best Actor, he and his good friend Woody Harrelson launched True Detective through the HBO network. They teamed up with the highly regarded author and screenwriter Nic Pizzolatto, and the equally respected film director Cary Joji Fukunaga, to produce a drama that features some of the finest writing, acting and cinematography to appear on any screen… large or small.

Combining the core strengths of each medium – the lavish production values of a full-scale cinematic production (remarkably, Cary Joji Fukunaga insisted on shooting the whole drama on film) and the leisurely pace that television affords (the entire drama is eight hours long) – they created a narrative structure that expands on many levels to explore grand themes such as the nature of truth, the nature of belief, and the nature of space and time, through a number of damaged male characters set amid the poisoned and polluted post-industrial landscapes of the Gulf Coast of Louisiana.

True Detective is many things at once: a powerful character study of damaged people in damaged landscapes, a gripping murder mystery, a tour de force for Woody Harrelson and Matthew McConaughey. But first and foremost it is about knowledge and ignorance.

There is a scene at the heart of True Detective that shows this in a startling and profoundly moving way.

Rust Cohle, played by Matthew McConaughey, is an ex-cop from Texas and a flickering ghost of a man. Still haunted by the loss of his two-year-old daughter, who died tragically more than twenty years ago, he is now a dishevelled-looking alcoholic, being interviewed by two Louisiana detectives who want to know about a murder investigation he was involved in seventeen years earlier.

We learn that the case notes were all lost in the wake of Hurricane Rita, and that we now have to rely on the testimony of Cohle – and of his ex-partner Marty Hart, whom the two detectives interview separately – to understand what happened all those years ago.

Cohle treats his interviewers with the detached disdain of a man who has ventured far beyond the realms of conventional society and has little need of its illusory comforts or, indeed, any of its social niceties.

Knowing how much they need his testimony, he insists on being allowed to smoke in their non-smoking office, and to drink his choice of Texan beer, before he explains anything further.

In contrast, Cohle’s ex-partner, Marty Hart, comes across as a regular guy, a good ol’ boy with simple Bible Belt family values. Unlike Cohle, he appears to be a well-adjusted member of society who exudes bonhomie and treats his interviewers with genial professionalism.

During the course of the two interviews, however, we begin to see that Cohle and Hart are not all that they seem. Cohle is a detached outsider who seeks knowledge at any price. Intolerant of any kind of falsehood or superstition, he wants to know the truth, no matter how uncomfortable, inconvenient or downright dangerous that truth might turn out to be.

Marty Hart, on the other hand, lacks knowledge: not just of the world around him but, more importantly, of himself. Critically, he fails to understand those who should be closest to him, including his wife – to whom he is unfaithful – and his two daughters – from whom he is profoundly absent, even when he is present in their company.

With a stack of empty beer cans in front of him, Cohle asks the two detectives if they are familiar with M-theory – the latest attempt by theoretical physicists to create a theory of everything and reconcile the conflicting requirements of quantum mechanics and general relativity.

Cohle says: ‘You ever heard of something called M-brane theory, detectives?’
One of the detectives responds noncommittally: ‘No. That’s over my head.’
Cohle says: ‘It’s like, in this universe, we process time linearly forward, but outside of our spacetime, from what would be a fourth-dimensional perspective, time wouldn’t exist. And from that vantage, could we attain it, we’d see our spacetime look flattened…’

At this point Cohle, with an empty beer can in one hand, extends his arms, brings his hands together and crushes the can into a flat silver disc. Cohle continues: ‘…like a single sculpture with matter in a superposition of every place it ever occupied, our sentience just cycling through our lives like carts on a track. See, everything outside our dimension, that’s eternity, eternity looking down on us. Now, to us, it’s a sphere, but to them it’s a circle.’

We cut to Marty Hart’s two daughters, aged seven or eight, sitting on the front lawn of their suburban house. They are dressed in princess outfits – perhaps they are waiting for their dad to come home – but they have grown bored and are now bickering with one another. In frustration, one of the girls grabs the other’s tiara and throws it high up into the tree above them. The camera follows the tiara up into the tree and waits. And waits. We sense that time has passed before, eventually, the camera returns to its original position to reveal that the lawn is now empty. A car pulls into the driveway, the passenger door opens and a clatter of empty beer cans falls out, followed by a teenage girl who is so drunk she can barely stand.

It is Marty Hart’s daughter.

When Cohle first starts talking about M-theory, you would be forgiven for thinking that this was nothing more than the ramblings of the town drunk, happy to waste police time for a few free beers. The film language, however, tells us something very different. By telescoping time, as if to echo the collapsing of the beer can, the edit shows us what Cohle knows and Hart doesn’t: that time spent away from his young family is time that can never be recovered, and that this ignorance will have far-reaching consequences for all of them.

The missing particle

Just a few kilometres to the north-east of Geneva, nestling in the foothills of the Jura Mountains, there is a massive underground research facility which would not look out of place in a James Bond movie. It is called CERN (the European Organization for Nuclear Research), and it is here that some of the world’s most brilliant scientists – people who like to invent things like the World Wide Web in their spare time – are probing the fundamental structure of the universe. Here, in a vast underground complex with a 27-kilometre circumference, rather modestly called the Large Hadron Collider, they smash particles of matter together at close to the speed of light in order to provide insights into the fundamental laws of nature – and, specifically, to confirm the existence of an elusive sub-atomic particle known as the Higgs boson.

The Large Hadron Collider took about a decade to construct, at a total cost of about $4.75 billion. Electricity alone costs about $23.5 million per year, and the total operating budget runs to about $1 billion per year. The search for the Higgs boson involved the work of almost 3,000 physicists from 169 institutions, in 37 countries, across five continents.

On the 4th of July 2012, the Italian particle physicist Fabiola Gianotti, spokesperson for the ATLAS experiment at the Large Hadron Collider, announced success. But as soon as she did so, she said: “We need more data.” As Stuart Firestein, who chairs the Department of Biological Sciences at Columbia University, puts it in Ignorance: How It Drives Science:

“We need more data.” With these words, Fabiola Gianotti wrapped up the triumphant announcement that the elusive Higgs boson particle had been detected. Gianotti is the physicist in charge of the experiment at the Large Hadron Collider where this unveiling was made. She added “surprise, surprise” to the end of that sentence, not as a damp squib, or faux humility, nor a beg for more grant money. She said these words because she understands that science is a process not a bank of knowledge…

As a culture, we have a voracious appetite for information. And it is perhaps the ultimate irony that, just as the digital revolution has given every one of us easier and faster access to exponentially larger amounts of information, we are only now becoming aware of just how much we do not know – and, in fact, just how much we cannot know.

Put simply, we are beginning to realise that larger and larger amounts of information do not necessarily guarantee larger and larger amounts of knowledge – indeed, there is evidence to suggest that in many cases the opposite may be true. This phenomenon is often described as ‘the illusion of knowledge’.

The illusion of knowledge and its counterpart, the ignorance of ignorance, are two of the most important philosophical ideas of the digital age. They found their simplest and best articulation in the rather unlikely form of a statement by Donald Rumsfeld, the then United States Secretary of Defense, at a press briefing in February 2002, when he responded to a journalist who had asked him about the lack of evidence for weapons of mass destruction in Iraq. What Rumsfeld said was:

‘…there are known knowns; there are things that we know that we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.’

Despite being the subject of much derision at the time – the Plain English Campaign gave Rumsfeld its Foot in Mouth Award – Rumsfeld’s statement has come to be seen by many as (however unintentionally) one of the best summaries of the problem of quantifying ignorance, and of the impact that ‘unknown unknowns’ may have on our world.

‘Unknown unknowns’, or the ignorance of ignorance, are central to the thinking of Nassim Taleb and his use of the 19th-century philosopher John Stuart Mill’s metaphor of the black swan.

‘A black swan’ was a common expression in 16th-century London as a statement of impossibility, much as we use the phrase ‘flying pigs’ today. This was based on the presumption that, because all swans ever observed in the Northern Hemisphere were white, ALL swans in the world must be white. After the Dutch explorer Willem de Vlamingh discovered black swans in Western Australia in 1697, the term metamorphosed to mean that a seeming impossibility might later be disproven.

The idea of black swan events – ones which cannot be predicted but which have a massive impact on human history – was introduced by Nassim Nicholas Taleb in his 2001 book Fooled by Randomness. His 2007 book The Black Swan extended the idea, arguing that almost all major scientific discoveries, historical events and artistic accomplishments are, in fact, “black swans”. The Internet, the personal computer, World War I, the dissolution of the Soviet Union and the September 11 attacks are all examples of black swan events.

As a Lebanese whose family home was destroyed by an unforeseen war, Nassim Taleb is more aware than most of the dangers of ignoring ‘unknown unknowns’, and refers to our attitude to ignorance as epistemic arrogance, or ‘our hubris concerning the limits of our knowledge.’

Stuart Firestein eloquently describes how the presence of these ‘unknown unknowns’ means that science can never be a finite process, but is instead an ongoing one that is constantly revising itself:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

Three decades ago, Stephen Hawking famously declared that a “theory of everything” was on the horizon, with a 50 per cent chance of its completion by the year 2000. By 2002, however, Hawking had changed his mind: in a lecture entitled Gödel and the End of Physics, he described how he no longer believed that a “Theory of Everything” was possible, given Gödel’s incompleteness theorems, which show that a mathematical system can be consistent or complete, but never both.

In The Grand Design, written with Leonard Mlodinow, Stephen Hawking explains how, in the early 1990s, string theory was struggling with a multiplicity of separate and distinct theories: instead of a single theory of everything, there seemed to be five. Beginning in 1994, though, physicists noticed that, at low energies, some of these theories were mathematically equivalent to one another, suggesting that they might simply be different descriptions of the same thing. Eventually, one string theory was shown to be mathematically equivalent to 11-dimensional supergravity, a theory that described not only strings but membranes too. Many physicists now believe that this supergravity theory is actually one piece of a hypothetical ultimate theory, which they call M-theory, of which all the different string theories offer us the merest glimpses.

Thus, according to Hawking and Mlodinow, the only way to understand reality is to employ a philosophy called “model-dependent realism”, in which we accept that we cannot ever attain a single comprehensive theory of the universe. Instead, science offers many separate, and sometimes overlapping, windows onto a common reality.

Great science is not just there to answer questions and provide explanations. Indeed, the works of Gödel, Schrödinger, Heisenberg and Einstein have, perhaps, all raised greater questions than they have answered.

And that is something we should all celebrate rather than deny. For as Thomas Kuhn has shown, so much of experimental physics is simply puzzle-solving within an existing paradigm, whereas what we should be doing is exploring the inconsistencies that may point the way towards a new paradigm. The late Victorians had longed for a science that would provide the answer to life, the universe and everything. And although this yearning was to find its perfect expression in the fiction of the scientific detective, it would never be realised in the real world of experimental science.

Detective fiction exists to solve and explain puzzles; perhaps the ultimate purpose of great science is exactly the opposite – not to solve mystery… but to create it.

As Professor Stephen Hawking puts it:

‘Some people will be very disappointed if there is not an ultimate theory that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind. I’m now glad that our search for understanding will never come to an end, and that we will always have the challenge of new discovery. Without it, we would stagnate. Gödel’s theorem ensured there would always be a job for mathematicians. I think M theory will do the same for physicists…’

Categories
Uncategorized

Brer Rabbit and the ghosts of slavery

With palm trees swaying gently in the breeze and surf crashing upon miles and miles of golden beaches, Cape Coast is, without doubt, one of the most beautiful places in West Africa.

But there is a long dark shadow that falls across these golden sands.

For here, standing on a rocky promontory like some brooding African Elsinore, is an ancient fortification called Elmina Castle, which for almost four hundred years was an important link in the North Atlantic slave trade.

Elmina Castle was built by the Portuguese in 1482 as São Jorge da Mina Castle, and is the oldest European building south of the Sahara. First established as a trade settlement, the castle later became the portal through which millions of Africans were forced to pass on their way to the slave markets that lay over the horizon in the Americas.

A number of years ago I travelled here with a good friend of mine called Winston, a West Indian who had been born on the small Caribbean island of St Vincent, and whose ancestors had been slaves. Upon our arrival we met a local teacher called George, who offered to show us around.

The castle itself is a dark, oppressive place. The pain and the suffering of hundreds of years of human trafficking seem to be soaked into the very fabric of the walls.

Within the bowels of the castle is a doorway known as “The Gate of No Return.” Looking out from the darkness through this doorway, you can see the surf crashing on the golden beaches below… it feels like a portal to another world.

And for millions of Africans it was just that.

As the American writer, Emily Raboteau, puts it so powerfully in a piece called “The Throne of Zion. A Pilgrimage to São Jorge Da Mina, Ghana’s Oldest and Most Notorious Slave Castle”:

This, then, was the door. It struck me as vaginal. You passed through it and onto a ship for Suriname or Curaçao,… You passed through it, lost everything, and became something else. You lost your language. You lost your parents. You were no longer Asante or Krobo, Ewe or Ga. You became black. You were a slave. Your children inherited your condition. You lost your children. You lost your gods, as you had known them. You slaved.

And born into a new life, all that they carried with them were their memories and their stories. Everything else they were forced to leave behind.

Eventually we had had enough of this dark place, and we were all quite relieved to get out into the late afternoon sunshine. George suggested a place a little way back down the coast where we could get a cold beer.

Soon we were sitting outside a small wooden bar on the beach, with the dark castle cast in silhouette as the sun set over the vast Atlantic Ocean.

As the light began to fall, George talked about the local culture and, in particular, started to tell stories about the spider god Anansi.

Surprisingly, it soon emerged that Winston knew similar stories, having learned them from his grandmother as a child on the island of St Vincent.

Winston told one of his Anansi stories. Then George told one of his. And then Winston responded with another.

This went on for a while, until something quite extraordinary happened:

Winston told a particularly funny Anansi story…

And George was laughing because it was one that he had never heard before.

And at that moment we understood.

This story must have been carried over the Atlantic Ocean to the Caribbean island of St Vincent at some point in the past four hundred years – probably after passing through the castle which stood so ominously behind us in the gathering darkness.

It had been passed down from generation to generation, until Winston brought it back to Africa, to share with us that evening.

The fact is that these Brer Rabbit and Anansi stories have the most amazing ability to travel across vast swathes of space and time. And media.

Which is why these trickster tales are alive and well, and still being shared on a daily basis.

Many of the original books are out of print, the movie “Song of the South” is deemed by the executives at Disney to be too politically sensitive to be re-released, and there have been many attempts to make the stories more socially acceptable by removing the Uncle Remus character and the use of the Gullah language. And yet these stories are flourishing, not in traditional media, but in that original social media… the shared oral tradition.

The rabbit who survived the Black Holocaust, may well have a few more surprises for us yet.

Nobody knows exactly how old Brer Rabbit really is, but he is clearly many, many hundreds of years old.

He was first smuggled across the Atlantic to America in the stories told by African slaves, and there he found fame and fortune in popular books and movies, going on to become a character beloved by generations of children around the world.

In more recent years, these books and movies have become mired in arguments about political correctness and have all but disappeared from the popular imagination. But, remarkably, the ancient oral storytelling tradition that gave birth to this character keeps his adventures alive to this very day.

The Atlantic slave trade was a human tragedy on a scale like no other. The “Black Holocaust” or “Maafa” (a word derived from the Swahili term for “disaster”, or “great tragedy”) lasted for almost four hundred years, and although we have no way of knowing exactly how many people died as a result, many modern historians estimate a staggering death toll of at least ten million men, women and children.

The most deadly part of the journey was the notorious “Middle Passage” where prisoners were held below decks in slave ships for months as they crossed the Atlantic Ocean.

Despite the appalling conditions in which they were transported, it is thought that around eleven million Africans survived the journey to become slaves in the Americas.

The majority came from the west coast of Africa, and from at least 45 separate ethnic groupings. These included the BaKongo, the Mandé, the Akan, the Wolof, the Igbo, the Mbundu and the Yoruba, to name but a few.

Mostly these people would have spoken one of the Niger-Congo family of languages (these days, some 85 per cent of the population of Africa speak a Niger-Congo language). However, it is estimated that there are at least 1,400 distinct Niger-Congo languages.

Huddled together, in chains, in the darkness of the great slave ships, many of these people could not even talk with one another.

Over the years, West African Pidgin English, also called Guinea Coast Creole English, became the lingua franca along the West African coast.

This language began its life among slave traders doing business along the coast, but it quickly spread up the river systems into the West African interior because of its value as a common trade language between different tribes – even amongst Africans who had never seen a white man.

It is still spoken to this day in West Africa.

Slaves in the Americas found West African Pidgin English as useful a common language on the plantations as it had been back home in West Africa as a trading language. And when they had children, these too adopted their own version of West African Pidgin English as their native language, creating a number of American English-based creole languages.

One of these creole languages is Gullah, which is still spoken today by about 250,000 people in the southern United States, specifically on the coasts of South Carolina and Georgia.

And it was in the Gullah language that a young Irish American called Joel Chandler Harris first heard the animal stories and songs that were to bring him worldwide fame with the tales of Brer Rabbit.

Joel Chandler Harris was a journalist who wrote for a newspaper called “The Constitution” in Atlanta, Georgia, in the years immediately following the American Civil War – a war that had destroyed so much of the South, and had wreaked devastation on Atlanta in particular.

Harris published his first Brer Rabbit tale, “The Story of Mr. Rabbit and Mr. Fox as Told by Uncle Remus”, in a phonetic version of the Gullah language, in the July 20, 1879 issue of the newspaper, under the heading “Negro Folklore”. He would publish 184 more of these tales over the next 27 years.

He became a household name, not just across the States but around the world, with readers who delighted in these strange tales told in the creole language of Gullah.

Because of this, Joel Chandler Harris’s position amongst American men of letters at the start of the 20th century was second only to that of Mark Twain.

And his influence on other writers was equally far-reaching; the children’s literature analyst John Goldthwaite has said that the Uncle Remus tales are “irrefutably the central event in the making of modern children’s story.” In terms of content, their influence on children’s writers such as Rudyard Kipling, A.A. Milne, Beatrix Potter and Enid Blyton is substantial. Not to mention their stylistic influence on modernism in the works of Pound, Eliot, Joyce and Faulkner.

And yet, today, few children would recognize the name Uncle Remus, let alone that of Joel Chandler Harris.

In the late 1960s most Brer Rabbit books were removed from schools and libraries in the States because they were deemed racist. And despite the enduring popularity of its signature song, “Zip-a-Dee-Doo-Dah”, the Disney movie “Song of the South”, which was based on these stories, has not been seen in its entirety for over fifty years, and has never been released on home video or DVD.

In 1981 the writer Alice Walker, author of “The Color Purple”, accused Harris of “stealing a good part of my heritage” in a blistering essay called “Uncle Remus, No Friend of Mine”. Strangely enough, and to be fair to Joel Chandler Harris, he would probably have agreed with much of what Alice Walker had to say.

Crucially, Harris saw himself as an ethnographic collector of the oral traditions of these former slaves, rather than an original author of fictional literature in the style of Mark Twain. His tales were roundly praised by the leading folklore scholars of the day. He became intrigued by the new “science of ethnology” and became a charter member of the American Folklore Society (along with Twain). As he began to fill his library with ethnological texts, journals and folklore collections, he was struck by the fact that the tales he was collecting bore striking resemblances to tales from cultures in other parts of the world.

Which they clearly do.

In English, “Brer Rabbit” means “Brother Rabbit”; likewise, “Brer Fox”, “Brer Bear”, “Brer Wolf” and “Brer Buzzard” are, in fact, “Brother Fox”, “Brother Bear”, “Brother Wolf” and “Brother Buzzard”.

As such, the names of these characters betray their very ancient origins in Western Africa.

As Karen Armstrong has pointed out in her brilliant “Short History of Myth”, pre-agrarian, hunter-gatherer societies exhibit a strong sense of identification with all living creatures, particularly those that are hunted for food. Seeing all animals as siblings is a common expression of this perception.

Brother Rabbit is a trickster, and as such is another iteration of Brother Spider, or Anansi. Brer Rabbit tales, like the Anansi stories, depict a physically small and vulnerable creature using his cunning intelligence to prevail over larger animals. Brer Rabbit originated in the folklore of the Bantu-speaking peoples of south and central Africa, whereas the Anansi tales, some of the best known in West Africa, are believed to have originated with the Ashanti people of Ghana.

Although many Brer Rabbit and Anansi stories are easily interchangeable, they often took on a whole new level of meaning on the plantations.

In the introduction to the first volume, Harris wrote: “…it needs no scientific investigation to show why (the Negro) selects as his hero the weakest and most harmless of all animals, and brings him out victorious in contests with the bear, the wolf, and the fox.” And Brer Rabbit, born into this world with “needer huff ner hawn” – neither hooves nor horns – has to use trickery to survive. The enjoyment of his amoral, and immoral, adventures was made all the more fun as a thinly veiled code for the black slave out-foxing his white masters.

It has been said that these stories were usually told by one adult to another; children, if they were lucky, would get to listen in.

And the adult tone of many of the stories reflects this. Stealing, lying, cheating, torture, savage beatings and even cold-blooded murder are normal fare for what has been described as “this malevolent rabbit”.

Take “The Sad Fate of Mr. Fox”, in which Brer Rabbit not only tricks Brer Fox into getting himself beaten to death by Mr. Man, but then takes Brer Fox’s severed head back to his wife, pretending that it is beef for her soup pot. Another story has Brer Rabbit slowly scalding Brer Wolf to death, while yet another has him killing Brer Bear by engulfing him in a swarm of bees.

Several stories even touch on sex as a theme, usually with Brer Rabbit beating Brer Fox and the other animals to the attentions of “Miss Meadows and de gals”, who then make merry in a thinly disguised brothel.

Br'er Fox and Br'er Bear from Uncle Remus, His...

But perhaps no tale better demonstrates Brer Rabbit’s supreme wickedness than “Mr. Rabbit Nibbles Up the Butter”, in which “lumberin’” Brer Possum is burned to death in his own fire. The little white boy who is listening to old Uncle Remus tell this dark tale protests indignantly that, since Brer Rabbit stole the butter, he should be the one to be punished for it, not poor Brer Possum. To which Remus shrugs and says: “In dis worl’, lots er folks is gotter suffer fer udder folks sins.”

Categories
Artificial Intelligence Bandwidth of Consciousness Communication Information Theory Judging Creativity Metaphor Narrative Subliminal Communication The Creative Process

Towards a new definition of creativity

It is said that at the very edge of medieval maps, the mapmaker would write the Latin words ‘Hic sunt dracones’, or ‘Here be dragons’, to describe an area beyond the limits of human knowledge.

In 2013 it was estimated that the global creative economy was worth more than US$ 624 billion and over the previous decade had grown, on average, more than 8 per cent every year.

Creative Industries Mapping projects have been rapidly deployed in countries all over the world, in an attempt to get a clearer picture of this recently discovered and rapidly growing economic sector.

And yet each of these attempts to map the creative sector has one major omission. This omission is not on the periphery, as with the ancient maps of the world, but right at the very centre: the lack of a clearly defined understanding of what creativity is in the first place.

The reason for this is that creativity has its origins in a part of the human mind that, until recently, we have known precious little about: and that is the unconscious.

Over the last few years, an extraordinary revolution has taken place in our understanding of the role of the unconscious, and now, with the help of some of the latest discoveries in neuroscience, we can at last begin to map this mysterious unmapped area.

But to do this we need to venture deep into the human imagination. Here be dragons indeed…

THE ENGINE OF CULTURAL CHANGE

Creativity is the engine of cultural change, a force that can rapidly transform the lives of individuals as well as the societies within which they live.

It is not surprising, therefore, that creativity has become a key strategic priority for corporations, educational institutions and governments all over the world.

According to a 2010 report from the British-based multinational bank Standard Chartered, the most productive economies in the world will, in the future, be those that successfully leverage the creativity of their workforces. The report concluded that:

‘Creativity may be the most powerful of all the resources to be rich in. With vast numbers of people entering the workforce, huge improvements in productivity, and continued globalisation, the rewards for innovation and creativity will become even greater’.

By 2013 it was reported that world trade in creative goods and services had reached a record US$624 billion in 2011, more than doubling between 2002 and 2011, with an average annual growth rate over that period of 8.8 per cent.
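
As a quick sanity check on those figures, compound growth of 8.8 per cent a year does indeed more than double a total over the nine years from 2002 to 2011. A minimal sketch (the implied 2002 base is derived here for illustration only, not a figure from the report):

```python
# Does 8.8% annual growth more than double world trade between 2002 and 2011?
growth_rate = 0.088
years = 2011 - 2002              # nine annual growth periods

multiple = (1 + growth_rate) ** years
print(f"Growth multiple over {years} years: {multiple:.2f}x")   # ~2.14x

# Working backwards from the reported 2011 figure of US$624 billion
# (an illustrative derivation, not a figure from the report):
print(f"Implied 2002 base: ~US${624 / multiple:.0f} billion")   # ~US$292 billion
```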

Because of this unprecedented growth in the Creative Economy, most countries have engaged in Creative Economy Mapping projects to try and gauge the scale and value of this sector.

In 2001, the UK became the first European country to launch an annual Creative Industries Mapping project, defining the sector as:

those industries which have their origin in individual creativity, skill and talent and which have a potential for wealth and job creation through the generation and exploitation of intellectual property

By 2013 they estimated that the sector was growing much faster than the rest of the economy, was worth a staggering £76.9 billion, and accounted for 5.0 per cent of the total economy. Furthermore, they estimated that the creative economy now employed more than 1.8 million people, the equivalent of 1 in 12 jobs.

This widespread recognition of the strategic importance of creativity has, in turn, led to urgent calls for a radical overhaul of the education system. Sir Ken Robinson, a leading British educationalist, makes the case for a new kind of system in his TED talk Do Schools Kill Creativity?, and it is a measure of interest in the subject that this talk has attracted more than 33 million views, making it the most popular TED talk ever.

In Creative Schools: The Grassroots Revolution That’s Transforming Education, Sir Ken explains that the educational system we have today is a relic of the Industrial Revolution, created, as it was, to educate the emerging bureaucratic class:

Industrialism needed a lot more manual workers than it did college graduates. So mass education was built like a pyramid, with a broad base of compulsory elementary education for all, a smaller sector of secondary education, and a narrow apex of higher education.

Richard Florida, in The Rise of the Creative Class, makes a similar point when he shows how the old bureaucratic class created by the Industrial Revolution has been gradually replaced by a new class of creative workers that he calls ‘The Creative Class’. As a result, Florida argues, we are currently witnessing the transition from the Industrial Age to a new Creative Age.

Given that we live at a time when creativity is seen as a matter of such central importance, it is curious that our ideas about what creativity is, and where it comes from, remain so vague and ill-defined.

I believe that there are two key reasons for this lack of clarity.

Firstly, the concept of creativity is a relatively new cultural phenomenon. Indeed, the word was hardly ever used before the 1950s… with the aid of Google’s Ngram Viewer you can clearly see that the word hardly registers at all during the first half of the 20th century, and then increases steadily throughout the latter part of the century.


The second reason is that creativity has its origins in a part of the human mind that, until recently, we have known precious little about: and that is the unconscious.

Over the last few years, an extraordinary revolution has taken place in our understanding of the role of the unconscious, made possible by one new technology in particular: functional magnetic resonance imaging, or fMRI. Emerging in the 1990s, fMRI offers three-dimensional pictures of what takes place in the brain as it happens. For the very first time in human history, this allows us to link the activity of the abstract entity that is the human mind with the measurable activity of the physical brain.

The widespread adoption of fMRI has led to the development of a new scientific discipline called Social Neuroscience, which along with Behavioural Economics, has begun to reveal the extraordinary role the unconscious plays in our daily lives.

We humans like to think of ourselves as uniquely rational beings, and of human thought as a deliberate, conscious activity; and yet in recent years the combination of these two disciplines has increasingly forced us to acknowledge that human thought is, for the most part, driven by unconscious processes.

And nowhere is the impact of this insight more significant than in the field of creativity.

THE TRADITIONAL DEFINITION OF CREATIVITY

The Encyclopedia Britannica defines creativity as:

‘The ability to make or otherwise bring into existence something new, whether a new solution to a problem, a new method or device, or a new artistic object or form.’

However, when you examine this definition a little more closely, you can see that it actually contains two separate definitions of creativity: on the one hand there is ‘a new solution to a problem, a new method or device’, and on the other there is ‘a new artistic object or form’.

These two types of creativity are fundamentally different, and one of the biggest problems with any discussion of creativity is that we tend to confuse them.

For this reason, I believe it is best to draw a distinction between the two types of creativity, by describing them as Technical Creativity on the one hand and Expressive Creativity on the other.

THE TWO TYPES OF CREATIVITY

Technical Creativity and Expressive Creativity tend to have very different cultural associations.

Technical Creativity, the type that delivers ‘a new solution to a problem, a new method or device’, is closely associated with innovation and invention.

A centre for Technical Creativity would be somewhere like Silicon Valley where the entrepreneurial engineering culture tends to focus on innovation and invention rather than individual self-expression.

Expressive Creativity, on the other hand, is the kind of creativity that we tend to think of when we think of art, music, dance, literature or film; this is essentially creativity as communication and, generally speaking, it does not exist without an audience.

A centre for Expressive Creativity would be somewhere like Hollywood where the actors, writers, set designers, musicians, cinematographers and directors display a greater interest in self-expression and communication than in innovation and invention.

If the world of Technical Creativity tends to be more rational and mechanical, the world of Expressive Creativity tends to be more intuitive and emotional. For this reason, the impact of Technical Creativity tends to be tangible and measurable whereas the effects of Expressive Creativity tend to be intangible and much harder to measure – indeed as we shall see, this is because the effects of Expressive Creativity are largely unconscious.

THE NEW UNCONSCIOUS AND THE BANDWIDTH OF CONSCIOUSNESS

Our understanding of the nature of the unconscious has radically changed in recent years; so much so, that many prefer to call it ‘The New Unconscious’.

Of course, the name that is most commonly associated with the traditional view of the unconscious is that of Sigmund Freud; it was his dark vision of a seething morass of repressed desires that mysteriously influence behaviour that was to dominate popular understanding of the subject for the first half of the 20th century.

Recent studies, however, have shown that the unconscious has a far greater influence on how we think and behave than even Freud could have ever imagined.

Every waking second, our brains sift through millions of bits of data in order to make sense of the world around us; and we now know that the vast bulk of this processing takes place in the unconscious.

The unconscious, it seems, is the central processing unit of the human brain, and it is estimated that it processes 12 million bits of sensory data every second. In contrast, it is estimated that the conscious mind can only process around 40 bits of information a second.

The modest little 40-bit processor that is our conscious mind is, therefore, capable of handling only the most infinitesimal fraction of the 12 million bits of data that flood in from the senses every second.
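
A minimal sketch of what those two figures imply, using only the numbers quoted above:

```python
# The 'bandwidth of consciousness' figures quoted above, made concrete.
conscious_bps = 40            # bits per second handled consciously
unconscious_bps = 12_000_000  # bits per second handled unconsciously

ratio = unconscious_bps / conscious_bps
share = conscious_bps / unconscious_bps

print(f"The unconscious processes {ratio:,.0f}x more data")    # 300,000x
print(f"Conscious share of the incoming stream: {share:.5%}")  # ~0.00033%
```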

This is why the vast majority of what we experience is processed unconsciously, and only referred to the conscious mind on a ‘need to know’ basis.

The predominance of the unconscious over the conscious mind was dramatically demonstrated by recent research showing that, at least seven seconds before you consciously make a decision, the outcome can be predicted from the unconscious activity in your brain.

In other words, the unconscious ‘makes up its mind’ a full seven seconds before we are consciously aware that we have decided to do something.

What’s more, by analysing the unconscious brain activity produced while the decision was being made, the researchers could predict, with accuracy well above chance, what choice people would make, seven seconds into the future, before they themselves were even aware of having made a decision.

It has been widely noted, therefore, that the conscious brain is not – as we have always thought – an executive office like the White House Oval Office, making all the decisions; rather, it is more like the White House Press Office, issuing explanations for decisions that have already been taken elsewhere.

The discovery that the executive function at the heart of human consciousness is to be found not in the conscious mind but in the unconscious is a paradigm shift, and one that fundamentally affects our understanding of our place in the cosmos. Indeed, it would be no exaggeration to say that this discovery is on a scale with the paradigm shift that occurred when humans first discovered that it is the Earth that revolves around the Sun rather than the Sun that revolves around the Earth. The implications for human consciousness and human creativity are equally momentous.

THE ROLE OF THE NEW UNCONSCIOUS IN THE CREATIVE PROCESS

The more complex the creative problem to be solved, the more important the part that the unconscious plays.

The conscious mind, as we have seen, can only process information at around 40 bits per second; so in order to solve a complex problem, we need to use the far more powerful unconscious, which can process information at 12 million bits per second.

One of the first to recognise the importance of the unconscious in the creative process was the French mathematician and theoretical physicist Henri Poincaré (1854–1912). Through a process of trial and error, Poincaré found that consciously considering a complex problem would only get him so far, and that it was only after putting the task aside for a while that the answer would come to him. Saying ‘it is by logic that we prove, but by intuition that we discover’, Poincaré developed a daily routine based on this insight, consciously working for just two hours in the morning and two hours in the evening, and giving over the remainder of the day for the unconscious to mull things over.

Using this technique, and by harnessing the processing power of the unconscious mind, Poincaré was able to imagine mathematical structures of the most extraordinary complexity.

Indeed, his ideas went on to form the basis of Complexity Theory, a branch of mathematics that could only begin to be visualised in the late 1970s, when computers were finally built with enough processing power to bring Poincaré’s ideas to life.

By harnessing the power of the unconscious mind, Poincaré was capable of imagining mathematical structures that could only be visualised half a century after his death, with the arrival of computers with significant processing power in the late 1970s. Pictured is one such computer-generated image, known as a ‘Mandelbrot set’.

Poincaré’s ideas about creativity were to influence an entire generation, including such important Modernist figures as Pablo Picasso and Albert Einstein, but one person who was particularly influenced by Poincaré’s understanding of the role of the unconscious in the creative process, was an English social psychologist called Graham Wallas (1858 – 1932).

In 1926, Wallas published a book called The Art of Thought, in which he presented a model of the creative process, based on Poincaré’s method, consisting of four consecutive stages:

1) PREPARATION: The stage at which the problem is consciously ‘investigated in all directions’
2) INCUBATION: The stage at which the problem is put to one side in favour of unconscious processing
3) ILLUMINATION: The stage at which the creative idea surfaces into conscious awareness
4) VERIFICATION: The stage at which the idea is consciously verified, and then applied.

With minor modifications, this four-stage process of PREPARATION, INCUBATION, ILLUMINATION and VERIFICATION has since been widely recognised as the standard model for the creative process.

THE BANDWIDTH OF CREATIVITY

As Wallas’s four-stage process demonstrates, creative ideas appear after a conscious period of PREPARATION followed by an unconscious period of INCUBATION; and the amount of time and effort devoted to the problem at these two critical stages is usually reflected in the scale, complexity and, ultimately, value of the result.

Using what we now know about the ‘Bandwidth of Consciousness’ – i.e. that the conscious mind can only process around 40 bits of data every second whereas the unconscious processes around 12 million – it is therefore possible to judge the value of ideas on the basis of where they sit on a spectrum of Creative Bandwidth, according to the amount of unconscious processing power that has been applied.

Thus, an extremely low bandwidth idea is one that can be produced quickly and easily, primarily with the use of the 40 bits per second conscious mind, whilst, at the other end of the scale, an extremely high bandwidth idea is one produced by the 12 million bits per second non-conscious mind (where time and expertise will need to be invested at all stages of the process).


Low Bandwidth creativity tends to be formula or template based. For example, creating a picture with a ‘Painting by Numbers’ kit can be great fun and allows even the most artistically challenged person to create a painting, but the process requires little in the way of unconscious effort.

At the other extreme, the kind of creativity that emerges out of the depths of the unconscious – such as writing a novel or a screenplay – can be characterised as High Bandwidth Creativity.

In terms of Expressive Creativity, the more High Bandwidth it is, and the greater the unconscious processing that has taken place, the more likely the communication is to engage on a subliminal level.

In attempting to create a meaningful map of the new Creative Economy, the concepts of High Bandwidth and Low Bandwidth Creativity are useful in that they provide a means of evaluating the potential cost, as well as the possible financial value, of an abstraction like creative work, in the absence of any other tangible factors.

So, for example, we can see that Low Bandwidth creativity, being formula or template based, should be easier to produce, and therefore cheaper, than High Bandwidth Creativity.

METAPHOR AND SUBLIMINAL COMMUNICATION

A fascinating demonstration of how the information we receive ‘under the radar’ of consciousness affects both our attitudes and our behaviours was published in a paper called Ovulatory Cycle Effects On Tip Earnings By Lap Dancers.

This research showed that lap dancers earned dramatically more in tips ($335) when they were at their most fertile than when they were menstruating ($185) – roughly 80 per cent more. (Intriguingly, participants using oral contraception showed no such variation.)

Because the dancers are never allowed to talk to customers, it is clear that they communicate their fertility through dancing, or through some other subliminal means.

We have known for many years, through the study of body language, that the vast majority of what we communicate face to face is non-verbal and unconscious. With the aid of fMRI scans, we can now see that this is also true for all human communication.

When we experience High Bandwidth Expressive Creativity in the form of a movie or a piece of music, we may process the subject of the movie and the sounds of the music consciously, but it is the subliminal elements, the ones that speak to us ‘below the radar’ of consciousness, that determine how we actually experience the work.

David Byrne, the musician who fronted the band Talking Heads, describes this phenomenon in his recent book called ‘How Music Works’:

‘Music tells us things — social things, psychological things, physical things about how we feel and perceive our bodies… It’s sometimes in the words, but just as often the content comes from a combination of sounds, rhythms, and vocal textures that communicate… in ways that bypass the reasoning centers of the brain and go straight to our emotions. Music, and I’m not even talking about the lyrics here, tells us how other people view the world — people we have never met, sometimes people who are no longer alive — and it tells it in a non-descriptive way. Music embodies the way those people think and feel: we enter into new worlds — their worlds — and though our perception of those worlds might not be 100% accurate, encountering them can be completely transformative.’

These things ‘that music tells us’ are essentially non-verbal and are the result of the unconscious mind of one person communicating directly with the unconscious mind of another.

One of the most important ways that this happens is through the medium of metaphor.

The word ‘metaphor’ comes from the Greek ‘μεταφέρω’, which means ‘to transfer across’. Luggage trolleys in Athens airport feature the word ‘μεταφέρω’, since they transfer luggage from one place to another. Just like those luggage trolleys, metaphors are what we use to transfer meaning from one thing to another.


Metaphors appear to be a fundamental part of human cognition and communication. Human languages are composed, for the most part, of metaphors, even though we are largely unconscious of them. It is estimated that we use one metaphor for every ten to twenty-five words we speak. But metaphor is not limited simply to language. James Geary, who has written extensively on the subject of metaphors, says:

Metaphorical thinking—our instinct not just for describing but for comprehending one thing in terms of another… shapes our view of the world, and is essential to how we communicate, learn, discover, and invent. Metaphor is a way of thought long before it is a way with words.
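
That one-in-ten-to-twenty-five-words estimate can be turned into a rough rate for ordinary conversation. A back-of-the-envelope sketch, assuming a typical speaking rate of around 150 words per minute (the speaking rate is my assumption, not a figure from the text):

```python
# Rough rate of metaphor use in ordinary speech, from the figures above.
words_per_minute = 150          # assumed typical speaking rate
words_per_metaphor = (10, 25)   # one metaphor per 10-25 words (quoted above)

high = words_per_minute / words_per_metaphor[0]   # 15 metaphors per minute
low = words_per_minute / words_per_metaphor[1]    # 6 metaphors per minute

print(f"Roughly {low:.0f} to {high:.0f} metaphors per minute of speech")
```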

Thanks to a curious cognitive function of the human brain called ‘synaesthesia’, what we experience in one sense, like sound, can be metaphorically experienced in another, separate sense, like vision.

This is known as ‘The Bouba-Kiki Effect’.

So which is ‘Bouba’ and which is ‘Kiki’?

In 2001 the neuroscientist Vilayanur S. Ramachandran and the cognitive psychologist Edward Hubbard published a paper called Synaesthesia – A Window Into Perception, Thought and Language. Making up abstract names for two abstract shapes, the two researchers asked respondents: ‘Which of these shapes is Bouba and which is Kiki?’ 98% of people selected the bulbous, curvy shape as ‘Bouba’ and the spiky, jagged one as ‘Kiki’. The words Bouba and Kiki were made up and therefore had no meaning, but the researchers found the same consistent association between abstract shape and abstract sound regardless of language or culture.

‘When making associations like these, we instinctively find—or create—patterns. These patterns, in turn, connect the disparate sensory descriptions in synesthetic metaphors. These primal perceptual associations may be hardwired into our brains, since even very young children associate visual and auditory stimuli’

The ‘Bouba-Kiki Effect’ and the idea of synaesthetic metaphors are central to a rapidly growing body of consumer research, which is having profound effects on how we evaluate the subliminal experience of feel, colour, shape and sound in areas as diverse as packaging design, retail design, UX and UI. For example, a recent study drew attention to the subliminal impact of the diminutive sound of the letter ‘i’ in brand names in the grocery sector:

There are good grounds to believe that brands that are associated with small objects, and/or companies that are associated with low prices, would do well to include the (i) sound in their brand name. Indeed, this may be the reason why… the letter ‘i’ appears in so many of the names of those successful budget supermarket chains.

In which case, it could be argued that if a retailer like Tesco wanted to convince customers that its prices were as low as those of its discounter rivals, then rather than wasting time trying to overcome a subliminally held belief with a rational argument, it might do better simply to change its name to ‘Tesci’.

THE ROLE OF SUBLIMINAL METAPHOR IN MOVIES

We can see the effects of synaesthetic metaphors very clearly in the process of movie making.

When you sit in a darkened movie theatre, engrossed in the latest blockbuster, the vast majority of what you experience happens below the threshold of consciousness.

The human brain can process somewhere between 10 and 12 separate images per second. Each and every frame of a movie contains a vast amount of data, so with the average movie running at 24 frames per second, there is simply too much information for the conscious mind to take in.

This means that your unconscious mind processes the incoming data millions of times faster than your conscious mind, and so gets to see a lot more of the movie than you do.
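
A rough illustration of the scale involved, using the figures quoted above and assuming a typical two-hour feature (the running time is an assumption for the sake of the example):

```python
# How much of a movie can the conscious mind register, frame by frame?
frames_per_second = 24     # standard cinema frame rate (quoted above)
conscious_rate = 12        # images per second the conscious mind can process

running_time_s = 2 * 60 * 60                    # assumed two-hour feature
total_frames = frames_per_second * running_time_s
registered = conscious_rate * running_time_s

print(f"Frames projected:        {total_frames:,}")        # 172,800
print(f"Consciously registered:  {registered:,} at most")  # 86,400
print(f"Below the threshold:     {total_frames - registered:,}")
```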

At this unconscious level of communication the basic unit of currency is the metaphor.

These metaphors can take many forms. In a movie, they could be anything from sounds and shapes to the use of a certain lens or camera, from music to lighting, or from the grading of an image, to the speed of editing.

There are literally millions of decisions that need to be made in terms of casting, performance, camera style, music, sound design, wardrobe, art direction and editing; each of which will ensure a slightly different experience when you sit down to watch the final movie. Taken together, these small decisions ensure that no two versions of a movie will ever be the same.

So, for example, whilst the original ‘Psycho’, made in 1960 for just $800,000, was an immediate box office hit, earning more than $40 million, the 1998 remake, which cost $20 million and copied everything from the script to the camera movements and editing style, turned out to be a box office disaster.

The creators of the movie may not always be consciously aware of the metaphors that they are using, often making decisions simply because they intuitively ‘feel right’ for this character or that scene.

Bennett Miller’s 2005 Capote is a movie that features one such set of subliminal visual metaphors.

Set in 1959, the movie tells the true story of how Truman Capote came to write In Cold Blood. The book was based on the story of Perry Smith, the man responsible for the horrific murder of a family of four in Holcomb, Kansas. As the director explained later in an interview:

‘Capote’s feelings for Perry Smith are really fundamental to the movie. I think that Capote and Smith looked at each other and understood something about each other that nobody else understood. At the core, they were very similar… When Capote was asked, a year after the book was published, what it was about Perry Smith that explained their intense relationship, Capote said, “Perry’s immense loneliness.”’

When Capote meets Perry Smith for the first time he is already behind bars; this is where he remains, for the duration of the movie, until he is finally executed. But Capote is also a prisoner in his own world, and this theme of imprisonment, loneliness and freedom is reflected subliminally by a visual metaphor in which horizontals are used to express freedom and verticals are used to express imprisonment or confinement.

This visual metaphor may not have been a feature of the original script, and is something that few viewers may consciously register, but, because it speaks directly to the unconscious, it is a fundamental part of how the movie makes you feel.

CREATIVITY IN ADVERTISING

What distinguishes advertising creativity from other forms is that it is applied creativity… in other words, it is creativity with a very particular purpose.

Whereas a novel or a movie is ‘simply’ required to provide a return on investment by engaging the largest audience possible, good advertising is required not only to do this, but also to change behavior and attitudes.

The constant drive to maximize efficiencies and reduce costs, however, means that within the advertising business there is constant pressure to produce work at the Low Bandwidth end of the creativity spectrum (where the work tends to be template-based, easy to produce and cheap).

This means that the vast majority of money spent on advertising is wasted because it is frittered away on Low Bandwidth Creativity communications which are unlikely to engage the audience at a subliminal emotional level, let alone change their behavior and attitudes.

The importance of High Bandwidth creativity within the marketing mix was recently demonstrated in a unique study by Les Binet and Peter Field on behalf of the IPA entitled ‘The Long and the Short of It: Balancing Short and Long-Term Marketing Strategies’.

This research drew on a vast amount of historical data in the UK (996 advertising effectiveness case studies, from 700 brands, across 83 sectors, spanning over 30 years) to show that the most effective marketing mix was to be found by combining the establishment of long term emotional brand associations – emotional priming as they call it – with short term rational product and pricing messages. As the study says:

So emotional priming has the benefit of amplifying the effects of activation messages designed to give consumers a reason to buy, at the time of purchase, and, by so doing, boosts short-term behavioural responses. This is the basis of the brand response effect…

In other words, investing in long term High Bandwidth communications will prime an audience to respond positively to more short term, Low Bandwidth rational messages. However, the study warns that the benefit of these High Bandwidth communications is easily overlooked:

Of course, this leads to some potentially highly misleading market research findings, for the unaware. Because, asked why they chose brand A, consumers will be unable to play back the emotional priming that has influenced them over the long term; instead, they will play back the rational activation messages that are more easily accessible to their conscious thought… Market research therefore has a dangerous tendency to underplay the importance of long-term emotional priming and to exaggerate the importance of short-term ‘news’.

APPLE ‘THINK DIFFERENT’: WHAT HAPPENS WHEN A BRAND INVESTS IN HIGH BANDWIDTH CREATIVITY.

Today Apple is the most valuable company in the world, worth over $700 billion, but twenty years ago, when Steve Jobs decided to rejoin the organization that had previously forced him out into the wilderness, the company was on life support and the brand was on its deathbed.

With no guarantees that the company would survive, what Jobs urgently needed to turn things around was the breathing space to develop innovative new products.

He was given this breathing space by an advertising campaign, the result of which was the most remarkable corporate resurrection of our time, or perhaps of any time.

But the process was anything but easy. Jobs called in the guys he had known at Chiat/Day, the agency that had created “1984”, the famous Ridley Scott-directed ad that had introduced the original Macintosh computer.

From the start it was clear that there was a clash of cultures. Steve Jobs represented the world of Technical Creativity, a culture of entrepreneurs and engineers whereas the Chiat/Day guys represented the world of Expressive Creativity a culture of writers, art directors, designers and cinematographers.

Jobs explained that Apple was “hemorrhaging”: the company was in worse shape than he had imagined, and he needed to do some advertising. As Jobs put it: ‘I’m thinking no TV ads, just some print ads in the computer magazines until we get things figured out.’ Rob Siltanen, the creative director from TBWA Chiat/Day, was amazed at Jobs’s arrogance and told him straight:

‘Half the world thinks Apple is going to die. A few print ads in the computer magazines aren’t going to do anything for you… Nobody stands around the water cooler talking about print ads. You need to do something bigger and bolder.’

And ‘bigger and bolder’ was what the agency produced. ‘Think Different’ was the work of a young art director called Craig Tanimoto and creative director Rob Siltanen. It was a High Bandwidth advertising campaign, designed to contrast with the rather uninspiring Low Bandwidth campaign that IBM were running at the time: ‘Think IBM’.

But to call ‘Think Different’ simply an advertising campaign is to do it a massive disservice. Indeed, more than a communications platform, ‘Think Different’ was a creative brand platform that would facilitate both communication and product innovation, both Expressive Creativity and Technical Creativity.

The words that Rob Siltanen wrote for the TV commercial read like a manifesto, a call to arms and a salute to artists and inventors alike, words that are as true to the brand today as they were almost twenty years ago:

Here’s to the crazy ones.
Here’s to the misfits. The rebels. The troublemakers.
Here’s to the ones who see the world differently.
They’re the ones who invent and imagine and create.
They’re the ones who push the human race forward.
While some may see them as the crazy ones, we see genius.
Because the people who are crazy enough to believe they can change the world are the ones who actually do.
Apple. Think different.

Despite having no new products, the ‘Think Different’ advertising campaign gave the Apple brand a much-needed boost. Within 12 months, Apple’s stock price had tripled, and Steve Jobs got the breathing space he required. One year after the ‘Think Different’ launch, Apple introduced the iMac, with the multi-colored models following soon after. The computers represented a revolutionary design philosophy – the physical embodiment of the ‘Think Different’ idea – and went on to become some of the best-selling computers in history… paving the way for iTunes, the iPod, the iPad, the iPhone and all the other product developments that would power Apple to become the $700 billion company that it is today.

A ‘Think Different’ poster featuring Miles Davis.

The extent of Apple’s achievement was demonstrated ten years later in a remarkable paper called ‘The Automatic Effects of Brand Exposure on Motivated Behavior: How Apple Makes You “Think Different”’.

This research found that when people were subliminally exposed to either an IBM or an Apple logo, and then asked to complete a task designed to evaluate how creative they were – by thinking up different uses for a brick – those who had been exposed to the Apple logo consistently behaved more creatively than those who had been shown the IBM logo. As one of the researchers, Gráinne Fitzsimons, put it:

‘this is the first clear evidence that subliminal brand exposures can cause people to act in very specific ways.’

A bit like the chap who maintains that he doesn’t drink Guinness because of the advertising, but purely ‘because it’s good for you’, the reason that so many people thought differently after being subliminally exposed to an Apple logo, was largely due to Apple’s ‘Think Different’ campaign.

CONCLUSION

As we have seen, the global creative economy is estimated to be worth more than US$624 billion, and to be growing at more than 8 per cent every year.

At the heart of this economic miracle, however, is a question that needs to be answered, which is: ‘what is creativity and how does it work?’

There are no easy answers to this question, but I believe that the methodology I have described above goes a long way towards providing an answer that is both scientifically robust and highly practicable.

The separation of Technical and Expressive creativity is useful because it helps us to define and clarify two very separate cultural areas, whilst allowing us to plot them as two coordinates on a matrix, along with what we have called Creative Bandwidth.

As we have seen, when combined with the latest scientific research into the nature of subliminal communication, this structure provides many insights into the creative process that allow us for the first time to begin to create a complete taxonomy of creative ideas.

When this thinking is applied to advertising, in particular, it demonstrates the supreme importance of emotional engagement, and the value of addressing people’s unconscious as well as their conscious minds.

We are currently witnessing the rise of machines running algorithms that can easily outperform the 40-bits-per-second processor that is the conscious human mind. We have yet to see, however, a machine that can replicate the unconscious.

All human beings are capable of creative thought; and, theoretically speaking, it is true that an idea can come from anywhere.

In practice, however, as we have seen, a great idea will only come from an individual who has first spent time considering every aspect of the problem at hand, and then invested yet more time processing it unconsciously.

Only individual human creativity can do this in a way that communicates fully with an audience at the deepest, most emotional level.

It is estimated that the human brain contains around a hundred billion neurons, which means that the possible number of pathways for a signal – or thought – to take at any moment in time is greater than the number of atoms in the known universe.
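
A rough order-of-magnitude sketch in Python shows why this claim is plausible. The synapse count per neuron is a commonly cited rough figure and is an assumption here, as is the atom count:

```python
import math

# Order-of-magnitude sketch; the synapse count per neuron and the atom
# count are commonly cited rough figures, assumed for illustration.
NEURONS = 1e11                # ~ a hundred billion
SYNAPSES_PER_NEURON = 1e4     # rough order of magnitude (assumption)
LOG10_ATOMS_IN_UNIVERSE = 80  # commonly cited order of magnitude

# Counting only which synapses are active or silent gives 2**N distinct
# patterns -- far too large to print, so compare orders of magnitude.
total_synapses = NEURONS * SYNAPSES_PER_NEURON
log10_patterns = total_synapses * math.log10(2)

print(f"log10(possible patterns) ~ {log10_patterns:.2e}")  # ~3.0e14
print(f"log10(atoms in universe) ~ {LOG10_ATOMS_IN_UNIVERSE}")
```

Even under these crude assumptions, the exponent alone of the number of possible patterns dwarfs the number of atoms in the known universe.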

When the conscious and unconscious minds are both fully engaged in imagining new ideas there is, therefore, no limit to what we can think, feel and communicate. Here be dragons indeed.



Categories
Creativity

Marshall McLuhan: Creativity in a world without boundaries

In 1996, Gary Wolf, a writer for Wired Magazine, noticed that someone calling himself Marshall McLuhan was posting comments on the Wired website.

This struck Gary as more than just a little curious, since Marshall McLuhan had been dead for more than fifteen years.

Not one to be put off by such details, and sensing one hell of a story, Gary emailed the deceased media guru and asked him to do an interview.

Marshall McLuhan agreed, and the highly unusual exchange that followed was published in Wired Magazine (you can read it here).

When questioned about the experience, Gary concluded, “If the author was not McLuhan himself, it was a bot programmed with an eerie command of McLuhan’s life and inimitable perspective.”

Now, whether or not you believe in Marshall McLuhan’s ability to conduct interviews from beyond the grave, the fact is, he was never a man to be limited by boundaries.

Unlike most Western knowledge.

For the last five hundred years, since the invention of the printing press, Western culture has divided human knowledge into a number of separate, discrete silos.

So, for example, if you want to find a book in your local library, whether you realise it or not, you will find your way around using something called the Dewey decimal classification system.

This ingenious little system was introduced by Melvil Dewey in 1876, and makes it really easy to find any book in the world by its subject matter. In order to do this, all human knowledge is divided into ten broad areas:

000 – Generalities

100 – Philosophy and Psychology

200 – Religion

300 – Social Sciences

400 – Language

500 – Natural Science and Mathematics

600 – Technology (Including Applied Sciences)

700 – The Arts

800 – Literature and Rhetoric

900 – History and Geography

These ten areas are each divided into ten divisions, and each division into ten sections. The structure is hierarchical, and the numbering follows it, so that each subdivision identifies an increasingly specific subject within a broader subject area. For example:

500 Natural Science and Mathematics

510 Mathematics

516 Geometry

516.3 Analytic geometries

516.37 Metric differential geometries

516.375 Finsler Geometry

The great strength of the Dewey System is that it allows the reader to find any given subject and drill down into it to discover more and more specialized levels within that subject.

But that is also its weakness.

Because whilst you can explore data vertically as much as you want, you cannot explore data horizontally.

So you cannot, for example, move easily from, say, Neuroscience to Advertising. And a book that has the audacity to cover both will be forced to choose one or the other.
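
To make the vertical-only nature of the system concrete, here is a minimal sketch in Python: a hypothetical illustration using the example codes above, not a real library catalogue:

```python
# A minimal sketch of Dewey-style classification as a nested tree,
# using the example codes above. Drilling down (vertical) is easy;
# nothing in the structure links across branches (horizontal).
dewey = {
    "500": {"label": "Natural Science and Mathematics", "children": {
        "510": {"label": "Mathematics", "children": {
            "516": {"label": "Geometry", "children": {
                "516.3": {"label": "Analytic geometries", "children": {}},
            }},
        }},
    }},
    "600": {"label": "Technology (Including Applied Sciences)", "children": {}},
}

def drill_down(tree, path):
    """Follow a list of codes from the root down to a single subject."""
    node = {"children": tree}
    for code in path:
        node = node["children"][code]
    return node["label"]

print(drill_down(dewey, ["500", "510", "516"]))  # -> Geometry
# There is no route from here to, say, Advertising that does not go all
# the way back up to the root: the silos never touch.
```

The only way between two subjects is up to the root and back down again, which is exactly the limitation described above.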

The Dewey System is a perfect model of the way that we in the West have interacted with information since the invention of printing.

And this creates extraordinary limitations on the way that we do our research, the way that we make our scientific discoveries and the way that we educate our children.

But all of this was to change utterly with the advent of the digital age.

Fifty years before anybody ever updated his Facebook page or posted his whereabouts on Twitter, Marshall McLuhan not only predicted the creation of the internet, he also predicted most of its defining characteristics.

And he coined the phrase “surfing” to describe the way we would all navigate our way in a non-linear fashion through the sum total of human knowledge.

A double-page spread from “The Medium is the Massage: An Inventory of Effects” (1967).

In this way, all human knowledge suddenly becomes interconnected in ways that were previously inconceivable.

And there is no better expression of this world without boundaries than the “Mashup”.

Mashups are perhaps the defining characteristic of late 20th and early 21st century culture. Whether it’s music, video, literature or software, a mashup combines material from two or more sources to create something that is simultaneously 100% derivative and 100% original.

Mashups work by linking two separate cultural expressions, and seeing how they inform and influence each other.

For sheer comedy value, the farther apart the two elements are culturally, the funnier the result.

Try this simple thought experiment: think of a subject, any subject; let’s say, for example, Death Metal.

Think of another as far removed from that as possible, say, monks of the Benedictine order.

Now put them together and Hey Presto… you’ve either got the basis of a new TV comedy (Benedictine monks form Death Metal band… with hilarious consequences) or the next novelty music hit (Death Metal meets Gregorian chant).
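
As a toy illustration of that thought experiment, here is a hypothetical sketch that pairs culturally distant elements at random; the lists are invented purely for the example:

```python
import random

# Invented lists, purely for illustration; the point is cultural distance.
subjects = ["Death Metal", "Gregorian chant", "quantum physics", "flower arranging"]
settings = ["a Benedictine monastery", "a Formula 1 paddock", "a Victorian seance"]

def mashup():
    """Pair two culturally distant elements into a single premise."""
    return f"{random.choice(subjects)} meets {random.choice(settings)}"

for _ in range(3):
    print(mashup())
```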


In many ways, mashup techniques have become the default methodology for creativity in the digital age because they work so consistently well.

In fact, no lesser an author than J.K. Rowling used the mashup technique when she took the narrative conventions of the genre commonly known as “Sword and Sorcery” and mashed them up with that most beloved genre of English children’s literature… the boarding school novel (a genre made famous by Anthony Buckeridge’s Jennings and Enid Blyton’s Malory Towers, amongst others). In doing so, she created the most successful children’s books of all time… the Harry Potter novels.

J.K. Rowling created a classic mashup when she took the narrative conventions of “Sword and Sorcery” and mashed them up with that most beloved genre of English children’s literature… the boarding school novel.

But for McLuhan, the instantaneous nature of electronic media had implications far beyond the cultural mashup.

McLuhan had shot to international stardom in the 1960s with radical ideas about the effects of media on human consciousness.

For McLuhan, the introduction of any new medium – whether the invention of alphabetic writing, the printing press, or television – will always affect our central nervous system by becoming an extension of one or more of the five human senses.

Thus the introduction of any new medium has the effect of distorting the way in which we perceive reality.

So every time there is a significant new development in media, there will also be an equally significant impact on human consciousness.

For McLuhan, the most significant effect of electronic media was to dissolve the traditional barriers that segmented knowledge into separate compartments – and, in particular, to eradicate our traditional notions of “Time” and “Space”.

This “allatonceness” – as McLuhan terms it – created by electronic media allows us to connect across the complete range of human knowledge and experience, instantaneously dissolving the divisions between, say, language and mathematics, art and advertising, opera and pop music, distance and proximity, and the living and the dead.

In “The Medium is the Massage: An Inventory of Effects” (1967), a book he co-created with graphic designer Quentin Fiore, Marshall McLuhan describes the dramatic impact that was being brought about by the arrival of electronic media:

“Societies have always been shaped more by the nature of the media by which men communicate than by the content of the communication…

The alphabet and print technology fostered and encouraged a fragmenting process, a process of specialism and of detachment. Electric technology fosters and encourages unification and involvement…

Ours is a brand-new world of allatonceness. ‘Time’ has ceased, ‘space’ has vanished. We now live in a global village…a simultaneous happening.

Electric circuitry profoundly involves men with one another. Information pours upon us, instantaneously and continuously. As soon as information is acquired, it is very rapidly replaced by still newer information.

Our electrically configured world has forced us to move from the habit of data classification to the mode of pattern recognition. We can no longer build serially, block-by-block, step-by-step, because instant communication insures that all factors of the environment and of experience co-exist in a state of active interplay.”

And yet these ideas about the effects of media on human consciousness, although radical, were not entirely original.

McLuhan was particularly indebted to Harold Innis, professor of political economy at the University of Toronto and the author of a number of seminal works on media and communication theory.

McLuhan was indebted to Innis not just for giving the young McLuhan a framework for his ideas about media, but also for giving him permission to ignore cultural boundaries in his search for a greater understanding of the effects of media on culture and consciousness:

“I remember the excitement I felt when I first realized I didn’t have to restrict my studies to literature. Innis taught me that I could roam through all history and all subjects in search of the true meaning of the medium is the message.

My friend… who teaches economics at Toronto University tells me that F. von Hayek says, “Nobody can be a great economist who is only an economist – and I am even tempted to add that the economist who is only an economist is likely to become a nuisance if not a positive danger.”

Likewise, no student of media studies can afford to be only a student of media studies. A man who only reads about TV is as good for a man as a steady diet of coke and chips.”

Ignoring all the usual boundaries between academia and popular culture, contemporary creativity and ancient literature, McLuhan became the most extraordinary synthesizer of the ideas of others.

From the Ukrainian scientist Vladimir Ivanovich Vernadsky and the French Jesuit philosopher Teilhard de Chardin he took the idea of the “Noosphere” as the basis for his notion of “The Global Village”. Walter Ong’s account of what he called the “oral-to-visual” shift – in Ong’s own words, “hammered out with great agony” in his 1958 book “Ramus, Method, and the Decay of Dialogue” – was to greatly influence McLuhan’s landmark study of print culture, “The Gutenberg Galaxy”.

But, without doubt, the greatest influence of all on Marshall McLuhan came neither from philosophers nor from media theorists… but from a little-known revolutionary art movement which had appeared in Britain on the eve of the First World War.

This movement was called “Vorticism”, and although it had little impact on the world at the time, the men behind it – Ezra Pound, Wyndham Lewis, James Joyce and T.S. Eliot – would go on to shake the culture of the English-speaking world to its very foundations.

These “Men of 1914”, as McLuhan was fond of calling them, set out to destroy many of the artificial boundaries that separated high art from low art, and would both define the key characteristics of the Modernist world and help bring it into being.

Through their influence on McLuhan, however, they would also prefigure and define the key characteristics of the digital age (including the invention of the cultural Mashup), and help bring the digital age into being.

Next: Allatonceness: McLuhan and the Men of 1914