Let’s face it, with a gloriously daft plot that has Raquel Welch running around in a fur bikini and cave men doing battle with dinosaurs, the movie “One Million Years B.C.” was never going to win any prizes for historical accuracy.
However, the fact that it wasn’t completely laughed out of the cinema when it was released in 1966 illustrates a fascinating cultural bias…
That popular culture tends to see history as little more than an extended costume drama.
We tend to see historical figures as being essentially the same as us… in both mind and body. Admittedly, we recognize that these fellows may be a bit behind with technology, and that they may have some curious superstitions and beliefs, not to mention some dubious personal hygiene habits. But, given a good bath, a few lessons in English, and of course, a change of clothes, we imagine that most historical characters could be introduced to contemporary society with ease.
This is largely because, when we have given it any thought at all, we have always tended to see the adult human brain as something fixed and unchangeable.
And that if changes do take place in the structure of the human brain, they tend to do so, through natural selection, over the course of many generations and many millennia.
Not everyone thinks this way, however. Back in 1976, Julian Jaynes, a professor of psychology at Princeton, published a revolutionary new approach to the history of the mind in The Origin of Consciousness in the Breakdown of the Bicameral Mind.
In this extraordinary book, Jaynes argued that consciousness was, in fact, only a fairly recent development in human evolution, emerging as recently as the end of the second millennium BCE, a mere 3,000 years ago.
Before this time, Jaynes argues, people experienced the world rather like schizophrenics who experience “command hallucinations”, or “voices” that tell them what to do. (In fact, according to Jaynes, schizophrenia is simply a vestige of humanity’s earlier pre-conscious state.)
Jaynes called this cognitive state “bicameralism”, reasoning that the instructions or “voices” came from the right brain counterparts of the left brain language centres—specifically, Wernicke’s area and Broca’s area. These regions are somewhat dormant in the right brains of most modern humans, but occasionally auditory hallucinations have been seen to correspond to increased activity in these areas.
Using the earliest writings as evidence for his theory, Jaynes showed that the characters described in the Iliad do not exhibit the kind of self-awareness we normally associate with consciousness. Rather, these bicameral-minded individuals are guided by mental commands issued by external “gods”.
And whilst no mention is made of any kind of introspection in the Iliad or the older books of the Old Testament, the corresponding works written after this breakdown period, like the later books of the Old Testament or the later Homeric work, the Odyssey, show signs of a very different kind of mentality… an early form of consciousness.
According to Jaynes, human consciousness first emerged as a neurological adaptation to developing social complexity, as the bicameral mind began to break down during the social chaos of “The Bronze Age Collapse” in the second millennium BCE.
Historians are unclear as to the cause of “The Bronze Age Collapse”, but between 1206 and 1150 BCE the cultures of the Mycenaean kingdoms, the Hittite Empire, and the New Kingdom of Egypt collapsed, and almost every city between Pylos and Gaza was violently destroyed and largely abandoned.
This was a time of large-scale economic collapse and mass migrations across the region (Christian and Hebrew scholars associate Moses and the story of Exodus with this time), creating social stresses that required societies to intermingle, forcing people to learn new languages and customs and, generally, to become more flexible and creative.
Jaynes argues that self-awareness, or human consciousness, was the culturally evolved solution to this problem. Jaynes further suggests that the concepts of prophecy and prayer arose during this breakdown period, as people attempted to summon instructions from the “gods” and the Biblical prophets bemoaned the fact that they no longer heard directly from their one, true god.
According to Jaynes, relics of the bicameral mind can still be seen today in cultural phenomena like religion, schizophrenia and the persistent need amongst human beings for external authority in decision-making.
Jaynes’s big idea is really quite breathtaking in its boldness. And although many of his other, related ideas have since been validated by modern brain imaging techniques, Jaynes’s bicameral hypothesis remains highly controversial.
At the time of publication, one of Jaynes’s more enthusiastic supporters was Marshall McLuhan, who was undoubtedly drawn to Jaynes’s ideas about the origins of consciousness and the light they shed on the origins of language and writing. (It is entirely possible that the Bronze Age Collapse was the result of a collapse in early media, and the breakdown of the Bronze Age mind, rather than the cause of it.)
Marshall McLuhan is, of course, famous for his thinking on the impact of media on consciousness.
McLuhan and onetime collaborator Walter Ong were both fascinated by how the shift from an oral-based stage of consciousness to one dominated by writing and print changed the way humans think.
McLuhan went on to expand on these ideas in The Gutenberg Galaxy, particularly the significance of the invention of movable type and printing in terms of its impact on human consciousness.
It has been said that the digital age we are now entering is greater than any of these previous information revolutions, in that its impact may be no less than the equivalent of the invention of writing and the invention of the printing press… all at the same time.
However, until recently we have lacked proof that even changes of this magnitude can have an immediate effect on human consciousness.
But over the last few years, new research using MRI scanners has revealed that the adult human brain can, in fact, be significantly transformed by experience.
“Neuroplasticity” is the term used to describe this newfound ability of the brain to transform itself structurally and functionally as a result of its environment. Significantly, the most widely recognized forms of neuroplasticity are related to learning and memory.
If you live in central London you will be very familiar with the sight of guys, looking a bit lost, as they ride around on mopeds with a clipboard attached to the handlebars.
These are would-be London taxi drivers doing what is known locally as “The Knowledge”.
In order to drive a traditional black London cab, these drivers must first pass a rigorous exam which requires a thorough knowledge of every street within a six-mile radius of Charing Cross. It usually takes around three years to do “The Knowledge” and memorise this vast labyrinth of streets in central London, and on average only a quarter of those who start the course manage to complete it. However, it now appears that those who manage to attain “The Knowledge” find themselves not only with a new job, but also with a modified brain.
A study published in 2000 by a research team at the Department of Imaging Neuroscience at University College London demonstrated that London taxi drivers who do “The Knowledge” develop a larger hippocampus, the part of the brain associated with memory and navigation, than they had previously.
As Dr Eleanor Maguire, who led the research team put it, “There seems to be a definite relationship between the navigating they do as a taxi driver and the brain changes. The hippocampus has changed its structure to accommodate their huge amount of navigating experience. This is very interesting because we now see there can be structural changes in healthy human brains.”
Over the past decade, other researchers using MRI scanners have observed similar structural and functional changes to the brain. For example, one study (“Temporal and Spatial Dynamics of Brain Structure Changes during Extensive Learning”, Draganski et al., 2006) shows how medical students undergo significant changes to their brains whilst studying for exams.
If this type of transformation can be observed in the adult brain, imagine what changes might be occurring in the minds of children growing up in the midst of the digital revolution.
In 2001, the American educationalist Marc Prensky drew everyone’s attention to the impact this was beginning to have on the way our brains are wired, when he published an article called “Digital Natives, Digital Immigrants”.
As Prensky points out: “Our students have changed radically. Today’s students are no longer the people our educational system was designed to teach. Today’s students have not just changed incrementally from those of the past, nor simply changed their slang, clothes, body adornments, or styles, as has happened between generations previously.
A really big discontinuity has taken place. One might even call it a “singularity” – an event which changes things so fundamentally that there is absolutely no going back. This so-called “singularity” is the arrival and rapid dissemination of digital technology in the last decades of the 20th century. It is now clear that as a result of this ubiquitous environment and the sheer volume of their interaction with it, today’s students think and process information fundamentally differently from their predecessors.
These differences go far further and deeper than most educators suspect or realize… it is very likely that our students’ brains have physically changed – and are different from ours – as a result of how they grew up….
Digital Natives are used to receiving information really fast. They like to parallel process and multi-task. They prefer their graphics before their text rather than the opposite. They prefer random access (like hypertext). They function best when networked. They thrive on instant gratification and frequent rewards. They prefer games to “serious” work.”
These points that Prensky makes about the changing nature of consciousness are further developed in “Communities Dominate Brands”, published in 2005 by Tomi Ahonen and Alan Moore. Here, Ahonen and Moore identify two key drivers behind these changes:
Firstly, they identify the act of gaming as the key factor in the rewiring of the Digital Native’s neural circuitry. Since the mid-1990s, children have been playing games on electronic devices, rather than, like the previous generation, simply being passive consumers of broadcast media. And it is the interactive nature of electronic gaming, according to Ahonen and Moore, that is responsible for changing the brains of this generation structurally and functionally, and creating a constant appetite for social interactivity.
Secondly, Ahonen and Moore emphasise the difference between the world of the PC and the fixed internet and the world of the mobile device and the mobile internet. The former is a world where you “log on” and “log off”; the latter a world where you are constantly connected. They characterize the former as “The Networked Age” (1992-2004) and the latter as “The Connected Age” (2002-2014?). And where the key characteristic of the “Networked Age” is “Acquiring”, the key characteristic of the “Connected Age” is “Sharing”.
It is this constant connection, and the desire to share and interact with other engaged minds, that Ahonen and Moore predicted would create new “elective”, dynamic communities, and that these new social groupings would be the engine of massive social change. The subsequent rise of social media and its impact on major social events like “The Arab Spring” are testament to the validity of Ahonen and Moore’s predictions.
However, it is probably the world of gaming that has the most to teach us about the changes taking place in the wiring of the brains of our “Digital Natives”. And we will explore this in a little more detail in part two.