It is a function of our narcissism that we always imagine the technologies we create will remake our humanity in significant ways.
Of course, technological innovations have shaped our history. In the 15th century, a goldsmith called Johannes Gutenberg invented the printing press, and the effect was revolutionary. By the end of the century there were more than 13 million books in circulation, a figure that rose to more than 138 million between 1551 and 1600. European cities that harnessed this new technology grew 60% faster than those that did not, and the printing press accounted for as much as 80% of that growth.
Today, we are in the midst of a new information revolution. What makes our digital age distinct is both the speed with which information can be transmitted and the extraordinary pace at which one iteration of our digital world yields to another. As Walter Isaacson, the biographer of Steve Jobs, explained: “History is about change—why it happens, who causes it, and who resists it. I want to connect history to the digital revolution we’re going through so people better understand the forces of change.”
Defining, let alone understanding, those “forces of change” is an immense cognitive undertaking for which our brains are ill-equipped. 2023 marks the 40th anniversary of the Internet as we know it, the year ARPANET computers switched to the TCP/IP protocol. Two years later the first virtual communities were born, and in 1990 Tim Berners-Lee created the World Wide Web and gifted it to humanity so that we might all be connected as a global community. In 1998 Google was founded and Bill Clinton’s relationship with Monica Lewinsky became the first news story to break online. Between 2004 and 2007, Facebook, YouTube, Twitter and the iPhone were all launched, and today the metaverse, Web3 and decentralised blockchain technologies promise to drive far-reaching, incalculable changes in the way we interact.
If we are to rise to Isaacson’s challenge and harness these “forces of change”, we can only do so by understanding the one thing that remains constant: human psychology. 2023 may see the inexorable advance of new digital worlds, but the brains that have to navigate them are about 50,000 years old and hardwired for irrational, biased decision-making.
Let’s take a look at how these brains work in a commercial setting. The first observation we can make is that they don’t work well with data. It may have been Josef Stalin or the German satirist Kurt Tucholsky who said ‘one death is a tragedy, a million deaths is a statistic’, but the observation is accurate: we work better with small numbers than with large ones, which is why the story of an individual life is more likely to move our emotions than a statistic ever will.
However, the situation gets more complicated when we observe (and the psychological literature is conclusive on this point) that we are drawn to variety and complexity even as they overwhelm our ability to think and inhibit our capacity to act. A UK software company recently told the team at Cognition how they won a large corporate account in the pharmaceutical sector. The corporate procurement team had listed more than 40 software providers and given them a list of over 150 items the software needed to deliver. After numerous rounds of meetings and with a decision still not made, the pharmaceutical company found the UK software company, met them on Zoom and hired them after two meetings, despite the fact that the company was not part of the original pitch and could not deliver all of the 150+ items deemed necessary.
This happened because the pitch process was flawed. Evaluating 40+ companies against an impossibly long list of requirements overwhelmed the procurement team’s decision-making apparatus. Paralysed by the complexity their approach had generated, the team chose a provider that stood in a pitch list of one (i.e. because they were not part of the original pitch list, from a cognitive perspective they stood apart from it), which made a decision possible. For all the apparent rationality of the process, the decision was ultimately driven by cognitive simplicity and personal chemistry.
We can conclude, as psychologists have for decades, that our decision-making is irrational. This was demonstrated by the split-brain experiments (in which the corpus callosum connecting the two hemispheres of the brain is severed, allowing each hemisphere to think independently of the other) conducted by the neuroscientists Michael Gazzaniga and Roger Sperry, about which the cognitive scientist Steven Pinker concluded: ‘The conscious mind - the self or soul - is a spin doctor, not the commander in chief’. The psychologist Jonathan Haidt came to the same conclusion, writing that our rational mind ‘thinks of itself as the Oval Office when actually it’s the press office’.
For marketing communications and consumer decision-making, these irrational decision strategies mean we should use data judiciously to understand markets and personas, then storify that data to make it commercially effective. The observation applies as much to the metaverse and blockchain as it did to the printing press: rapidly evolving technologies have to be reconciled with an (almost) ossified cognitive architecture.
As virtual worlds envelop the ‘real’ one and ‘real’ people are replaced by digital avatars, as our high streets turn bricks and mortar into lines of code, human psychology remains constant. Soon every agency will be working with Web3, the metaverse and blockchain. Those that succeed in delivering value to their clients will remember that getting the best out of new technologies depends on our capacity to leverage the irrational decision strategies of the human beings who bring their digital currency to spend on NFTs in our virtual stores.
To find out more about how our Scientific Board and expertise in the integration of psychology, neuroscience and marketing can help any business anywhere in the world improve brand performance, contact us today.