If you could upgrade your brain, what would you change?
In my forthcoming novel, an alien virus infects humanity, with an effect rather like uploading a new software version to our brains. Memory and personality are retained – we are still ourselves – but some of our capabilities are upgraded. The virus is benign, but what should it improve to optimise us?
I would not rush for higher IQ, which is a double-edged sword. Even when not socially dysfunctional, people with Mensa-level IQs walk a tightrope between being thought arrogant and wearily helping others catch up. Besides, there was a good reason for humans to evolve with varied attributes. Any upgrade must preserve that variety, or lose a key advantage of the human tribe.
There are of course many types (and definitions) of intelligence. Which would you boost in yourself? Empathy and social intelligence? Sporting intelligence and the rewards it brings? Artistic intelligence and the resulting creativity?
Or would it be intelligence at all? Perhaps you’d prefer to improve some other aspect of your mental capability, like memory (but would you actually want total recall?). Would you do a Solomon and choose wisdom? Even my clever aliens would find it tough to deliver that. How about greater sensory perception? In his book Life 3.0, Max Tegmark suggests that AI-enhanced brains could vastly broaden our experiences by processing data from sensors covering, for example, more of the electromagnetic spectrum.
In fact, the electronic upgrading of our brains has already begun. Elon Musk has made characteristically bold pronouncements on using electronic implants not only to help repair brain damage but also to communicate wirelessly without speech. This raises the prospect of universal human interaction, with inbuilt language translation programs.
Any development of telepathy raises profound questions about how far we want to go. I am too individualistic to willingly merge into a hive mind. But a cloud mind, perhaps… provided we can choose what to upload. My best thoughts I want to share with the world; my worst I want to delete unseen.
Yet my alien upgrades target something different. By enhancing our capacity for objective self-criticism, they improve our competence and give us a higher level of self-awareness. We need this to combat the Dunning-Kruger effect, whereby people with low ability at a task greatly overestimate how well they are doing.
In contrast, competent people are stern critics of their own work. That’s such a key quality for a writer that I’m probably programmed to pick it out. Even more than criticism from others, it enables writers to improve, and it underpins good and bad performance in everything from driving a car to being president.
So my choice for an upgrade is self-awareness. It’s an ability that cuts across all others, preserving the variety of the human tribe in all other respects and boosting performance in most of them.
That an alien virus might do this for us is wishful thinking, but AI may get there before we do. The ultimate choice we face may be about upgrading ourselves into a hybrid form before autonomous AI outstrips us.
If we go hybrid, how do we avoid a split between those who want to upgrade and those who don’t? In my novel, that leads to civil war. Obviously it’s analogous to what is happening right now, in the growing war between those who revel in their ignorance and those who recognise their faults and wish to improve. In the book, The New Enlightenment wins. In reality, it hangs in the balance.