President Obama learned of the major events in this week’s unfolding revolution in Egypt not from his coterie of highly paid advisors, but from cable news channels.
Earlier in the week, CIA director Leon Panetta said that intelligence reports backed his predictions about the outcome of the revolution. Close aides later admitted that he, too, was relying on media reports.
All the while, media relied heavily on social networking services – especially at the start of the revolution, before their news crews descended on Tahrir Square.
In an age when information is transmitted instantly via a global stream of consciousness, some people in high places are starting to ask: is there still a central role for field agents?
And how can experts at home be expected to produce quality analysis of unfolding situations when data pours in faster than it can be collated?
On the macro level, all this provides, in the minds of some, another excuse to say, ‘move over, the machines are coming.’ Communications technologies are no longer simply reporting world events, they are shaping them.
Wasn’t the Egyptian Revolution, they ask, driven by Twitter and Facebook? Isn’t it true that, in the era of increasingly sophisticated Artificial Intelligence programmes, only machines can be expected to collate such huge bodies of data? Aren’t machines our best hope of receiving fast analyses that present our leaders with the best options for response?
Technologists have long recognized that machines will become less and less peripheral to global events, as they already have in our individual lives. We may rely on our iPhones and iPads now, but this is nothing compared to what lies ahead as people become ever more reliant on intelligent (or pseudo-intelligent) machines.
Many technologists now look forward, with almost messianic zeal, to a moment in history they’ve christened the Singularity.
Singularity refers to the moment when machines reach such a level of sophistication that human beings can no longer understand or predict their true capabilities. It signifies, its enthusiasts say, a time when machines and humans will compete on a truly level playing field.
Carbon life-forms will no longer have the natural advantage over their silicon counterparts.
Researchers and developers at the leading edge of technology no longer see Singularity as a mere possibility – they look forward to it as a dead-set certainty.
Ray Kurzweil, a leading American authority on AI, expects that science will have reverse-engineered the human brain by the mid-2020s. Computers, he says, will match human intelligence by 2030. Kurzweil also predicts that Singularity will be achieved by 2045 – within the lifetime of most of the people who will read this article.
Worldwide, a Singularity movement has emerged, promoting a machine-human interface in which bio-chips and the like augment human capabilities. Techno-evangelists like Kurzweil speak of implanting computers into human brains, merging human beings with machine components.
Is all this just science fiction? Perhaps not. This week, a talking supercomputer with the rather innocuous name of Watson will take on two human champions on the US quiz show Jeopardy.
Watson, built by IBM and named after the company’s founder, Thomas Watson, has been programmed with more than 200 million pages of information and can review the entire collection within three seconds.
Even more impressive is the fact that it can simultaneously calculate the probability of giving a wrong answer. It ‘knows’ when to hold back and when to take a risk, in a way previously thought to be the sole preserve of human beings.
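That hold-or-risk behaviour is, at heart, a confidence calculation. As a toy illustration only – an assumption-laden sketch, not IBM’s actual method, with every name and number invented for the example – an agent might buzz in only when the expected value of answering is positive:

```python
# Toy sketch of confidence-thresholded answering (not IBM's actual method):
# buzz in only when the expected value of attempting an answer is positive.

def should_buzz(confidence, reward=1.0, penalty=1.0, threshold=0.0):
    """Decide whether to risk an answer.

    confidence: estimated probability the candidate answer is correct.
    reward / penalty: points gained for a right answer, lost for a wrong one.
    """
    expected_value = confidence * reward - (1 - confidence) * penalty
    return expected_value > threshold

print(should_buzz(0.8))  # confident enough to risk an answer: True
print(should_buzz(0.3))  # holds back: False
```

With equal reward and penalty, the break-even confidence is 50 percent; raising the penalty relative to the reward pushes that threshold higher – one simple way of ‘knowing’ when to hold back.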
Just a few years ago, Sony struggled to make a robot walk. Now, it is investing in robot anthropology, in the expectation that robots will play a central role in the emerging brotherhood of man and machine.
A team of Sony computer experts in Paris is working closely with British anthropologists to develop robots that can teach each other new words. The project seeks to emulate the human practice of learning names for objects and then passing them on to others.
‘Conversations’ will be recorded, so that scientists can study how robots might dialogue in the near future. In time they may even form their own communities and urban enclaves (‘roburbs’, perhaps).
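The word-teaching idea can be pictured with a minimal ‘naming game’ simulation – a deliberately simplified sketch of such language-game experiments, not the Sony project itself; every name and parameter below is invented for illustration:

```python
import random

random.seed(1)  # fixed seed so runs are repeatable

def naming_game(num_agents=5, rounds=200):
    """Agents repeatedly pair up; the hearer adopts the speaker's name
    for an object, so names spread and the group typically converges."""
    obj = "ball"
    vocab = [{} for _ in range(num_agents)]  # each agent's object -> name map
    for _ in range(rounds):
        speaker, hearer = random.sample(range(num_agents), 2)
        if obj not in vocab[speaker]:
            # A speaker with no word for the object invents one.
            vocab[speaker][obj] = f"word-{speaker}"
        # The hearer adopts the speaker's name for the object.
        vocab[hearer][obj] = vocab[speaker][obj]
    return {agent.get(obj) for agent in vocab}

names = naming_game()
print(names)  # after many rounds, usually a single shared name survives
```

The design choice is the interesting part: no robot is ‘taught’ a dictionary; a shared vocabulary emerges purely from repeated one-to-one exchanges.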
Already, leading Japanese care homes for the elderly are choosing Paro therapeutic robots – designed to respond to people suffering from loneliness and memory loss – to replace human nurses.
Yes, machines will continue to play a greater role in our affairs. Yet for all its sophistication as a logic and probability machine, will a machine like Watson ever be able to appreciate the significance of something like Valentine’s Day?
Is the future of humanity really about Clouds you can’t see and chips you can’t eat? Or is it, as it has always been, primarily about human feelings and ideas?
The growing role that communications technologies play in shaping world events, and the speed of research into human-machine hybrids, both raise important questions.
As we invite technology more and more into our bodies, where will we draw the line between what is human and what is machine? Is the distinction even important any more?
Will new genetic techniques result in the patenting, or at least selective sharing, of DNA patterns?
Many ethicists argue that a patent should be awarded only for the development of a totally new product, not simply for the discovery of a natural process. Some researchers counter that all new techniques involve using existing processes, even natural ones, in new ways – so what is to prevent someone from using DNA in a new way and protecting that procedure?
The motive in asking these questions is not winding back the clock. It is preventing progress from becoming progressivism, where achieving pragmatic results in isolated, individual processes is the only measure of success.
Perhaps the question that most demands an answer is this: will having machines to empower us make us any more moral in our actions?
IBM has high hopes for Watson. With access to every science journal ever written, it says, doctors in the Third World might diagnose disease much more quickly. The question remains, though, whether First World drug companies would be any more forthcoming when it comes to sharing their products with those who cannot pay.
At the end of the day, all the talk of Singularity, machine intelligence and the like fails to take into account one important fact. The future of humanity is not a product of technology alone.
Richard Watson, author and futurist, rightly says that ‘technology is not destiny’. Our future, both individually and collectively, is primarily a product of our responses to technology and events, which are driven largely by emotion.
Yes, social networking made the Egyptian Revolution possible. It amplified the sense people had of injustices done by their government, by revealing just how many other people felt the same way.
The sense that their feelings were shared by millions of others gave the downtrodden Egyptians – 40 percent of whom live below the poverty line – confidence to stare down an entrenched regime.
This revolution, like the one in Tunisia before it, was still an essentially human event. The masses gathered in Tahrir Square were in effect a magnified example of ‘flash mobbing’, with technology making it possible for people to come together both emotionally and on a physical plane.
Mubarak and his cronies did not resign in response to a technological threat. They reacted to an unstoppable tsunami of human emotion. It was not the technology that changed Egypt; it was her people, who were driven by powerful feelings and aspirations.
Technology works best when it facilitates human hope, activism, engagement and intervention. It is most valuable when it acts as a conduit for people’s thoughts, ideas and aspirations and provides a platform for interaction.
For ideas to connect, people must connect. Digital technology is brilliant at helping us share ourselves and our ideas.
To look forward to the day when Singularity overcomes or negates human emotion is to entertain a false hope. Even if a computer like Watson could properly appreciate Valentine’s Day, it would still serve best as a facilitator for relationships of the truly human kind.