Human intelligence remains roughly the same, while the computing capacity of computers doubles every two years. How can we ensure that people can hitch a ride on Moore's Law?
Computers are surpassing humans in more and more areas
We often envy computers, and for good reason. Computers can calculate and remember information much faster and more accurately than we can. Even in tasks at which we were once better than computers, such as face recognition and pattern recognition, we are now massively surpassed. A turning point for many, including this writer, was the crushing of two human Jeopardy! champions by the IBM supercomputer Watson. Jeopardy! is an American game show in which contestants must respond to ambiguously phrased clues. Watson proved to be a master at this, not only in factual knowledge but also in interpreting ambiguous language. The technology behind Watson continues to improve, with the result that Watson, for example, now surpasses doctors in diagnosing lung cancer.
The reason: Watson, and other computers, "take advantage" of Moore's Law. This has the side effect that the cost per unit of computing capacity halves every 24 months, whereas the cost of human thinking has remained almost the same for thousands of years. It doesn't take an IBM Watson to see that before long (Kurzweil thinks the late 2020s) even the smartest person will be surpassed in computing capacity by the average laptop. In short: it might be a good idea to upgrade our 'wetware' with fast hardware. What are the possibilities?
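The arithmetic behind that claim is simple compound growth: one doubling every two years multiplies capacity a thousandfold in twenty years. A minimal sketch (the starting year and the relative capacity of 1 are illustrative assumptions, not figures from Kurzweil):

```python
# Illustrative only: project exponential growth under Moore's Law.
def capacity(start, years, doubling_period=2):
    """Relative capacity after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# A hypothetical laptop with relative capacity 1 in 2015:
for year in (2015, 2025, 2035):
    print(year, capacity(1, year - 2015))
# 2015 1.0
# 2025 32.0
# 2035 1024.0
```

Twenty years of doublings yields a factor of 1024; human "wetware" stays at 1 the whole time.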
One idea is to somehow tie our brains to a chip. This concept is called a brain-computer interface (BCI). The problem is that the switching cells in our brain, the neurons, work completely differently from the transistors on a chip. Neurons work with vibrations and frequencies; transistors are on or off (1 or 0). So there must be an interface between chip and cell that translates the chip's signals into signals the neurons understand. Reading or passing on simple signals already works: it has been possible to give blind people some sight this way. Unfortunately, the researcher behind that work, William H. Dobelle, documented his methods insufficiently, so much of the progress was lost by the time of his death in 2006. It was deeply sad that his patients slowly became blind again: because it was unclear how the implants functioned, infections forced doctors to remove them. In the Netherlands, a great deal of brain-computer interface research is carried out at the Donders Institute, affiliated with the Nijmegen academic hospital; in Belgium, at the universities of Ghent and Liège.
Neuroprostheses replace parts of the nervous system with (usually) electronics. Since the 1990s, neuroprostheses have been able to read signals from the brain via neurons and convert them into mechanical movements, or even read images from the eyes. Biological systems, and certainly the networks of nerve cells that make up our brain and nervous system, are each unique. This makes developing neuroprostheses that do more than transmit simple on/off signals very complicated. Fortunately, researchers have already succeeded in developing cochlear implants, with which 400,000 deaf people can hear again. Artificial limbs that respond to nerve impulses are also becoming increasingly common.
Unfortunately, it is not yet possible to implant a chip that helps you with, say, arithmetic or consulting the internet. In 2015, we still understand too little about the precise functioning of the brain for that.
Not everyone likes the idea of walking around with wires and electronics in their skull. Systems that read brainwaves from outside the head, for example to control a robot arm, are much more user-friendly. These systems exist too, although it takes a lot of effort to get a computer system to learn the biological wiring of the wearer's brain. In 2014, scientists managed to recognize individual words by analyzing brainwave patterns. This also requires a lot of training, so don't worry (for now) that spies can read your mind from a distance. In that respect, you'd do better to worry about software that recognizes body language.
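The training mentioned above is essentially pattern classification: features extracted from the wearer's brainwaves are matched against examples recorded earlier from that same person. A minimal sketch using a nearest-centroid classifier on made-up feature vectors (the two "words" and all numbers are invented for illustration; real EEG word recognition uses far richer features and per-user calibration):

```python
# Toy nearest-centroid classifier on invented "brainwave" feature vectors.
def centroid(samples):
    """Mean feature vector of a list of equal-length vectors."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def classify(features, centroids):
    """Return the label whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Invented training data: two "words", two feature values per recording.
training = {
    "yes": [[0.9, 0.1], [0.8, 0.2]],
    "no":  [[0.1, 0.9], [0.2, 0.8]],
}
centroids = {word: centroid(samples) for word, samples in training.items()}

print(classify([0.85, 0.15], centroids))  # yes
```

The need for per-wearer training data is exactly why a spy cannot read your mind from a distance: without calibrated examples from your brain, the classifier has nothing to match against.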