Although the idea of creating artificial beings that do things reserved only for humans has always been with us, it was not until the 20th century that it could begin to come true. A series of events in the 19th and 20th centuries converged so that, between the end of the 1940s and the beginning of the 1970s, there occurred what we could call the first golden age of Artificial Intelligence.
Of course, calling it the "first" is a bold claim on my part. Although historians agree that it was a golden age, it is still not clear whether what we are living through now is the second.
The first golden age
For the spectacular boom in Artificial Intelligence research to take place after the end of the war, a series of conditions had to be met. As we saw in the previous article, since the previous century there had existed a methodology that allowed human reasoning to be expressed in the form of symbols and operated on.
The work of Turing, Shannon, and von Neumann, for its part, allowed the appearance of computers capable of executing complex instructions. Only the software was missing.
At first, researchers tried to build hardware that simulated the structure of the brain, with electrical components playing the role of neurons. Mathematical analysis was used to understand the behavior of these components.
Two researchers, McCulloch and Pitts, published a paper in 1943 explaining how mechanisms could be built to emulate the behavior of the brain. Today we know that part of their approach was wrong: they believed that individual neurons made decisions based on the information obtained by the senses and generated responses (in reality, millions of interacting neurons are needed). Even so, they presented a fairly accurate mathematical analysis of how information is transmitted between neurons.
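The core of the McCulloch–Pitts model can be sketched in a few lines: a unit "fires" when the weighted sum of its binary inputs reaches a threshold. The weights and thresholds below are illustrative choices, not values taken from the 1943 paper.

```python
# A minimal sketch of a McCulloch-Pitts threshold unit. The specific
# weights and thresholds are illustrative assumptions for this example.
def mp_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of binary inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With suitable weights and thresholds, such units compute logic functions:
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)

print(AND(1, 1), AND(1, 0))  # 1 0
print(OR(0, 1), OR(0, 0))    # 1 0
```

Networks of such units can, in principle, compute any logic function, which is what made the model attractive as a bridge between neurons and computation.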
The next great contribution came in 1949 from Donald O. Hebb, a physiologist at a Canadian university. Hebb proposed the idea that neural connections are not immutable: every time we learn something new, the neural structure changes to fix that knowledge. The explanation is that when a neuron repeatedly causes the activation of another, the conductivity of the connection between them increases, which makes further activations more likely and establishes new routes of neuronal connections.
The switch to the second approach (instead of imitating the configuration of the brain, simulating the mechanism that allows it to arrive at a result) came from the hand of a man named Marvin Minsky.
Minsky was a true Renaissance man, interested in disciplines ranging from zoology and physics to psychology and mathematics. With two other researchers, he built a neural network that simulated the way a rat learns to get out of a maze.
He soon realized that although "the rat" learned from its mistakes, it was unable to use that knowledge to avoid making new ones. Hence his PhD thesis in mathematics was on how to build more complex neural networks capable of planning ahead.
The paradigm shift came in 1955, when Minsky met Ray Solomonoff, who was working on a theory of inductive inference. From their conversation, Minsky began to think that he was on the wrong path: instead of building hardware that mimicked the makeup of the brain, it was better to try to figure out how the brain works and translate that into symbols and relationships that could be processed by any computer.
In 1956, ten researchers (almost everyone working on the subject) met at a two-month workshop. In addition to Minsky, the aforementioned Ray Solomonoff was present.
Solomonoff, contrary to the majority opinion of his colleagues, argued that the study of computers' ability to solve problems should start with the least complicated ones, since this would simplify the analysis of the mental processes involved.
Over time it proved to be good advice for another reason: the tasks that our brain does automatically, such as recognizing a face or driving a vehicle, are very difficult to reproduce in the form of a computer program.