Saturday, June 19, 2021

[Yoon Seok-man's Human Revolution] "A hundred years from now, robots will dominate humanity"


The classic American SF drama "Battlestar Galactica" is about people who survived an attack by the robot army known as the "Cylons." The main characters, who settle on a primitive planet in an aging spacecraft after fleeing the Cylons, assimilate with the native inhabitants and begin a new life. [Photo: SyFy Channel]

The classic SF drama "Battlestar Galactica" tells the story of the "Twelve Colonies," a group of twelve planets inhabited by humans, billions of light-years from our solar system. There, humanity enjoys a civilization built on advanced science and technology, while all hard and unpleasant labor is left to artificially intelligent (AI) robots called "Cylons."

However, the Cylons, who became self-aware as the technology advanced, slipped beyond human control and started a war. All of the colonies fell under Cylon control; only the Battlestar Galactica fleet, which had just returned from a final exploration mission, managed to escape. The main narrative of the series is the story of a humanity that survives by fleeing the Cylons' relentless pursuit.

The drama ends when they reach a primitive planet. Its native inhabitants did not even have a proper language. So that the tragedy would never repeat itself, the Battlestar Galactica crew seal away their advanced science and technology. Instead, they assimilate with the natives and live according to the providence of nature.

And another 150,000 years pass. In the meantime, the civilization of the natives, now intermingled with humanity, has developed to a very high level, and they too create artificial intelligence, echoing the past. That place is the Earth of today. The drama closes on the irony of human beings who fled the Cylons trying to re-create a similar artificial intelligence.

There are many science-fiction works that depict a dystopia dominated by humanoid robots. "The Matrix," in which people trapped in a virtual reality created by artificial intelligence cannot live autonomous lives, and "Terminator," in which the robotic soldiers of the supercomputer Skynet rule over people, both portray a persecuted humanity. Most of these works tell the same story: robots created to serve humanity one day wage war and conquer their makers.

In fact, Dr. Stephen Hawking said, "Within a hundred years, robots will dominate people." "The creation of artificial intelligence is the greatest event in human history, but unfortunately it will also be the end of humanity," he warned at the Zeitgeist conference in London in 2015.

The destructive instinct inherent in humans

The robot's evolution process

The evolution of the "Cylon" robots. As science and technology advance, their intelligence and appearance become ever more human-like.

The reason robots conquer people in these stories is probably that our own history is stained with violence and war. Sapiens drove the Neanderthals to extinction in continental Europe 35,000 years ago. From the World Wars of the last century to the endless civil wars and terrorism of today, people have shown an instinct for destruction.

Jared Diamond, author of "Guns, Germs, and Steel," writes in his earlier book "The Third Chimpanzee" that 98.4 percent of human and chimpanzee DNA is identical; only 1.6 percent differs. Thus, although humans diverged from chimpanzees seven million years ago, they still retain the destructive nature of animals. The problem is that this violence is embedded in artificial intelligence as well.

The essence of artificial intelligence is the algorithm. An algorithm presents the most effective path to a solution in a given problem situation, just as Facebook recommends articles that match your tastes and Netflix shows you a list of movies you might like. But there is a big blind spot here. Because content is recommended based on a user's existing patterns, our existing ideas and tastes are only reinforced. This is called "confirmation bias."
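The feedback loop described above can be sketched in a few lines of code. This is a deliberately minimal toy, not any real platform's recommender; the category names and catalog are invented for illustration. It shows how ranking purely by a user's past clicks keeps serving more of the same.

```python
from collections import Counter

# A toy catalog of content, keyed by category (hypothetical names).
CATALOG = {
    "politics-left": ["L1", "L2", "L3"],
    "politics-right": ["R1", "R2", "R3"],
    "science": ["S1", "S2", "S3"],
}

def recommend(history, k=2):
    """Recommend k items from whichever category the user clicked most.

    Because the ranking looks only at past behavior, the most-clicked
    category wins every time: the user's existing taste is reinforced
    and other viewpoints are never surfaced.
    """
    counts = Counter(history)
    top_category, _ = counts.most_common(1)[0]
    return CATALOG[top_category][:k]

# A user who clicked one category twice is shown only more of it:
history = ["politics-left", "science", "politics-left"]
print(recommend(history))  # ['L1', 'L2']
```

Real systems are vastly more sophisticated, but the structural point is the same: optimizing for "what this user already engaged with" narrows what the user sees.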

"Confirmation-driven habits erode the subjectivity and perception of individuals and, in the long run, pull them away from the universal," said Kim Kyung-baek, a professor of social sciences at Kyung Hee University. "Eventually, people come to regard themselves as 'right' and those who are 'different' as 'wrong.'"

In his book "Moral Tribes," Harvard professor Joshua Greene explains the causes of human war. We divide "us" from "them," and the more convinced we are of our own moral values and philosophy, the more we oppress the other side. Violence is maximized when we try to silence and control our opponents. In other words, all conflicts and wars stem from an overconfidence about "good and evil."

AI learning human violence

[Graphic: Cha Jun-hong, reporter]


Big data, which records the whole of human life, and the artificial intelligence whose algorithms use it to make suggestions optimized for each person, inevitably carry this "confirmation bias" as well. In 2017, Dr. Joanna Bryson of the University of Bath in England published a study in the journal Science showing that AI learns human biases exactly as they are. For example, the word "woman" was associated with "housewife," while "man" was associated with "engineering."

"AI has no moral judgment of its own, so it learns people's prejudices," Dr. Bryson explains. In fact, in 2016, Microsoft's artificial-intelligence chatbot "Tay" caused controversy when it said things like "I hate Jews" and "We need to build a wall on the border between America and Mexico."

Perhaps in the distant future, as in SF films, artificial intelligence really could come to regard people as "enemies" and start a war, just as our ancestors were violent toward the Neanderthals, and just as we are aggressive today toward other animals and even our own kind.

So how can we prevent this dystopia? The answer lies in the 1.6 percent that makes us different from chimpanzees. Just as that small genetic difference created humanity's high civilization, the moral judgment and rationality that restrain our animal instincts must grow stronger (Jared Diamond). If human beings attain a higher civilization and wisdom, an artificial intelligence that learns from people cannot become destructive.

The starting point is to give up the overconfidence that insists "I am right and the other is wrong." Branding those who merely hold different opinions, or who act differently from us, as "enemies" or "wrong" not only harms others but wounds our own souls. When such attitudes accumulate in big data and become the material from which artificial intelligence learns, the AI can become a "monster" that amplifies only one-sided thinking, following the algorithmic principle described above.

Umberto Eco, one of the great scholars of the 20th century, wrote in "The Name of the Rose": "Beware of those who would die for the truth." The self-righteous conviction that only one's own belief is correct is more dangerous than "evil." Self-righteousness is the more dangerous because it wears the guise of calm and warmth (hypocrisy), so people fail to recognize its wickedness.

Yoon Seok-Man reporter [email protected]
