To feel human emotions, robots need to "take heart"

Xin Zhi Zao: This article was written by Pascale Fung and originally appeared on weforum; it was compiled by Xin Zhi Zao and may not be reprinted without permission. Fung recently wrote an article for the popular-science magazine Scientific American titled "Robots with Heart." In it, she describes adding "empathy modules" to robot software so that robots can better satisfy human emotional and physiological needs. After the article was published, some readers were curious about how such emotion-sensing robots could be applied in medicine and other areas of life, while others expressed strong opposition to the idea of robots understanding human emotions. One reader's view was very pointed: "Humans are human because they have emotions. Robots can never truly have human characteristics, so they can never replace humans and do what humans do." Of course, opinion was not one-sided; some readers were very interested in the idea: "If robots have intelligence and can feel human emotions, then at some point in the future, could they have self-consciousness?"

For us, understanding robot intelligence is very important. But first, we need to understand why humans have self-awareness and how we perceive ourselves.

Human emotions

What role do emotions play in the evolution of a species? Research shows that bonds between people are built on friendly relationships. The survival of a species depends on these close bonds, and most of the time, these bonds are sustained by emotions. We also use emotions as signals to convey our intentions.

Our feelings and emotions are triggered by stimuli, whether external or internal (such as memories). In response, our bodies react physiologically: the pulse quickens, we sweat, and our facial expressions, gestures, and tone of voice change. We may cry, we may laugh, we may feel sick, or we may go weak in the knees with fear. Unlike language, these emotions are spontaneous and expressed automatically, without conscious control. From the moment we are born, we begin learning to understand other people's emotions. Even before birth, a softly sung lullaby can completely relax a baby. From birth, babies can respond to their parents' smiling faces, and from the very beginning they can express their own emotions.

Machine emotion

Industrial robots build cars and smartphones; rehabilitation robots help patients walk again; teaching-assistant robots answer students' questions; software programs draft legal documents and even grade your essays; software systems write news stories for newspapers; the artificial-intelligence program AlphaGo defeated Go champion Lee Sedol; IBM's Watson beat humans on the quiz show Jeopardy!; robots can even create paintings beautiful enough that people mistake them for the work of professional artists; machines can also compose music... Clearly, in some areas robots can be built to be stronger, faster, and smarter than humans. But do they really need the ability to feel emotions as we humans do?

In early 2016, our team announced the first system that can recognize twelve human emotions from intonation in real time. Before this work, a step known as "feature engineering" introduced processing delays into emotion recognition from intonation, and such delays make human-computer interaction feel unnatural. So how did our team break through this barrier? By fully understanding how machine learning works.

Every robot runs on a hardware platform driven by software algorithms. Algorithms designed by humans tell the robot how to respond to stimuli: for example, how to answer a question or how to navigate a room. Much as an architect designs a house, an AI engineer draws up a blueprint of the robot's overall task and uses code to enable the robot to identify possible obstacles and complete the task. One of the biggest challenges is machine learning, that is, algorithms that allow machines to learn and imitate human-like responses, such as playing chess or answering questions.

The biggest breakthrough in artificial intelligence is machine learning. Instead of every response to a stimulus being designed into the program in advance, a machine can learn from many examples of stimuli in the real world. If it sees many images labeled "cat," a machine-learning algorithm can learn to recognize cats in images it has never seen. In the same way, if a machine can process trillions of pages of translation examples, it can learn to translate, as Google Translate does.
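To make this concrete, here is a minimal sketch of supervised learning in Python, using scikit-learn. The data is synthetic and the labeling rule is invented purely for illustration; the point is only that a model fit on labeled examples can generalize to examples it has never seen.

```python
# Minimal sketch of supervised learning: a classifier generalizes from
# labeled examples to unseen ones. The data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in feature vectors for items labeled "cat" (1) or "not cat" (0).
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # hypothetical labeling rule

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)  # learn from examples
print("accuracy on unseen examples:", clf.score(X_test, y_test))
```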

The most important part of machine learning is learning representations, known as features. A cat can be represented by its outline, claws, face, and body posture. Speech can be distinguished by the frequency components of sound. Emotion in speech can be recognized not only from pitch but also from timbre, rhythm, and speaking rate. Traditional machine learning first requires feature engineering to extract these features. For speech, feature engineering extracts between 1,000 and 2,500 features from a sound signal, a step that slows down the entire emotion-recognition process. Moreover, each of those thousands of features has to be designed by humans, which takes time and effort.
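The sketch below illustrates what this hand-crafted feature-engineering step looks like in practice. The article names no toolkit or feature set, so librosa, the file name, and the particular descriptors are assumptions; a real system would aggregate thousands of such statistics rather than the handful shown here.

```python
# Sketch of the hand-crafted "feature engineering" step described above:
# extracting acoustic descriptors from a sound clip. librosa and the
# feature set are assumed; the article does not specify either.
import numpy as np
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)  # hypothetical audio file

# Frame-level descriptors of the kind a classical emotion recognizer uses.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # timbre
rms = librosa.feature.rms(y=y)                      # loudness / energy
zcr = librosa.feature.zero_crossing_rate(y)         # noisiness

# Collapse frame-level features into clip-level statistics (mean, std),
# mimicking how thousands of such statistics were designed by hand.
features = np.concatenate([
    mfcc.mean(axis=1), mfcc.std(axis=1),
    rms.mean(axis=1), rms.std(axis=1),
    zcr.mean(axis=1), zcr.std(axis=1),
])
print("feature vector length:", features.shape[0])
```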

Recently, scientists have used hardware acceleration and large amounts of training data to make several breakthroughs in neural networks (also known as deep learning), which has driven the rapid development of machine learning. Notably, some deep-learning methods, such as convolutional neural networks (CNNs), can learn these features automatically during training, without explicit, slow feature engineering or human design. This is probably deep learning's greatest contribution to the AI field.

Returning to our system for recognizing emotion in tone of voice: what we did was replace feature engineering and classifier training with a simple convolutional neural network, which learns very well (even if not better than the classifier-based system) and is much faster, because it does not require an explicit, slow feature-engineering step. Facial-expression recognition can likewise be achieved with convolutional neural networks.
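As a rough illustration of this end-to-end approach, here is a minimal CNN sketch in PyTorch that maps a spectrogram directly to one of twelve emotion classes, with no hand-crafted feature stage in between. The architecture, input shape, and class count are illustrative assumptions, not the team's actual model.

```python
# Minimal sketch (PyTorch) of the idea in the text: a small convolutional
# network maps a raw spectrogram straight to twelve emotion classes,
# learning its own features instead of relying on feature engineering.
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, n_classes: int = 12):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to a fixed-size summary
        )
        self.fc = nn.Linear(32, n_classes)

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        # spectrogram: (batch, 1, freq_bins, time_frames)
        h = self.conv(spectrogram).flatten(1)
        return self.fc(h)  # logits over the twelve emotions

model = EmotionCNN()
dummy = torch.randn(4, 1, 64, 100)  # fake batch of spectrograms
print(model(dummy).shape)           # torch.Size([4, 12])
```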

In addition, scientists are developing ways for robots to express emotion, by varying the pitch of synthesized speech and by driving synthetic facial muscles with small motors. Sophia and Erica are two humanoid robots capable of facial expressions.

The human-robot connection and the fourth industrial revolution

The fourth industrial revolution is on the horizon. Technology has already replaced people in many fields, and skills that once took years or even decades to learn can become obsolete overnight. Many people are unaware of how fast the AI and robotics fields are advancing, and based on what they see today, some assume that robots will rule the world within 30 to 50 years. This "robot threat" theory has recently become a heated topic, and many people genuinely fear robots.

In fact, the same kind of prediction arose during previous industrial revolutions, when people feared that steam engines and computers would make human labor redundant. But as we know, humans instead learned a variety of new skills to operate these machines.

However, as more AI systems and robots come into use, new relationships between humans and robots will emerge. For humans to overcome fear and trust a robot that can walk, speak, gesture, and carry heavy loads, we need to establish emotional communication with robots. What differentiates a robot from other electronic devices is its advanced machine intelligence and its emotions. For a home-care robot, understanding a baby's cry or a patient's pain is essential. Such intelligent robots need to "take heart."

So, will robots eventually have self-consciousness?

If robots possess analytical techniques, learning abilities, communication skills, and even emotions, will they have self-awareness? Will they be capable of perception? Will they dream?

Unlike other machine-learning algorithms, the neural networks described above make it harder for humans to feel certain that they remain the masters. Neural networks can even generate random, dreamlike images, which leads some people to believe that robots can dream.

The question is: do we humans understand what gives us the capacity for perception? Is it merely a combination of sensory understanding and thought processes? AI scientists cannot yet answer this question, but we believe that to build good robots, we need to pass on ethical and moral values to them. These values can help them make decisions. As machine intelligence expands, teaching robots these values is as important as educating children. Our next challenge is to enable machines to learn such values automatically, once robots have acquired emotional understanding and communication skills.

Via weforum
