"Nature" magazine: Association of Artificial Intelligence touch, caress intimate just around the corner

  Would you want a robot like that?

  By walking upright, humans freed their hands, which are used not only for labor and making tools, but also to express emotion and create art. There are even song lyrics to prove it:

  Who has never seen the tears in someone's eyes... even Yuning choked up singing it...

  So will artificial intelligence, which is advancing by a thousand miles a day, ever learn the intimate, romantic touch of a caress? A Nature paper published today shows that MIT researchers have taken a big step in this direction.

  The secret lies in the glove shown above. It looks ordinary, but an array of 548 flexible piezoresistive sensors is mounted on it, covering the whole palm. It senses touch pressure over a range of roughly 30 to 500 mN, resolved into 150 discrete levels, samples about 7.3 times per second, and outputs corresponding electrical signals. In other words, the glove has fairly good spatial and temporal resolution as well as pressure accuracy; wear it while grasping and stroking objects and you can collect a large amount of tactile data, and it costs just about $10.
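  To make these numbers concrete, here is a minimal sketch (not the authors' code) of how one read-out of such a glove might be turned into force values. The sensor count, force range, level count, and sampling rate come from the paragraph above; the linear mapping and all names are assumptions made for illustration.

```python
import numpy as np

# Hypothetical parameters, taken from the numbers quoted above.
NUM_SENSORS = 548          # piezoresistive sensing points on the glove
FORCE_MIN_MN = 30.0        # lower end of the quoted sensing range (mN)
FORCE_MAX_MN = 500.0       # upper end of the quoted sensing range (mN)
PRESSURE_LEVELS = 150      # discrete pressure levels per sensor

def raw_to_pressure_frame(raw_levels: np.ndarray) -> np.ndarray:
    """Map one read-out of the 548 sensors (integer levels 0..149)
    to approximate forces in millinewtons (assumed linear mapping)."""
    assert raw_levels.shape == (NUM_SENSORS,)
    scale = (FORCE_MAX_MN - FORCE_MIN_MN) / (PRESSURE_LEVELS - 1)
    return FORCE_MIN_MN + raw_levels.astype(np.float32) * scale

# Example: one simulated frame, as if sampled at roughly 7.3 Hz.
frame = raw_to_pressure_frame(np.random.randint(0, PRESSURE_LEVELS, NUM_SENSORS))
print(frame.shape, frame.min(), frame.max())
```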

  Machine vision has developed by leaps and bounds and is already quite mature, but machine touch is only getting started. Compared with cameras, there has never been much demand for collecting touch data, and few people would think of recording and storing the sense of touch. But as robots develop, and especially as demand for service robots grows, vision alone is clearly not enough; touch is becoming increasingly important. This glove, much like a camera, can capture and store dynamic tactile data.

  Still, a hand without a brain feels nothing. So the authors wore the glove and repeatedly grasped 26 different objects, collecting 135,000 frames of spatio-temporal tactile pressure data, and used them to train a deep convolutional neural network, teaching the machine the sense of touch. The convolutional network architecture, based on ResNet-18, is shown below:
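  As a rough illustration of what such a network could look like, here is a minimal PyTorch sketch of a ResNet-18-based classifier for tactile frames. The 26-class output and the ResNet-18 backbone come from the text above; treating each frame as a single-channel 32x32 "pressure image" is an assumption made for this example, not necessarily the paper's exact setup.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

NUM_CLASSES = 26  # number of grasped objects mentioned above

class TactileClassifier(nn.Module):
    """Sketch only: assumes each tactile frame is laid out as a 1x32x32 pressure map."""
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        backbone = resnet18(weights=None)
        # Adapt the first conv layer to a single-channel tactile map.
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        # Replace the final layer with a 26-way classification head.
        backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) normalized pressure frames
        return self.backbone(x)

model = TactileClassifier()
logits = model(torch.randn(8, 1, 32, 32))
print(logits.shape)  # torch.Size([8, 26])
```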

  The results show that with as few as 7 frames of spatio-temporal tactile input, the network can accurately identify different objects:
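  One simple way to use several frames at once is to fuse per-frame predictions, for example by averaging class probabilities. The sketch below reuses the hypothetical TactileClassifier from the previous block; the paper's actual multi-frame fusion scheme may differ.

```python
import torch

def classify_from_frames(model, frames: torch.Tensor) -> int:
    """frames: (N, 1, 32, 32) tactile frames from one grasp.
    Averages per-frame class probabilities over the N frames
    (a simple late-fusion scheme, assumed here for illustration)."""
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(frames), dim=1)    # (N, 26)
        return int(probs.mean(dim=0).argmax().item())  # fused prediction

# e.g. 7 frames, matching the claim above
model = TactileClassifier()
pred = classify_from_frames(model, torch.randn(7, 1, 32, 32))
print(pred)
```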

  It can also roughly estimate the weight of an object:
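  Weight estimation can be framed as regression rather than classification. Below is a hedged sketch of how the same kind of backbone could be given a single scalar output for weight; this is an illustrative variant, not the paper's actual weight-estimation method.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class TactileWeightRegressor(nn.Module):
    """Same assumed 1x32x32 input as the classifier sketch, but with one
    regression output predicting the weight of the grasped object (e.g. grams)."""
    def __init__(self):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # scalar weight estimate
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x).squeeze(-1)

weights = TactileWeightRegressor()(torch.randn(4, 1, 32, 32))
print(weights.shape)  # torch.Size([4])
```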

  In other words, the machine already has a rudimentary sense of touch. The next step, I think, is to let the machine go further and learn emotions through touch, such as the joy of an intimate caress, the anger of gritted teeth, or helplessness and anxiety, by sampling tactile sensors and brain electrodes at the same time and applying deep learning. A robot that understands emotion and loneliness would then be born.

  So, would you want a robot like that?

