Just in time for a new "Blade Runner", our AI is getting emotions

The idea that our AI is getting emotions feels more poignant than ever these days, and not just because the new trailer for Blade Runner 2049 was recently released. As self-driving cars take to our roads and personal assistants like Amazon Echo and Google Home aid us in our daily lives, manufacturers are also making our technology more human -- more emotional.

SoftBank Pepper

The 4-foot-tall Japanese robot is being tested in roles such as greeting shoppers in retail stores, working in banks, and helping customers select wine. It can recognize human speech and body language while making gestures of its own as it tracks your movement. Pepper can detect whether you are smiling and whether you are happy or sad, and can even play a mean “air guitar”. The emotional machine was introduced in Japan in 2014 and recently appeared at two California malls during Black Friday.

Asus Zenbo

From Fox News: “It’s meant as an assistant that, Asus claims, can hear and respond to naturally spoken commands, adapt to personal preferences using artificial intelligence, ‘express’ emotions with different facial expressions, recognize faces, play music, control smart home devices, and do remote monitoring, among other things.”

Honda NeuV

Honda’s new concept car, the automated electric NeuV, can understand the driver’s emotions and develop emotions of its own. Honda announced that it has an “emotion engine” that will “enable machines to artificially generate their own emotions.” The AI can hold conversations with the driver, gauge the driver’s emotions, and is expected to “grow up” with the driver over time.

Blade Runner was set in 2019, and we’re still a long way off from Replicants and Voight-Kampff tests. However, as our technology advances from utilitarian designs into more human dimensions, it feels as though we are on the cusp of an AI metamorphosis. Watch the video to see more on the Honda NeuV and its “emotion engine”.