Scientists from the University of Maryland have created an emotion prediction algorithm called ProxEmo that allows a robot to “predict the perceived emotions of a pedestrian from walking gaits.”

The analysis proceeds in six steps.

A moving robot equipped with a commodity camera first captures a person’s movement through a video stream. The footage then goes through a “pose extraction and tracking” phase. The tracked movement is embedded into an image and fed into an emotion classification model. Currently, the algorithm can distinguish between four emotions: “happy, sad, angry and neutral.” The result then undergoes a “proxemic fusion” step and is finally passed to the robot’s navigation system.
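To make that flow concrete, here is a minimal Python sketch of the six steps. Every helper name, the joint dimensions, and the comfort distances below are illustrative assumptions made for this article, not the researchers’ actual code or parameters.

```python
"""A hypothetical sketch of the six-step pipeline described above.
Helper names, shapes, and distances are invented for illustration."""

import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def extract_and_track_poses(frame: np.ndarray) -> np.ndarray:
    """Step 2 stub: a real system would run a pose estimator and tracker
    on the video stream; here we return a dummy (timesteps, joints, xyz)
    gait sequence."""
    return np.zeros((75, 16, 3))

def embed_gait_as_image(gait: np.ndarray) -> np.ndarray:
    """Step 3 stub: flatten the joint trajectories into a 2D 'image' that
    a convolutional classifier can consume."""
    t, j, c = gait.shape
    return gait.reshape(t, j * c)

def classify_emotion(gait_image: np.ndarray) -> str:
    """Step 4 stub: a real system would run a trained network over the
    gait image; we fake the class scores."""
    scores = np.random.rand(len(EMOTIONS))
    return EMOTIONS[int(scores.argmax())]

def proxemic_fusion(emotion: str) -> float:
    """Step 5: map the perceived emotion to a comfort radius in metres,
    e.g. give an angry pedestrian a wider berth. Distances are made up."""
    return {"happy": 0.5, "neutral": 0.7, "sad": 1.0, "angry": 1.5}[emotion]

def plan_around(frame: np.ndarray) -> tuple[str, float]:
    """Steps 1-6 end to end: camera frame in, navigation constraint out."""
    gait = extract_and_track_poses(frame)    # step 2
    gait_image = embed_gait_as_image(gait)   # step 3
    emotion = classify_emotion(gait_image)   # step 4
    radius = proxemic_fusion(emotion)        # step 5
    return emotion, radius                   # step 6: feed the planner

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # step 1: camera frame
    print(plan_around(frame))
```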

Rather than a technology for today, ProxEmo is envisioned by its creators as a first step toward a world where robots can assist people, and can assist them better by knowing a person’s emotional state.

If facial recognition wasn’t scary enough, this new motion analysis technology has potential privacy implications, adding yet another layer of data that can be collected about us. Let’s hope it doesn’t end up in the wrong hands with dystopian ambitions.
