Thursday 3 December 2015

Microsoft Facial Recognition Project Allows Computers To 'See Your Mood'

Facial recognition makes a 'smarter app', i.e. one that can serve you better depending on your mood - image credit: Microsoft 

   Facial recognition technologies have had their ups and downs. Big-name vendors have toyed with trials and even progressed facial recognition to the point where it could be used to replace passwords. But it turns out that computers can't always tell the difference between a photograph of someone and the real person in front of the camera.
Other failures in this space have seen body parts as far removed from physiognomy as a foot mistaken for a face.
Physiognomy with emotion
   But this is 2015… and this year has seen a glut of innovations in machine learning, deep neural networks, artificial intelligence and quantum computing, all of them advances designed to massively increase the processing power, speed and (quite crucially) the complexity of the calculations being made.
   We now call upon artificial intelligence in our software applications to identify things like sounds, words, images and, ultimately, facial expressions.
   Microsoft Project Oxford is a set of software development tools currently approaching public beta stage. These tools are intended to let programmers build applications 'smart' enough to detect the emotion on your face and tell whether you are happy, sad, angry, frustrated and so on.
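   A quick sketch helps here. The Python below shows roughly what calling the beta emotion service over REST might look like; the endpoint URL, header name and response fields reflect the Project Oxford beta as far as I can tell, so treat them as assumptions, and note that the subscription key and image file name are placeholders.

import requests

# Endpoint and header name as per the Project Oxford beta; these are
# assumptions and may change, so check the current documentation.
API_URL = "https://api.projectoxford.ai/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "your-subscription-key"  # placeholder

def recognize_emotion(image_path):
    """Send a local image and return per-face emotion scores."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        API_URL,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    # Each detected face comes back with a bounding rectangle and a
    # dictionary of scores (happiness, sadness, anger and so on).
    return response.json()

for face in recognize_emotion("selfie.jpg"):  # placeholder file name
    scores = face["scores"]
    print(face["faceRectangle"], "->", max(scores, key=scores.get))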
According to Microsoft, “In the case of something like facial recognition, the system can learn to recognize certain traits from a training set of pictures it receives, and then it can apply that information to identify facial features in new pictures it sees.”
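   Microsoft has not published the model behind the tool, so the snippet below is only a toy illustration of that train-then-apply idea, using scikit-learn and made-up two-number 'features' in place of real pictures.

from sklearn.linear_model import LogisticRegression

# Pretend each training picture has been boiled down to two numbers
# (say, mouth curvature and eyebrow angle) plus a mood label.
training_features = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
training_labels = ["happy", "happy", "sad", "sad"]

# Learn the traits from the training set...
model = LogisticRegression()
model.fit(training_features, training_labels)

# ...then apply that information to a "new picture" it has never seen.
print(model.predict([[0.7, 0.3]]))  # -> ['happy']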
We want our apps smarter, apparently
    But why bother? The answer is that word smart again. We now demand 'smart apps' on our devices. We want apps that are location-aware, temporally aware (attuned to what our schedule is supposed to look like) and perhaps even motion-aware (so they know if we are on the move). So why not emotion-aware?
    If your smartphone knows you are unhappy, wouldn't you like it to play your favorite songs, or at least offer you the option? If your smartphone knows you are tired (based on facial recognition and movement tracked by an accelerometer, perhaps), then wouldn't you like taxi suggestions or other options for getting home?
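    Plumbing-wise, the sketch below imagines how an app might route a detected mood to one of those suggestions; the mood labels and the actions are hypothetical placeholders, not anything Microsoft has announced.

def react_to_mood(mood, on_the_move=False):
    """Map a detected mood (plus motion state) to an app suggestion."""
    if mood == "sadness":
        return "Offer to play your favorite songs"
    if mood == "tiredness" and on_the_move:
        return "Suggest a taxi or other options for getting home"
    return "Carry on as normal"

print(react_to_mood("sadness"))
print(react_to_mood("tiredness", on_the_move=True))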
   Chris Bishop, head of Microsoft Research Cambridge in the United Kingdom, showed off the emotion tool earlier today in a keynote talk at Future Decoded, a Microsoft conference on the future of business and technology.
