We are starting to see what the science fiction movies of our childhood depicted: technology is becoming emotionally intelligent. It seems like only yesterday that we were getting frustrated at our devices because they were not intuitive enough; those days may be coming to an end. Computers are learning more and more about what we are feeling and how we are feeling it, and it is becoming big business.
Growth in Emotional Technologies
Recent reports predict that this global computing market will grow from around $12.2 billion in 2016 to as much as $53.98 billion by 2021. The report, from the consulting firm MarketsandMarkets, observes that these enabling technologies are already being employed in a vast range of industries, and notes a rising demand for facial feature recognition software.
In today’s tech industry, artificial emotional intelligence is referred to as affective computing. Even though many people are still unfamiliar with the technology, academic researchers are already finding many potential uses for it.
A Robust Experiment for Affective Computing
Professor Toshihiko Yamasaki of the University of Tokyo set out to develop a machine learning process that evaluates the quality of a TED Talk video. The interesting thing is that the only criterion that determines the quality of a TED Talk is how well it connects with an actual audience. At first glance, this seems too subjective and too abstract for a computer to analyze. But Yamasaki’s goal was to have his system view these videos and attempt to forecast viewer impressions. Is it really possible for a machine to accurately predict how persuasive a speaker will be?
Yamasaki and his team of researchers created a method that analyzed “multimodal features including linguistic as well as acoustic features” from some 1,646 TED Talk videos. The experiment was highly successful: the method registered a “statistically significant macro-average accuracy of 93.3 percent, outperforming several competitive baseline methods.”
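The paper’s exact pipeline is not described here, but the core idea of multimodal classification—concatenating linguistic and acoustic features and training a classifier on labeled talks—can be sketched roughly. Everything below is a hypothetical illustration with synthetic data, using a simple nearest-centroid classifier rather than the authors’ actual model:

```python
# Hypothetical sketch of multimodal fusion: concatenate linguistic and
# acoustic features, then classify talks as connecting with the audience
# or not. All features, data, and the classifier choice are stand-ins.
import random

random.seed(0)

def make_talk(persuasive):
    # Invented features: 3 linguistic (e.g., vividness, sentiment, story
    # density) and 2 acoustic (e.g., pitch variety, pause rate).
    base = 1.0 if persuasive else -1.0
    linguistic = [base + random.gauss(0, 0.5) for _ in range(3)]
    acoustic = [base + random.gauss(0, 0.5) for _ in range(2)]
    # "Multimodal" fusion here is simply feature concatenation.
    return linguistic + acoustic

train = [(make_talk(label), label) for label in [True] * 50 + [False] * 50]

def centroid(rows):
    # Per-dimension mean of a group of feature vectors.
    return [sum(col) / len(rows) for col in zip(*rows)]

pos = centroid([x for x, y in train if y])
neg = centroid([x for x, y in train if not y])

def predict(features):
    # Assign the label of the nearer class centroid (squared distance).
    d_pos = sum((a - b) ** 2 for a, b in zip(features, pos))
    d_neg = sum((a - b) ** 2 for a, b in zip(features, neg))
    return d_pos < d_neg

test = [(make_talk(label), label) for label in [True] * 25 + [False] * 25]
accuracy = sum(predict(x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the synthetic classes are well separated, even this toy classifier scores highly; the point is only to show how features from different modalities can be fused into one prediction.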
Remarkably, this technology allowed a machine to predict whether a speaker would connect emotionally with a human audience. In their report, the researchers pointed out that these results could be used as a recommendation and feedback tool to help presenters improve the quality of future presentations. And affective computing can go well beyond how people speak to an audience; it could also revolutionize the way people learn.
Scientists at North Carolina State University examined the links between students’ affective states and their ability to learn. Their software accurately predicted how effective online tutoring lessons were simply by evaluating students’ facial expressions. The software tracked fine-grained facial movements such as eyelid tightening, eyebrow raising, and even mouth dimpling to gauge engagement, frustration, and learning. The researchers commented that “analysis of facial expressions has great potential for educational data mining.”
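To make the idea concrete, a system like the one described would turn per-frame facial movement intensities into affective scores. The sketch below is purely illustrative: the action-unit names, the sample readings, and the weights are all invented, not the NC State team’s actual model:

```python
# Hypothetical sketch: map facial action-unit intensities (0.0-1.0) to
# coarse engagement and frustration scores. Weights and data are invented.

# Sample per-frame readings a face tracker might emit.
frames = [
    {"eyebrow_raise": 0.7, "eyelid_tighten": 0.1, "mouth_dimple": 0.4},
    {"eyebrow_raise": 0.2, "eyelid_tighten": 0.8, "mouth_dimple": 0.1},
    {"eyebrow_raise": 0.6, "eyelid_tighten": 0.2, "mouth_dimple": 0.5},
]

def score_frame(frame):
    # Invented mapping: raised brows and mouth dimpling count toward
    # engagement; tightened eyelids count toward frustration.
    engagement = 0.6 * frame["eyebrow_raise"] + 0.4 * frame["mouth_dimple"]
    frustration = frame["eyelid_tighten"]
    return engagement, frustration

# Average over the session to get a coarse affective summary.
scores = [score_frame(f) for f in frames]
avg_engagement = sum(e for e, _ in scores) / len(scores)
avg_frustration = sum(f for _, f in scores) / len(scores)
print(f"engagement={avg_engagement:.2f} frustration={avg_frustration:.2f}")
```

A real system would learn such mappings from labeled data rather than hand-pick weights, but the overall shape—tracker output in, affective estimate out—is the same.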
This kind of technology is becoming more and more useful in the private sector as well. Affectiva, a Boston-based company, creates emotion recognition software. Gabi Zijderveld, the chief marketing officer at Affectiva, recently said, “Our software measures facial expressions of emotion. So basically all you need is our software running and then access to a camera so you can basically record a face and analyze it. We can do that in real time or we can do this by looking at a video and then analyzing data and sending it back to folks.”
It is amazing how emotionally intelligent our technology is becoming.