Advances in Affective Computing Expected to Be Game Changers in Many Verticals!
“Any sufficiently advanced technology is indistinguishable from magic.”
--Arthur C. Clarke
Affective computing is nothing new; it has been around for over a decade. As defined by MIT's Media Lab, affective computing is computing "that relates to, arises from, or deliberately influences emotion or other affective phenomena."
Market research firm MarketsandMarkets predicts the affective computing market will grow from $9.35 billion in 2015 to $42.5 billion worldwide by 2020. The firm also broadly segments the market by technology: on the software side, speech, gesture, and facial-expression recognition, neural analysis, and touch-based sensing; on the hardware side, components comprising sensors, cameras, storage devices and processors.
And we’re now seeing extensive applications in numerous verticals such as academia and research, entertainment, fashion, government and defense, healthcare/life sciences, consumer electronics, security, travel, and more.
“With the advance of affective computing technology, researchers are able to objectively identify and measure a learner’s affective status during the entire learning process in a real-time manner, and then they are able to understand the interrelationship between emotion, motivation and learning performance,” noted authors Chih-Hung Wu, Yueh-Min Huang and Jan-Pan Hwang in an article on affective computing published in the British Journal of Educational Technology.
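As a purely illustrative example of what "measuring a learner's affective status in real time" might look like in code, here is a minimal Python sketch that pairs periodic affect samples with a learning outcome. The sample fields, scores, and scales below are placeholder assumptions; the article does not specify the authors' actual instruments or models.

```python
# A minimal sketch of the idea described above: sampling a learner's
# affective state at intervals and relating it to performance.
# The emotion scores are placeholders; a real system would obtain them
# from a facial-expression or physiological-signal model.
from dataclasses import dataclass
from statistics import mean

@dataclass
class AffectSample:
    timestamp: float    # seconds into the session
    engagement: float   # 0.0 (bored) .. 1.0 (fully engaged)
    frustration: float  # 0.0 .. 1.0

def summarize_session(samples, quiz_score):
    """Pair aggregate affect measures with a learning outcome."""
    return {
        "mean_engagement": mean(s.engagement for s in samples),
        "peak_frustration": max(s.frustration for s in samples),
        "quiz_score": quiz_score,
    }

# Example: three samples taken during a ten-minute lesson.
session = [
    AffectSample(0, 0.9, 0.1),
    AffectSample(300, 0.6, 0.4),
    AffectSample(600, 0.4, 0.7),
]
print(summarize_session(session, quiz_score=72))
```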
Ahmedabad, India-based Fibre2fashion builds B2B platforms for the global textile-apparel and fashion industry. The company says that with affective computing, apparel e-tailers can track human emotions and use the data to inform garment development, sales, marketing and service.
“By reading emotions of consumers, they can automatically adapt merchandise in real time depending on the mood of the shopper. By measuring emotion data gathered by audiences watching the streaming of a live fashion show, apparel e-tailers can send such responses to the buying and collection departments,” said Fibre2fashion.
Fibre2fashion added that by identifying the right emotions, e-tailers can understand consumer perceptions: what people like to wear and at what time of year. The potential is enormous; designers can gauge audience responses to see which products are generating interest.
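To make the live-show scenario concrete, here is a hypothetical Python sketch of how viewers' momentary emotion scores might be aggregated per look and ranked for the buying team. The event format and valence scale are assumptions for illustration, not Fibre2fashion's actual system.

```python
# Hypothetical workflow: aggregate viewers' emotion scores per look
# during a live-streamed show, then rank looks by positive response.
from collections import defaultdict

def rank_looks(events):
    """events: iterable of (look_id, valence) pairs, where valence is a
    viewer's momentary positive/negative reaction in [-1.0, 1.0]."""
    totals, counts = defaultdict(float), defaultdict(int)
    for look_id, valence in events:
        totals[look_id] += valence
        counts[look_id] += 1
    averages = {look: totals[look] / counts[look] for look in totals}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

# Example: four reactions streamed in during the show.
stream = [("look-03", 0.8), ("look-01", 0.2), ("look-03", 0.6), ("look-01", -0.1)]
for look, score in rank_looks(stream):
    print(f"{look}: mean valence {score:+.2f}")
```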
And in Switzerland, researchers at the Ecole Polytechnique Federale de Lausanne (EPFL) teamed up with PSA Peugeot Citroen to develop an on-board emotion detector that reads various facial expressions: fear, anger, joy, sadness, disgust, surprise or suspicion. Embedded cameras film drivers' faces; the data may prove useful in determining the emotional state of the driver and the risk factors associated with a wide array of emotions.
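For a rough idea of what the front end of such a detector might look like, here is a short Python sketch using the open-source OpenCV library for face detection. The expression-classification step is a stub, since EPFL's actual model is not public; everything past face detection is an assumption.

```python
# Sketch of the pipeline's front end: find the driver's face in each
# camera frame, then hand the crop to an expression classifier.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_pixels):
    """Placeholder stub: a real classifier would map the face crop to
    fear, anger, joy, sadness, disgust, surprise or suspicion."""
    return "neutral"

capture = cv2.VideoCapture(0)   # stand-in for an embedded dashboard camera
for _ in range(300):            # roughly ten seconds at 30 fps
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        emotion = classify_expression(gray[y:y + h, x:x + w])
        print("driver appears:", emotion)
capture.release()
```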
One UK company, Emoshape Ltd., raised funds on Indiegogo to launch EmoSPARK, an artificial intelligence console built around an EPU (Emotional Processing Unit) microchip, a patent-pending technology that creates a synthesized emotional response in machines. The technology, according to Emoshape, allows a robotic toy or an IoT device to create a completely unique personality.
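Emoshape's EPU design is proprietary, but the general idea of a persistent, device-level emotional state can be sketched in a few lines of Python. The valence/arousal model and decay rule below are illustrative assumptions, not the patented technology.

```python
# Toy illustration: a device-level mood that drifts with stimuli
# and relaxes back toward neutral over time.
class EmotionState:
    def __init__(self, decay=0.9):
        self.valence = 0.0   # negative .. positive feeling, in [-1, 1]
        self.arousal = 0.0   # calm .. excited, in [0, 1]
        self.decay = decay

    def stimulate(self, valence_delta, arousal_delta):
        """Nudge the mood in response to an event, clamped to range."""
        self.valence = max(-1.0, min(1.0, self.valence + valence_delta))
        self.arousal = max(0.0, min(1.0, self.arousal + arousal_delta))

    def tick(self):
        """Called periodically; mood decays toward neutral."""
        self.valence *= self.decay
        self.arousal *= self.decay

    def mood(self):
        if self.arousal < 0.2:
            return "calm"
        return "happy" if self.valence >= 0 else "upset"

toy = EmotionState()
toy.stimulate(+0.8, +0.6)   # e.g., the owner praises the toy
print(toy.mood())           # "happy"
```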
Computers are increasingly being used to deepen our understanding of human emotions. PSFK Labs, a trends research firm, cogently summed up how affective computing will play a larger role in our lives:
“The beauty of human interactions (and the bane of automated ones) is that, being human, we’re able to empathize with one another. We can draw on our own experiences and shape our interactions accordingly. It’s a special gift that makes living together in large societies possible, and often enjoyable. As technology grows to take on an ever-increasing role in our lives, maybe it’s time that we start building systems which learn to empathize with us as well.”
Written by Neal Leavitt