Remember “2001: A Space Odyssey” and “The Terminator”? Back when computer scientists and the general public were first grasping the potential of artificial intelligence, futuristic movies stirred a primal fear that computers would become so smart they could take over the world. Since then, advances in AI have been astounding. Machines now analyze visual data like scenes and faces, understand natural language, and even learn. But don’t worry about global domination, we’re told: computers are a long way from being self-aware or able to think like humans. Most importantly, they will never have the ability to feel emotions.
Maybe not, but they can read them.
Emotion AI, a.k.a. affective computing or artificial emotional intelligence, originated in the 1990s to bolster earlier attempts to identify universal emotions communicated by facial expressions. This research needed a database of standardized photographic images, and the burgeoning data-mining field was able to supply it. Then machine learning techniques were applied to big data, and emotion AI took off. It wasn’t long before other visual non-verbal communication modes were studied, such as gesture, posture, gait, unconscious body movements and biophysical signals like flushing and sweating. Auditory cues in speech and voice were also analyzed, such as intonation, emphasis, rhythm and pauses. This research grows more sophisticated each year, and the current wealth of social media offers innumerable sources of images and vocalizations.
The potential of such technology is tremendous. In addition to the obvious marketing, advertising and customer service uses, there are countless non-commercial applications. It can improve human-to-human interaction in public safety and mental health call centers by identifying the emotional state of callers. It’s instrumental in national security screening at airports, hiring and human resource management, and policing programs that predict violence. Future developments include in-car monitoring of drivers’ attention and assistive services for people with autism. Not least are improvements in machine-human interaction that make computerized responses sound more human, because no one likes talking to a computer.
If this sounds utopian, it probably is. The current state of non-verbal emotion AI has been criticized by academic scholars who argue that the scientific literature supports neither the existence of universal emotions nor the reliability of inferring emotions from expressions, whether you’re a machine or a human. We can misinterpret our spouse’s forced smile, feel several emotions at once or in sequence, and frown when we are actually laughing inside. Adding to a computer’s difficulties are real-life conditions such as ambient noise and poor lighting, and cultural and racial differences in expression. Civil rights advocates warn of privacy issues and programming biases that favor Caucasian faces and speech. The legality of hiring practices based on non-verbal assessments and of screening air travelers’ facial expressions has been contested in court and by the government.
Computer technologies may still have problems navigating the complexities of high-level human perceptual recognition, but there is another mode for conveying emotions: verbal language in written or spoken text.
The other promising branch of emotion AI applies natural language processing (NLP) to analyze word choice and usage. Granted, there are challenges for NLP too, but they center on semantic issues: ambiguity, irony and sarcasm, synonyms and homonyms, style and intention, situational context and the inference of world knowledge, to name a few. However, recent advances in deep learning, machine learning built on layered neural networks that loosely mimic how humans focus attention and accumulate knowledge, promise to overcome them.
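To make the idea concrete, here is a minimal sketch of text-based emotion detection, assuming the open-source Hugging Face transformers library and one publicly shared emotion-classification model. The model name below is just an illustrative choice, not the system used by any company mentioned in this article.

```python
# A rough sketch of emotion detection from text with an off-the-shelf model.
# Requires: pip install transformers torch
from transformers import pipeline

# Assumed example model; any emotion-labeled text classifier would do.
classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

samples = [
    "I can't believe they cancelled the flight again.",
    "Best news I've heard all week!",
    "Oh great, another Monday. Just what I needed.",  # sarcasm is still hard
]

for text in samples:
    top = classifier(text)[0]  # highest-scoring emotion label and its confidence
    print(f"{top['label']:>9}  {top['score']:.2f}  {text}")
```

Even a small off-the-shelf model like this assigns plausible labels to straightforward sentences, though sarcasm, as in the third example, can still trip it up.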
We witness advances in machine language understanding and production in virtual assistants like Siri, Alexa and their many kin. One of the most successful enterprises applying NLP and machine learning to detect human emotions is Cognovi Labs.
Cognovi’s innovative approach begins by data-mining various forms of text, such as social media, transcribed conversations, industry correspondence, or custom data from their Dynamic Diagnostic Interview. Their proprietary machine-learning technology, developed over eight years of study at an academic research center, searches for specific emotional expressions spanning the range from happiness, joy and amusement to sadness, fear and disgust.
Then humans take over. Cognovi’s team of behavioral and cognitive scientists and clinical psychologists applies years of expertise to interpret the most prominent emotions of specific groups or populations. With their knowledge of human decision-making, they don’t just describe emotions, they predict the motivations and behavioral intentions driven by them. If that’s not impressive enough, their Emotion Trigger Marketing (ETM) platform identifies words and longer narratives that influence the specific emotions leading to positive behavioral outcomes.
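As a purely hypothetical illustration of that aggregation step (the groups, labels and counts below are invented and say nothing about Cognovi’s proprietary methods), tallying per-post emotion labels is enough to surface the most prominent emotion in each group:

```python
# Toy aggregation: given emotion labels already assigned to individual posts
# (by any classifier), tally which emotion dominates each group of authors.
# All data here is invented for illustration.
from collections import Counter, defaultdict

labeled_posts = [
    ("commuters", "anger"),
    ("commuters", "anger"),
    ("commuters", "sadness"),
    ("new_parents", "joy"),
    ("new_parents", "fear"),
    ("new_parents", "joy"),
]

by_group = defaultdict(Counter)
for group, emotion in labeled_posts:
    by_group[group][emotion] += 1

for group, counts in by_group.items():
    emotion, n = counts.most_common(1)[0]
    share = n / sum(counts.values())
    print(f"{group}: most prominent emotion is {emotion} ({share:.0%} of posts)")
```

The hard part, of course, is everything this sketch leaves out: labeling the posts reliably in the first place and interpreting what the resulting distribution actually means for behavior.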
As with non-verbal emotion AI, there are numerous applications for this technology, many with positive social and political ramifications. Cognovi’s work does serve commercial interests like medical marketing, financial investing and retail advertising, but it’s also invaluable for public health campaigns, social media disinformation mitigation, and national security threat detection. Their commitment to social responsibility is stated right on their homepage: “With a powerful technology comes significant responsibility.”
And they follow through. Their Covid Panic Index signaled the rise of fear and anxiety at the beginning of the pandemic, anticipating its economic effects. Later, their Vaccines Attitudes Dashboard showed changes in awareness and acceptance, with one study revealing that BMI eligibility guidelines actually had a negative impact on overweight people’s mental health. And if predicting votes is a public service, they nailed Brexit and the 2016 presidential election.
With artificial intelligence applications like Cognovi’s predicting and influencing the future, that future doesn’t seem dystopian at all. In fact, we can all look forward to understanding more about ourselves, each other and what motivates us. Maybe then Hollywood will start producing optimistic sci-fi.