20+ Emotion Recognition APIs That Will Leave You Impressed, and Concerned

If businesses could sense emotion using tech at all times, they could capitalize on it to sell to the consumer at the opportune moment. Sound like 1984? The truth is that it's not that far from reality. Machine emotional intelligence is a burgeoning frontier that could have huge consequences not only in advertising, but also in new startups, healthcare, wearables, education, and more.

There's a lot of API-accessible software online that parallels the human ability to discern emotive gestures. These algorithm-driven APIs use facial detection and semantic analysis to interpret mood from photos, videos, text, and speech. Today we explore over 20 emotion recognition APIs and SDKs that can be used in projects to interpret a user's mood.

How Do Emotion Recognition APIs Work?

Emotive analytics is an interesting blend of psychology and technology. Though arguably reductive, many facial expression detection tools lump human emotion into 7 main categories: Joy, Sadness, Anger, Fear, Surprise, Contempt, and Disgust. With facial emotion detection, algorithms detect faces within a photo or video, and sense micro expressions by analyzing the relationship between points on the face, based on curated databases compiled in academic environments.

To detect emotion in the written word, sentiment analysis processing software can analyze text to conclude if a statement is generally positive or negative based on keywords and their valence index. Lastly, sonic algorithms have been produced that analyze recorded speech for both tone and word content.
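To make the text side of this concrete, here is a deliberately tiny, rule-based sketch of keyword valence scoring in Python. The word list and thresholds are invented for illustration; production sentiment APIs rely on large curated lexicons plus deep linguistic analysis rather than simple word counting.

```python
# A toy, rule-based sentiment scorer. The valence lexicon below is invented
# for illustration; real services use large curated lexicons and deeper
# linguistic analysis (negation handling, dependency parsing, and so on).
VALENCE = {
    "love": 3.0, "great": 2.5, "happy": 2.0, "good": 1.5,
    "bad": -1.5, "sad": -2.0, "awful": -2.5, "hate": -3.0,
}

def score_sentiment(text: str) -> str:
    """Sum the valence of known keywords and bucket the result."""
    total = sum(VALENCE.get(word.strip(".,!?"), 0.0)
                for word in text.lower().split())
    if total > 0.5:
        return "positive"
    if total < -0.5:
        return "negative"
    return "neutral"

print(score_sentiment("I love this product, it is great!"))   # positive
print(score_sentiment("The support experience was awful."))   # negative
```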

Use Cases For Emotion Recognition

Smile — you’re being watched. The visual detection market is expanding tremendously. It was recently estimated that the global advanced facial recognition market will grow from $2.77 Billion in 2015 to $6.19 Billion in 2020. Emotion recognition takes mere facial detection/recognition a step further, and its use cases are nearly endless.

An obvious use case is within group testing. User response to video games, commercials, or products can all be tested at a larger scale, with large data accumulated automatically, and thus more efficiently. Bentley used facial expression recognition in a marketing campaign to suggest car model types based on emotive responses to certain stimuli. Technology that reveals your feelings has also been suggested to spot struggling students in a classroom environment, or help autistics better interact with others. Some use cases include:

  • Helping to better measure TV ratings.
  • Adding another layer of security at malls, airports, sports arenas, and other public venues to detect malicious intent.
  • Wearables that help autistics discern emotion.
  • Checkout counters and virtual shopping.
  • Creating new virtual reality experiences.

Facial Detection APIs that Recognize Mood

These computer vision APIs use facial detection, eye tracking, and specific facial position cues to determine a subject's mood. There are many APIs that scan an image or video to detect faces, but these go the extra mile to spit back an emotive state. This is often a combination of weights assigned to the 7 basic emotions, plus valence, the subject's overall sentiment.
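To show what that output tends to look like in practice, here is a small Python sketch that parses a hypothetical JSON response and picks out the dominant emotion. The field names are made up; every provider structures its payload differently, so treat this purely as an illustration of the emotion-weights-plus-valence idea.

```python
import json

# Hypothetical payload: emotion weights plus an overall valence per face.
# Field names are invented; real providers each use their own schema.
sample_response = """
{
  "faces": [
    {
      "emotions": {"joy": 0.72, "sadness": 0.03, "anger": 0.01, "fear": 0.02,
                   "surprise": 0.15, "contempt": 0.04, "disgust": 0.03},
      "valence": 0.68
    }
  ]
}
"""

for face in json.loads(sample_response)["faces"]:
    emotions = face["emotions"]
    dominant = max(emotions, key=emotions.get)  # highest-weighted emotion
    print(f"dominant emotion: {dominant} ({emotions[dominant]:.0%}), "
          f"valence: {face['valence']:+.2f}")
```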

1: Emotient

Emotient is great for an ad campaign that wants to track attention, engagement, and sentiment from viewers. The RESTful Emotient Web API can be integrated into apps, or used to help power A/B testing. In addition to the API, there's a good account analytics panel.


2: Affectiva

With 3,289,274 faces analyzed to date, Affectiva is another solution for massive scale engagement detection. They offer SDKs and APIs for mobile developers, and provide nice visual analytics to track expressions over time. Visit their test demo to graph data points in response to viewing various ads.

3: EmoVu

Produced by Eyeris, EmoVu facial detection products incorporate machine learning and micro expression detection that allow an agency to “accurately measure their content’s emotional engagement and effectiveness on their target audience.” With a Desktop SDK, Mobile SDK, and an API for fine-grained control, EmoVu offers wide platform support, including many tracking features, like head position, tilt, eye tracking, eye open/close, and more. They offer a free demo with account creation.


4: Nviso

Switzerland-based Nviso specializes in emotion video analytics, using 3D facial imaging tech to monitor many different facial data points to produce likelihoods for 7 main emotions. Though no free demo is offered, Nviso claims to provide a real-time imaging API. The company has a solid reputation, having received an IBM award for smarter computing in 2013. With its international corporate vibe, Nviso may not be the choice for a developer looking for quick plug-and-play integration with immediate support.

5: Kairos

Kairos is a more SaaS-y startup in the facial detection arena. Its Emotion Analysis API is scalable and on-demand: you send them video, and they send back coordinates that detect smiles, surprise, anger, dislike, and drowsiness. They offer a free demo (no account setup required) that will analyze and graph your facial responses to a few commercial ads.

The sleek-branded Kairos could be a developer favorite. It looks actively supported, with a growing community and transparent documentation for its Face Recognition API, Crowd Analytics SDK, and Reporting API. The Emotion Analysis API just recently went live.

6: Project Oxford by Microsoft

Microsoft’s Project Oxford is a catalogue of artificial intelligence APIs focused on computer vision, speech, and language analysis. After the project’s age-guessing tool went viral last year for its “incongruities,” some may be reluctant to try Microsoft’s emotion detection capabilities (this is the app that thought Keanu was only 0.01831 sad).

Nordic APIs founders Travis Spencer and Andreas Krohn – 99% happy

The API only works with photos. It detects faces, and responds in JSON with ridiculously specific percentages for each face across the core 7 emotions, plus Neutral. Truncate the decimals and this is a very simple, to-the-point API, and a useful tool in the right situation. Upload a photo to the free online demo here to test Project Oxford's computer vision capabilities.
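As a rough sketch, a call to the Emotion API could look something like the following Python snippet, based on the Project Oxford documentation at the time of writing. The endpoint URL, header names, and response fields may well have changed since, so check the current docs before relying on any of it.

```python
import requests

# Hedged sketch of a Project Oxford Emotion API call; the endpoint, header
# name, and response fields reflect the docs at the time of writing and may
# have changed. SUBSCRIPTION_KEY and photo.jpg are placeholders.
SUBSCRIPTION_KEY = "your-subscription-key"
ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize"

with open("photo.jpg", "rb") as image_file:
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )

for face in response.json():          # one entry per detected face
    scores = face["scores"]           # anger, contempt, ... neutral
    top = max(scores, key=scores.get)
    print(f"{top}: {scores[top]:.2f}")  # truncate the absurdly long decimals
```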

7: Face Reader by Noldus

Used in the academic sphere, the Face Reader API by Noldus is based on machine learning, tapping into a database of 10,000 facial expression images. The API uses 500 key facial points to analyze 6 basic facial expressions as well as neutral and contempt. Face Reader also detects gaze direction and head orientation. Noldus seems to have a solid amount of research backing its software.


8: Sightcorp

Sightcorp is another facial recognition provider. Their Insight SDK offers wide platform support, tracks hundreds of facial points and eye gaze, and has been used in creative projects, museum showcases, and at TEDx Amsterdam. Sightcorp’s F.A.C.E. API (still in beta) is a cloud analysis engine for automated emotional expression detection.


9: SkyBiometry

SkyBiometry is a cloud-based face detection and recognition tool which allows you to detect emotion in photos. Upload a file, and SkyBiometry detects faces and classifies the mood as happy, sad, angry, surprised, disgusted, scared, or neutral, with a percentage for each. It also accurately determines whether a person is smiling or not. A benefit to SkyBiometry is that it’s a spin-off of a successful biometrics company, so the team’s been around for a while. Check out their free demo to see how it works, and view their extensive online API documentation.
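For a sense of the workflow, here is a hedged Python sketch of a SkyBiometry detection call, loosely modeled on their online documentation. The exact parameter names and response structure may differ from the current API version, so verify against the official docs.

```python
import requests

# Rough sketch of a SkyBiometry detection request; parameter and field names
# are taken from their public docs and may differ in the current API version.
API_KEY = "your-api-key"        # placeholder credentials
API_SECRET = "your-api-secret"

response = requests.get(
    "https://api.skybiometry.com/fc/faces/detect.json",
    params={
        "api_key": API_KEY,
        "api_secret": API_SECRET,
        "urls": "http://example.com/photo.jpg",  # placeholder image URL
        "attributes": "all",                     # ask for mood, smiling, etc.
    },
)

for photo in response.json().get("photos", []):
    for tag in photo.get("tags", []):
        mood = tag.get("attributes", {}).get("mood", {})
        print("detected mood:", mood.get("value"), mood.get("confidence"))
```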

10: Face++

From their developer center, the onboarding process for Face++ looks intuitive. Face++ is more of a face recognition tool that compares faces with stored faces — perfect for name tagging photos in social networks. It makes our list because it does determine if a subject is smiling or not. Face++ has a wide set of developer SDKs in various languages, and an online demo.


11: Imotions

Imotions is a biometrics research platform that provides software and hardware for monitoring many types of bodily cues. Imotions syncs with Emotient’s facial expression technology, and adds extra layers to detect confusion and frustration. The Imotions API can monitor live video feeds to extract valence, or can aggregate previously recorded videos to analyze for emotions. Imotions software has been used by Harvard, Procter & Gamble, Yale, and the US Air Force, and was even featured in a Mythbusters episode.


12: CrowdEmotion

CrowdEmotion offers an API that uses facial recognition to detect the time series of the six universal emotions as defined by psychologist Paul Ekman (happiness, surprise, anger, disgust, fear, and sadness). Their online demo will analyze facial points in real-time video and respond with detailed visualizations. They offer an API sandbox, along with free monthly usage for live testing. Check out the CrowdEmotion API docs for specific information.


13: FacioMetrics

Founded at Carnegie Mellon University (CMU), FacioMetrics is a company that provides SDKs for incorporating face tracking, pose and gaze tracking, and expression analysis into apps. Their demo video outlines some creative use cases in virtual reality scenarios. The software can be tested using the Intraface iOS app.


Findface

The Findface software utilizes the NtechLab face recognition algorithm to recognize 7 basic emotions as well as 50 complex attributes. It purportedly recognizes the 7 emotions (joy, surprise, sadness, anger, disgust, contempt, and fear) with 94% accuracy. Note: Findface does not offer a web API for emotive recognition; however, it does provide a powerful SDK.

Text to Emotion Software

There are many sentiment analysis APIs out there that provide categorization or entity extraction, but the APIs listed below specifically respond with an emotional summary given a body of plain text. Some keywords to understand here are natural language processing — the use of machines to detect “natural” human interaction, and deep linguistic analysis — the examination of sentence structure, and relationships between keywords to derive sentiment.

You could use these APIs to do things like inform social media engagement analytics, add new features to chat messaging, perform targeted news research, detect highly negative/positive customer experiences, or optimize publishing with AB testing.

14: IBM Watson

Powered by the supercomputer IBM Watson, the Tone Analyzer detects emotional tones, social propensities, and writing styles from any length of plain text. The API can be forked on GitHub. Input your own selection on the demo to see tone percentile, word count, and a JSON response. The IBM Watson Developer Cloud also powers other cool cognitive computing tools.
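A request could look roughly like the Python sketch below, assuming Bluemix-style service credentials and the v3 Tone endpoint. The version date, URL, and response fields are based on the documentation as we understood it and may have changed, so double-check against the current Watson docs.

```python
import requests

# Sketch of a Tone Analyzer request assuming Bluemix service credentials;
# the endpoint, version date, and response fields may differ from the
# current Watson Developer Cloud release.
USERNAME = "your-service-username"  # placeholder credentials
PASSWORD = "your-service-password"

response = requests.post(
    "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone",
    params={"version": "2016-05-19"},
    auth=(USERNAME, PASSWORD),
    json={"text": "I am thrilled with how this project turned out!"},
)

# Walk the document-level tone categories (emotion, language, social tone).
for category in response.json()["document_tone"]["tone_categories"]:
    for tone in category["tones"]:
        print(f"{tone['tone_name']}: {tone['score']:.2f}")
```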


15: Receptiviti

Backed by decades of language-psychology research, the Receptiviti Natural Language Personality Analytics API uses a process of target words and emotive categories to derive emotion and personality from texts. Their Linguistic Inquiry and Word Count (LIWC) text analysis process is even used by IBM Watson. With REST API endpoints and SDKs in all major programming languages, Receptiviti looks both powerful and usable.

16: AlchemyAPI

The Alchemy API scans large chunks of text to determine the relevance of keywords and their associated negative/positive connotations to get a sense of attitude or opinion. You can enter a URL to receive a grade of positive, mixed, or negative overall sentiment. Though it’s more for defining taxonomies and keyword relevance, the tool does offer an overall sentiment evaluation for the document. Check out the demo or Sentiment Analysis API docs.
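As an illustration, the URL-based sentiment call might look something like this Python sketch, based on AlchemyAPI's documented URLGetTextSentiment endpoint at the time of writing. Since AlchemyAPI is being folded into the Watson platform, the endpoint and response format may change.

```python
import requests

# Sketch of AlchemyAPI's URL-based sentiment call as documented at the time
# of writing; with AlchemyAPI folding into Watson, the endpoint and response
# format may change. The API key and article URL are placeholders.
API_KEY = "your-alchemy-api-key"

response = requests.get(
    "http://access.alchemyapi.com/calls/url/URLGetTextSentiment",
    params={
        "apikey": API_KEY,
        "url": "http://example.com/article.html",
        "outputMode": "json",
    },
)

doc_sentiment = response.json().get("docSentiment", {})
print("overall sentiment:", doc_sentiment.get("type"),  # positive / negative / neutral
      doc_sentiment.get("score"))                       # signed strength, if present
```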

17: Bitext

The Text Analysis API by Bitext is another deep linguistic analysis tool. It can be used to analyze word relations, sentence structure, and dependencies to extract bias with its “built-in sentiment scoring” functionality.


18: Mood Patrol

Hosted on the Mashape API marketplace, Mood Patrol by Soul Hackers Labs is a simple API that extracts emotion from text. It’s good for analyzing small sections of text for cues, and responds with fine-grained adjectives that describe the emotional tone based on Plutchik’s 8 basic emotions. Visit the Soul Hackers demo or API documentation.

19: Synesketch

Synesketch is basically the iTunes artwork player for the written word. It’s an innovative open source tool that analyzes text for sentiment, and converts emotional tone into some awesome visualizations. Talk about emotional intelligence — “[Synesketch] code feels the words”, dynamically representing text in animated visual patterns so as to reveal underlying emotion. A few third-party apps have already been constructed with this open source software to recognize and visualize emotion from Tweets, speech, poetry, and more.


20: Tone API

The Tone API is a speedy SaaS API built for marketers to quantify the emotional response to their content. The tool takes a body of text, analyzes it for emotional breadth and intensity, and compares it with other texts. It looks to be a cool service for automating in-house research to optimize smart content publishing.


21: Repustate API

The Repustate Sentiment Analysis process is based in linguistic theory, and reviews cues from lemmatization, polarity, negations, part of speech, and more to reach an informed sentiment from a text document. Check out info on their Text Analytics API.

Speech to Emotion Software

Lastly, humans also interact with machines via speech. There are plenty of speech recognition APIs on the market, whose results could be processed by other sentiment analysis APIs listed above. Perhaps this is why an easy-to-consume web API that instantly recognizes emotion from recorded voice is rare. Use cases for this tech could be:

  • Monitoring customer support centers
  • Providing dispatch squads automated emotional intelligence

22: Good Vibrations

The Good Vibrations API senses mood from recorded voice. The API and SDK use universal biological signals to perform a real-time analysis of the user’s emotion, sensing stress, pleasure, or disorder.

Not quite web APIs, but worth a mention: EMOSpeech is an enterprise software application that allows call centers to analyze emotion, and Audeering’s software detects emotion, tone, and gender in recorded voice.

23: Vokaturi

Vokaturi software purportedly can “understand the emotion in a speaker’s voice in the same way a human can.” With the Open Vokaturi SDK, developers can integrate Vokaturi into their apps. Given a database of speech recordings, the Vokaturi software will compute percent likelihoods for 5 emotive states: neutrality, happiness, sadness, anger, and fear. They provide code samples for working in C and Python.
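The Python sketch below is loosely based on the example code that ships with the Open Vokaturi SDK; the library path, module names, and function signatures depend on the SDK version and platform, so treat it as an outline rather than a drop-in script.

```python
import scipy.io.wavfile as wav
import Vokaturi  # Python wrapper distributed with the Open Vokaturi SDK

# Load the platform-specific shared library; the path below is a placeholder
# and depends on where you unpacked the SDK and which OS you are on.
Vokaturi.load("OpenVokaturi/lib/OpenVokaturi-linux64.so")

sample_rate, samples = wav.read("speech_sample.wav")  # placeholder recording
buffer_length = len(samples)

# Convert 16-bit PCM samples to floats in [-1, 1]; assumes a mono recording.
c_buffer = Vokaturi.SampleArrayC(buffer_length)
c_buffer[:] = samples[:] / 32768.0

voice = Vokaturi.Voice(sample_rate, buffer_length)
voice.fill(buffer_length, c_buffer)

quality = Vokaturi.Quality()
probabilities = Vokaturi.EmotionProbabilities()
voice.extract(quality, probabilities)

if quality.valid:
    print("neutral:", probabilities.neutrality)
    print("happy:  ", probabilities.happiness)
    print("sad:    ", probabilities.sadness)
    print("angry:  ", probabilities.anger)
    print("fear:   ", probabilities.fear)

voice.destroy()
```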

Conclusion: The Future of Emotion Recognition

Machine emotional intelligence is still evolving, but the future could soon see targeted ads that respond not only to our demographic (age, gender, likes, etc.), but also to our current emotional state. For point-of-sale advertising, this information could be leveraged to nudge sales when people are most emotionally vulnerable, which gets into some murky ethical territory. Emotion recognition via facial detection is also shady if the user isn't aware that they have consented to being visually recorded. There are, of course, data privacy legalities any API provider or consumer should be aware of before implementation.

We are only at the tip of the iceberg when it comes to machine-human interaction, but cognitive computing technologies like these are exciting steps toward creating true machine emotional intelligence.

Did we leave out any good Emotion Recognition APIs? Respond below or add to this Product Hunt list.