emotionally interactive

The Human Factory's unique "EMOTION CHIP" technology can allow for rich user experiences with an interactive software agent. For example, imagine advertisements that read the user’s emotional expressions and understand those expressions as offers in an implicit negotiation about how much of the advertisement the user is willing to watch in order to get to the service (e.g., an advertisement before the YouTube video service the user truly wants to see).
Below are some example capabilities that EMOTION CHIP technology provides, through which smarter interactive software agents can be built.
(1) What does the user want?
A user may be unhappy about the agent’s most recent behavior, but how do we know what the user actually wants? Having quantitative values for a half dozen emotional expressions from the user leaves this question unanswered unless one understands the theoretical machinery underlying those expressions.
Emotionally expressive interactions between two parties evolved so that language-less social creatures could come to a compromise, and thereby avoid a costly fight. Such an interaction amounts to a negotiation over something, whether it be how much cake each party gets, or how much ad-watching time the user will tolerate from the interactive software agent.
Our EMOTION CHIP technology understands the mathematics underlying emotional expressions, and how an expression amounts to a compromise offer that the expresser hopes the other party will accept. Beyond merely reading the user’s emotional expression, one can thereby glean the crucial piece of information: what compromise the user is actually proposing. The interactive software agent can modify its behavior (or agree to the user’s offer) accordingly. Without this information, the interactive software agent won’t even know what the user wants.
[There are related benefits as well, including being able to distinguish between what the user’s emotional expression is saying about himself (his own certainty about what he wants), versus what the user’s emotional expression is saying about the interactive software agent. For example, it is important to distinguish between the user showing pride (he himself is certain about what he wants) versus disdain (the user is claiming that the interactive software agent is less certain than the agent claims), with many concomitant semantic differences.]
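A minimal sketch of how such an inference might be consumed by an agent. The infer_offer() routine and its linear scoring rule are invented placeholders for illustration, not the actual EMOTION CHIP mathematics:

```python
from dataclasses import dataclass

@dataclass
class CompromiseOffer:
    watch_fraction: float  # fraction of the remaining ad the user implicitly offers to watch
    about_self: bool       # is the expression about the user himself, or about the agent?

def infer_offer(expression: dict) -> CompromiseOffer:
    """Toy stand-in for the EMOTION CHIP inference step (not its real math)."""
    negative = expression.get("anger", 0.0) + expression.get("disgust", 0.0)
    positive = expression.get("happiness", 0.0)
    # Illustrative rule: more negative affect implies a stingier offer.
    watch_fraction = max(0.0, min(1.0, 0.5 + 0.5 * (positive - negative)))
    # Illustrative rule: dominant disgust read as disdain -- a claim about the
    # agent's certainty -- per the pride-versus-disdain distinction above.
    about_self = expression.get("disgust", 0.0) <= expression.get("happiness", 0.0)
    return CompromiseOffer(watch_fraction, about_self)

reading = {"anger": 0.6, "disgust": 0.3, "happiness": 0.1, "surprise": 0.2}
print(infer_offer(reading))  # -> watch_fraction ≈ 0.1, about_self=False
```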
(2) How serious versus casual is the user?
In a casual situation, social interactions are playful and carry no risk of humiliation because there’s no social capital on the line. For example, I can be disdainful toward you, but because it’s casual it’s taken as teasing rather than insulting. So long as the user is expressing that he is casual (rather than serious), an interactive software agent can “get away with” more behaviors, e.g., showing more of an ad.
But if the user signals that he is serious, it means not only that he wants what he wants (see (1) above), but that he’s not messing around. He is losing patience, and the interactive software agent needs to know this so that it can modify its behavior accordingly (e.g., by agreeing to what the user wants, which might mean ending the advertisement immediately).
EMOTION CHIP technology understands how to calculate the quantitative level of seriousness versus casualness expressed in an emotion, and can relay that to the interactive software agent. Without this information, the interactive software agent may have no idea that it is about to lose its customer; or, alternatively, it may not realize that the user is currently casual and open to being shown more of the ad.
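One hedged sketch of how an agent might act on such a score, assuming a hypothetical seriousness() heuristic and an invented 0.7 threshold (neither is the actual EMOTION CHIP calculation):

```python
def seriousness(expression: dict) -> float:
    """Toy serious-vs-casual score in [0, 1]; higher means more serious."""
    # Illustrative assumption: anger and disgust signal seriousness,
    # while happiness and surprise signal a casual, playful register.
    serious = expression.get("anger", 0.0) + expression.get("disgust", 0.0)
    casual = expression.get("happiness", 0.0) + expression.get("surprise", 0.0)
    total = serious + casual
    return 0.5 if total == 0 else serious / total

def next_ad_action(expression: dict) -> str:
    score = seriousness(expression)
    if score > 0.7:
        return "end_ad_now"          # user is losing patience; agree to his offer
    elif score < 0.3:
        return "extend_ad"           # user is casual; the agent can "get away with" more
    return "continue_as_planned"

print(next_ad_action({"anger": 0.8, "happiness": 0.05}))  # -> end_ad_now
```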
(3) How disagreeable versus agreeable is each party in the interaction thus far?
Social interactions can lead to embarrassment because, as the two parties “argue”, one or both has put social capital on the line. If the user doesn’t get his way after having signaled strong disagreement (and thereby “bet” some social capital), he will feel embarrassed, or even humiliated. A social animal implicitly understands how his expression corresponds to a bet of social capital: the more disagreeable he has been, the more social capital he has essentially bet thus far, and the harder it can be to admit being wrong (or to agree to the other’s offer).
Social animals recognize how disagreeable each party has been thus far in an exchange, and can smartly modulate their next behavior on this basis. Sometimes that means conciliatory behavior, giving in to the other; such agreeable behavior lowers the social capital at stake, and thereby lowers the intensity of the discussion.
EMOTION CHIP technology can take the user’s recognized emotional expressions over the course of the interaction thus far and rigorously calculate the total disagreeability of the user, and the total disagreeability of the interactive software agent (e.g., the ad). This informs the software not just about the level of seriousness versus casualness (see (2) above) -- an interaction could be serious but also quite agreeable -- but about whether the discussion has gotten out of hand, with the user in danger of feeling embarrassed and leaving with a memorably bad experience. Without this, the interactive software agent may have no idea that the user perceives the interaction to have escalated into a heightened and uncomfortable argument.
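A minimal sketch of the turn-by-turn bookkeeping this describes; the per-turn scoring and the threshold below are invented for illustration, not the EMOTION CHIP’s rigorous calculation:

```python
class InteractionLedger:
    """Tracks the 'social capital bet' by each party, turn by turn."""

    ARGUMENT_THRESHOLD = 1.5  # invented cutoff for a heated exchange

    def __init__(self):
        self.totals = {"user": 0.0, "agent": 0.0}

    def record_turn(self, party: str, disagreeability: float) -> None:
        # disagreeability in [0, 1] per turn; higher means a bigger "bet".
        self.totals[party] += disagreeability

    def should_conciliate(self) -> bool:
        # If the combined stake is high, conciliatory behavior lowers the
        # social capital at risk and de-escalates the discussion.
        return sum(self.totals.values()) > self.ARGUMENT_THRESHOLD

ledger = InteractionLedger()
ledger.record_turn("user", 0.7)    # user pushed back hard
ledger.record_turn("agent", 0.5)   # the ad kept playing anyway
ledger.record_turn("user", 0.6)
print(ledger.should_conciliate())  # True: time to give in, e.g. end the ad
```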
~~~~~~~~~~~~~
Background on the EMOTION CHIP:
Most affective computing systems today include (at least) six dimensions -- anger, sadness, fear, disgust, happiness and surprise. Because people can mix multiple expressions together, there is a combinatorial explosion of possible expressive mixtures (729, in fact). But what do these more than 700 expressions actually mean? Building software to recognize a consumer’s emotional expression is a complicated challenge, but the power in doing so comes from understanding what the expressions mean, and thereby knowing how the software should respond.
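As a sanity check on that count: if each of the six dimensions can appear at one of three levels (our assumption here, e.g., absent, moderate, strong), there are 3^6 = 729 possible mixtures:

```python
from itertools import product

DIMS = ("anger", "sadness", "fear", "disgust", "happiness", "surprise")
LEVELS = ("absent", "moderate", "strong")  # assumed three-level encoding

# Every assignment of a level to each of the six dimensions.
mixtures = list(product(LEVELS, repeat=len(DIMS)))
print(len(mixtures))  # 729, i.e., 3 ** 6
```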
EMOTION CHIP technology addresses this issue, giving rigorous meanings to emotional expressions in the context of your business, answering questions such as:
- What is the consumer’s feeling about what just happened?
- What does the consumer want?
- How serious is he about it?
- What is the consumer’s opinion about the entity he is interacting with?
- How sure of himself is the consumer?
- How disagreeable is the interaction thus far?
This groundbreaking new theory is the subject of Human Factory founder Mark Changizi's sixth book, WHAT EMOTIONS MEAN.