This project is an outgrowth of my Voight-Kampff project. My goal is to apply what I've learned (and am continuing to learn) about physiological markers for emotion toward better methods of facilitating communication for people who have difficulty expressing emotions through facial expressions, vocal intonation, or body movements.
There are currently a number of applications for detecting emotions through analysis of facial expressions (Microsoft's Emotion API, Empatica's facial analysis, etc.). These do an impressive job of detecting emotional expression, and their capabilities are growing daily, but they only work for people who have no impediments to expressing emotion.
The above technologies offer great promise in helping people learn to recognize the non-verbal signals of emotion in others. This application is already being pursued, particularly to help people who are on the autism spectrum.
For those who, for a variety of reasons, have difficulty outwardly expressing emotion, there do not seem to be similar efforts (at least none that I've found so far - please let me know if I'm incorrect!). The primary goal of viEwMotion is to address that very issue.
There are numerous other potential applications for viEwMotion, ranging from research into emotional response (especially by Citizen Scientists), to game design and play (increased immersion when a game responds to player emotions), to market research (analyzing emotional response to advertisements, etc.).