Mental health monitoring through 'selfie' videos, social media tracking
- Date: January 29, 2015
- Source: University of Rochester
- Summary: An innovative approach to turn any computer or smartphone with a camera into a personal mental health monitoring device has been created by researchers. The computer program can analyze "selfie" videos recorded by a webcam as the person engages with social media, to extract a number of "clues," such as heart rate, blinking rate, eye pupil radius, and head movement rate.
Researchers at the University of Rochester have developed an innovative approach to turn any computer or smartphone with a camera into a personal mental health monitoring device.
In a paper to be presented this week at the AAAI Conference on Artificial Intelligence in Austin, Texas, Professor of Computer Science Jiebo Luo and his colleagues describe a computer program that can analyze "selfie" videos recorded by a webcam as the person engages with social media.
Health-monitoring apps are widely used, from tracking the spread of the flu to providing guidance on nutrition and managing mental health issues. Luo explains that his team's approach is to "quietly observe your behavior" while you use the computer or phone as usual. He adds that their program is "unobtrusive; it does not require the user to explicitly state what he or she is feeling, input any extra information, or wear any special gear." For example, the team was able to measure a user's heart rate simply by monitoring very small, subtle changes in the user's forehead color. The system does not collect other data that might be available through the phone, such as the user's location.
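Measuring a pulse from tiny color changes in the face is the idea behind remote photoplethysmography. The sketch below is not the authors' code; it assumes the forehead region has already been tracked and averaged per frame, and it simply picks the dominant frequency in the plausible heart-rate band.

```python
# Minimal sketch (assumptions: `forehead_means` holds per-frame mean forehead
# color intensities and `fps` is the webcam frame rate; neither comes from the paper).
import numpy as np

def estimate_heart_rate_bpm(forehead_means: np.ndarray, fps: float) -> float:
    """Estimate the dominant pulse frequency, in beats per minute, from a
    1-D signal of per-frame mean forehead color intensities."""
    signal = forehead_means - forehead_means.mean()        # remove the DC offset
    windowed = signal * np.hanning(len(signal))            # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(windowed), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)                 # plausible heart rates: 42-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic example: a 1.2 Hz (72 bpm) pulse sampled at 30 frames per second.
fps = 30.0
t = np.arange(0, 20, 1.0 / fps)
fake_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(len(t))
print(round(estimate_heart_rate_bpm(fake_signal, fps)))    # ~72
```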
The researchers were able to analyze the video data to extract a number of "clues," such as heart rate, blinking rate, eye pupil radius, and head movement rate. At the same time, the program also analyzed what the users posted on Twitter, what they read, how fast they scrolled, their keystroke rate, and their mouse click rate. Not every input is treated equally, though: what a user tweets, for example, is given more weight than what the user reads, because it is a direct expression of what that user is thinking and feeling.
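As an illustration of how such signals might be combined, the sketch below gathers the video-derived and interaction-derived clues into one weighted feature vector. The field names and weights are hypothetical stand-ins; the only point taken from the article is that authored tweets count for more than content the user merely read.

```python
# Illustrative sketch: combining per-session "clues" into a weighted feature vector.
# All names and weight values below are assumptions, not taken from the paper.
from dataclasses import dataclass

@dataclass
class SessionFeatures:
    heart_rate_bpm: float        # derived from the webcam video
    blink_rate: float
    pupil_radius: float
    head_movement_rate: float
    tweet_sentiment: float       # sentiment of what the user wrote, in [-1, 1]
    read_sentiment: float        # sentiment of what the user read, in [-1, 1]
    scroll_speed: float
    keystroke_rate: float
    mouse_click_rate: float

# Authored tweets are a direct expression of the user's state, so they get a
# larger weight than passively read content; the exact numbers are made up.
FEATURE_WEIGHTS = {"tweet_sentiment": 2.0, "read_sentiment": 0.5}

def to_weighted_vector(features: SessionFeatures) -> list:
    """Flatten a session's clues into one feature vector, applying per-feature weights."""
    return [
        getattr(features, name) * FEATURE_WEIGHTS.get(name, 1.0)
        for name in features.__dataclass_fields__
    ]

example = SessionFeatures(72, 0.3, 3.5, 0.1, -0.8, 0.2, 1.4, 2.1, 0.6)
print(to_weighted_vector(example))
```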
To calibrate the system and generate a reaction they can measure, Luo explained, he and his colleagues enrolled 27 participants in a test group and "sent them messages, real tweets, with sentiment to induce their emotion." This allowed them to gauge how subjects reacted after seeing or reading material considered to be positive or negative.
They compared the outcome from all their combined monitoring with the users' self-reports about their feelings to find out how well the program actually performs, and whether it can indeed tell how the user feels. Combining the data gathered by the program with the users' self-reported state of mind (called the ground truth) allows the researchers to train the system. The program then learns to infer, from the gathered data alone, whether the user is feeling positive, neutral or negative.
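A minimal sketch of that training step, with scikit-learn logistic regression standing in for whatever model the researchers actually used: feature vectors gathered during monitoring are paired with the users' self-reported moods (the ground truth), and a three-class positive/neutral/negative classifier is fit. The data here are synthetic.

```python
# Sketch of supervised training on monitored sessions labelled with self-reports.
# The model choice and the synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 9))                                   # 9 clues per session
y = rng.choice(["negative", "neutral", "positive"], size=300)   # self-reported moods

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a three-class classifier on the labelled sessions.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# Once trained, the model infers the user's state from the gathered clues alone.
print(clf.predict(X_test[:3]))
print("held-out accuracy:", clf.score(X_test, y_test))
```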
Their program currently classifies emotions only as positive, neutral or negative. Luo says that he hopes to add extra sensitivity by teaching the program to further distinguish a negative emotion as, for example, sadness or anger. For now the system is a demonstration program rather than an app, but the team plans to build an app that would let users become more aware of their emotional fluctuations and make adjustments themselves.
Luo understands that this program and others that aim to monitor an individual's mental health or well-being raise ethical concerns that need to be considered. He acknowledges that using this system means "effectively giving this app permission to observe you constantly," but notes that the program is designed for the user's own use and does not share data with anyone else unless the user designates otherwise.
Story Source:
Materials provided by University of Rochester. Note: Content may be edited for style and length.