The avatar psychologist (right) talks to a test subject.
(Credit: Video screenshot by Leslie Katz/CNET)
It's nice to think each of us is entirely unique, a one-of-a-kind aggregate of life experiences colliding with genes that set us apart from everyone else. And while that's true to an extent, it's also true that certain telltale patterns give us away, right down to the way we move our faces when we are, say, depressed.
So researchers at the University of Southern California's Institute for Creative Technologies are developing a Kinect-driven avatar they call SimSensei to track and analyze, in real time, a person's facial movements, body posture, linguistic patterns, acoustics, and behaviors such as fidgeting, cues that, taken together, can signal psychological distress.
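The article doesn't say how SimSensei actually quantifies fidgeting, but one simple proxy, given the stream of skeleton joint positions a Kinect tracker emits, is average joint speed over a clip. The sketch below is an illustration only; the function name, array shapes, and mock data are assumptions, not the team's method.

```python
import numpy as np

def fidget_score(joints, fps=30.0):
    """Mean per-joint speed over a clip of skeleton-tracking data.

    joints: array of shape (frames, num_joints, 3) holding the x, y, z
    position of each tracked joint, one row per captured frame.
    Returns average speed in position units per second; a restless
    subject should score higher than a still one.
    """
    velocities = np.diff(joints, axis=0) * fps   # frame-to-frame displacement -> velocity
    speeds = np.linalg.norm(velocities, axis=2)  # speed magnitude per joint, per frame
    return float(speeds.mean())

# Example: 10 seconds of mock data for a 20-joint skeleton at 30 fps,
# modeled as random walks with small (still) and large (restless) steps.
rng = np.random.default_rng(0)
still = rng.normal(scale=0.001, size=(300, 20, 3)).cumsum(axis=0)
restless = rng.normal(scale=0.01, size=(300, 20, 3)).cumsum(axis=0)
print(fidget_score(still), fidget_score(restless))
```

A real system would compute this over a sliding window and combine it with the other cues (posture, gaze, speech), but the windowed-average idea is the same.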
In work to be presented at the Automatic Face and Gesture Recognition conference in Shanghai later this month, Stefan Scherer and colleagues combined facial-recognition software with the depth-sensing camera in Microsoft's Kinect to develop the avatar psychologist. They then used the Kinect to record interviews with volunteers who had already been identified as healthy or as suffering from depression or post-traumatic stress disorder, and from those recordings built the catalog of movements and behaviors the avatar screens for.
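The article doesn't describe how those labeled interviews become a screening model, but the standard pattern is supervised learning: one feature vector per interview, labeled with the volunteer's group, fed to a classifier. Here's a minimal sketch with mock data and a generic scikit-learn model; the features, model choice, and labels are placeholders, not the team's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Mock per-interview features, e.g. mean fidget score, fraction of time
# gaze is averted, smile intensity, average pause length in speech.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))       # 60 fake interviews, 4 features each
y = rng.integers(0, 3, size=60)    # labels: 0 healthy, 1 depression, 2 PTSD

clf = RandomForestClassifier(n_estimators=200, random_state=0)
# 5-fold cross-validated accuracy; near chance here since the data is random.
print(cross_val_score(clf, X, y, cv=5).mean())
```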