First Place, 2010

Avatar - a Virtual Human

Sangyoon Lee, Computer Science

The research project LifeLike aims to design and develop a lifelike computer interface, called an avatar. My research has focused on recreating a visually compelling digital version of a real human, Dr. Alexander Schwarzkopf, a long-standing program manager at NSF. The prototype system is designed to help researchers prepare a proposal through interactive question-and-answer sessions with the avatar.

We have seen many instances of avatars in movies and video games. A single frame of a movie scene takes hours to render, while characters in games look too cartoonish to convey the sense of natural face-to-face human communication. The LifeLike application, however, renders an avatar fast enough to support real-time interaction while producing photorealistic details comparable to a real person. The success of the project brings us one step closer to preserving humans themselves in a natural way, as opposed to preserving only their knowledge in written form.

The image is captured from real-time rendering on a 4K-resolution display (four times the resolution of HDTV). The avatar in the current LifeLike application looks very close to a real human. Or, as Dr. Schwarzkopf put it when he interacted with his avatar in January 2010, "It's here. It looks like me."

Credits: This research is an NSF-funded collaborative effort between the Electronic Visualization Laboratory, Computer Science (Visualization - Jason Leigh, Andrew Johnson, Luc Renambot, Sangyoon Lee) and the Communication Department (Behavioral Modeling and Evaluation - Steve Jones, Gordon Carlson) at the University of Illinois at Chicago, and the University of Central Florida (Knowledge and Speech Recognition - Avelino Gonzalez, Ronald DeMara, Victor Hung, JR Hollister, Miguel Elvir).