VR Talk ~ A web-enabled speech driven talking head system


We present an approach that animates facial expressions through speech analysis. An individualized 2.5D head model is first generated by modifying a generic head model on which a set of MPEG-4 Facial Definition Parameters (FDPs) is pre-defined. To animate the head model, a speech analysis module extracts mouth shapes, which are converted into MPEG-4 Facial Animation Parameters (FAPs) that drive the model with the corresponding facial expressions. The approach has been implemented as a real-time speech-driven facial animation system.
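The mouth-shape-to-FAP step above can be sketched as a small lookup-and-smoothing routine. This is only an illustrative sketch, not the system's actual code: the viseme names, amplitude values, and the particular FAP numbers used here are assumptions for demonstration (MPEG-4 does define numbered FAPs such as an open-jaw parameter, but the exact table below is hypothetical).

```python
# Hypothetical viseme table: each entry maps FAP number -> displacement
# (in MPEG-4 Facial Animation Parameter Units). Numbers are illustrative.
VISEME_TO_FAPS = {
    "silence": {3: 0,   51: 0,    52: 0},
    "aa":      {3: 600, 51: -150, 52: 150},   # wide-open mouth
    "uw":      {3: 200, 51: 100,  52: -100},  # rounded lips
    "mm":      {3: 0,   51: 0,    52: 0},     # closed lips
}

def visemes_to_fap_frames(visemes, blend=0.5):
    """Convert a viseme sequence into per-frame FAP dictionaries,
    inserting one interpolated frame between consecutive visemes
    so the mouth does not jump abruptly between shapes."""
    frames = []
    prev = VISEME_TO_FAPS["silence"]
    for v in visemes:
        target = VISEME_TO_FAPS[v]
        # transition frame: linear blend between previous and target shape
        mid = {k: int(prev.get(k, 0) * (1 - blend) + target[k] * blend)
               for k in target}
        frames.append(mid)
        frames.append(dict(target))
        prev = target
    return frames

frames = visemes_to_fap_frames(["mm", "aa", "uw"])
```

Each resulting frame is a set of FAP values that a renderer could apply to the head model; the real system performs this mapping in real time from the analyzed speech.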

The web-enabled talking head is available as a plug-in and an ActiveX control for internet browsers.

A detailed introduction can be found in our CAS 2000 paper (see the publication list).

Demo of VR Talk (try it on-line!)

VRTalk introducing a product

VRTalk with VideoVR


System requirements:
Browser: Netscape Navigator 4.0 or later, or Microsoft Internet Explorer 3.0 or later
Hardware: Pentium II 266 MHz CPU and a VGA card with OpenGL hardware acceleration
OS: MS Windows 95/98

VR Talk can transmit virtual video control data and encoded speech as a very-low-bit-rate stream.

Copyright (c) 1999, Communication & Multimedia Lab, NTU; CyberLink Corp.

Go back to "I-Chen's project webpage (English)"