Brian Henson - Hollywood CA, US; Jeff Forbes - Valencia CA, US; Michael Babcock - Los Angeles CA, US; Glenn Muravsky - Studio City CA, US; John Criswell - Tujunga CA, US
International Classification:
G06T 13/00
US Classification:
345473, 345419, 345420
Abstract:
A real-time method for producing an animated performance is disclosed. The method involves receiving animation data used to animate a computer generated character. The animation data may comprise motion capture data, puppetry data, or a combination thereof. A computer generated animated character is rendered in real-time as the animation data is received. The body movements of the computer generated character may be based on the motion capture data, while the head and facial movements are based on the puppetry data. A first view of the computer generated animated character is created from a first reference point, and a second view is created from a second reference point that is distinct from the first. One or more of the first and second views of the computer generated animated character are displayed in real-time as the animation data is received.
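The pipeline the abstract describes (merge mocap body data with puppetry head/face data, then render the combined character from multiple reference points) could be sketched as follows. All names and data structures here are illustrative assumptions, not taken from the patent; rendering is stubbed as a description string.

```python
from dataclasses import dataclass

@dataclass
class AnimationFrame:
    """One frame of combined animation data (hypothetical structure)."""
    body_pose: dict      # joint name -> position, from motion capture
    head_pose: tuple     # head orientation, from puppetry input
    face_controls: dict  # facial control name -> value, from puppetry input

def combine_inputs(motion_capture, puppetry):
    """Merge motion-capture body data with puppetry head/face data."""
    return AnimationFrame(
        body_pose=motion_capture["joints"],
        head_pose=puppetry["head"],
        face_controls=puppetry["face"],
    )

def render_views(frame, reference_points):
    """Produce one rendered view per camera reference point.
    A real system would rasterize the character; this stub only
    summarizes what each view would contain."""
    return [
        f"view from {point}: body={len(frame.body_pose)} joints, "
        f"head={frame.head_pose}, face={len(frame.face_controls)} controls"
        for point in reference_points
    ]

# Example: one incoming frame, two distinct reference points.
mocap = {"joints": {"hip": (0, 1.0, 0), "spine": (0, 1.3, 0)}}
puppet = {"head": (0.0, 15.0, 0.0), "face": {"jaw_open": 0.4, "brow": 0.1}}
frame = combine_inputs(mocap, puppet)
views = render_views(frame, [(0, 2, 5), (3, 2, 0)])
```

In a live setup this would run once per incoming frame, so each displayed view stays synchronized with the performer's input.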
System And Method Of Animating A Character Through A Single Person Performance
Brian Henson - Hollywood CA, US; Jeff Forbes - Valencia CA, US; Michael Babcock - Los Angeles CA, US; Glenn Muravsky - Studio City CA, US; John Criswell - Tujunga CA, US
Assignee:
The Jim Henson Company, Inc. - Hollywood CA
International Classification:
G06T 15/70
US Classification:
345474, 345473
Abstract:
A method of animating a computer generated character in real-time through a single person performance is disclosed. The method provides a mobile input device configured to receive hand puppetry movements as input from a performer, as well as a motion capture device that includes a plurality of markers and is configured to be worn on the body of the performer. Motion capture data representative of the positions of the plurality of markers is received at a computer, along with input device data from the mobile input device. A computer generated animated character is then generated, the body movements of the character being based on the motion capture data, and the head and facial movements being based on the input device data received from the mobile input device.
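The single-performer flow in the abstract above (body solved from worn markers, head and face driven by the handheld device) could be sketched like this. The marker names, skeleton map, and device fields are hypothetical placeholders, not details from the patent.

```python
# Hypothetical mapping from tracked marker names to character body joints.
SKELETON_MAP = {"hip": "marker_hip", "chest": "marker_chest", "l_hand": "marker_lh"}

def solve_body_from_markers(marker_positions, skeleton_map=SKELETON_MAP):
    """Map captured marker positions onto the body joints the character uses;
    markers not seen this frame are simply skipped."""
    return {joint: marker_positions[marker]
            for joint, marker in skeleton_map.items()
            if marker in marker_positions}

def animate_frame(marker_positions, device_sample):
    """Build one character frame: body from the worn markers, head and
    facial movement from the mobile input device sample."""
    return {
        "body": solve_body_from_markers(marker_positions),
        "head": device_sample["orientation"],      # head movement from device pose
        "face": device_sample["finger_channels"],  # facial controls from hand input
    }

# One frame from the single performer: body markers plus the handheld device.
markers = {"marker_hip": (0, 1.0, 0), "marker_chest": (0, 1.4, 0)}
device = {"orientation": (5.0, -10.0, 0.0), "finger_channels": {"jaw": 0.7}}
character = animate_frame(markers, device)
```

Because both data streams arrive at the same computer, one performer can drive the full character: gross body motion through the worn markers and head/facial nuance through the device in hand.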