
Enhancing the emotional transfer in performance capture

dc.contributor.advisor Gurevitch, Leon
dc.contributor.author Suzuki, Malino
dc.date.accessioned 2014-10-31T03:06:22Z
dc.date.accessioned 2022-11-03T01:36:34Z
dc.date.available 2014-10-31T03:06:22Z
dc.date.available 2022-11-03T01:36:34Z
dc.date.copyright 2014
dc.date.issued 2014
dc.identifier.uri https://ir.wgtn.ac.nz/handle/123456789/29526
dc.description.abstract In this thesis, I argue that the sophistication of motion capture can be advanced by refining the current method through two additional steps: one during capture, acquiring further emotional information such as physiological and neurological data, and the other in postproduction, enhancing the retargeting methodology to strengthen the delivery of the performance’s emotional content. This argument is supported by research from the domain of emotion theory, which explores fundamental questions about emotion in order to understand what we are capturing and transferring. Current motion capture methodology, which captures only the behavioral component of the performance, can be improved by addressing the greater complexity of emotion, which is not just behavioral but also involves changes in physiology, neurology, and conscious experience. A deeper understanding of emotion allows the emotional content of a motion capture performance to be amplified more accurately, enhancing the communicative value of digital characters. Theories of interpersonal communication are introduced to give perspective on the emotional and communicative intentions behind an expression. The proposal to strengthen the transmission of emotion from the actor to the digital character is referred to as emotion capture, and aims to portray the essence of the character. Clarity in understanding the production and perception of emotion increases the emotional connectivity between the human observer and the digital character. Through these investigations, practical implementations for enhancing current motion capture methodology are proposed. Implementing EMFACS (Emotion FACS) in facial rigs is discussed as a way for animators to exaggerate emotion easily: EMFACS sliders increase particular sets of facial muscle movements associated with specific emotions, helping artists saturate the emotion behind the captured performance. Expanding on these ideas, I propose a potential architecture for automated emotion capture. An introduction to affective computing (“computing that relates to, arises from, or deliberately influences emotion or other affective phenomena”) is followed by an exploration of the theories that best model the human emotion production and perception circuitry, in order to find an optimal design for a facial production and perception architecture. Research into the automation of emotion capture is of immense interest for improving the precision and speed with which motion capture animation can be delivered. Although the field is still in its early stages, its potential offers ample inspiration for subsequent research to pave new paths. en_NZ
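
The EMFACS sliders described in the abstract can be pictured as a mapping from an emotion label to a set of FACS action-unit weights that an animator scales up or down on a facial rig. The Python sketch below is only an illustration of that idea; the emotion-to-action-unit table, weights, and function names are assumptions for demonstration, not the rig implementation developed in the thesis.

    # Hypothetical EMFACS-style slider: scale a set of FACS action-unit (AU)
    # weights for one emotion. AU numbers follow FACS; the base weights here
    # are placeholders, not EMFACS's published values.
    EMOTION_AUS = {
        "happiness": {"AU6": 1.0, "AU12": 1.0},                    # cheek raiser, lip corner puller
        "sadness":   {"AU1": 1.0, "AU4": 0.8, "AU15": 1.0},        # inner brow raiser, brow lowerer, lip corner depressor
        "surprise":  {"AU1": 1.0, "AU2": 1.0, "AU5": 0.8, "AU26": 1.0},
    }

    def apply_emotion_slider(rig_weights, emotion, slider):
        """Boost the rig's action-unit weights for one emotion by a 0-1 slider value."""
        boosted = dict(rig_weights)
        for au, base in EMOTION_AUS.get(emotion, {}).items():
            boosted[au] = min(1.0, boosted.get(au, 0.0) + slider * base)
        return boosted

    # Example: exaggerate a subtly captured smile by 40%.
    captured = {"AU6": 0.3, "AU12": 0.35}
    print(apply_emotion_slider(captured, "happiness", 0.4))

In this reading, the slider does not replace the captured performance; it adds a bounded, emotion-specific offset on top of it, which is one plausible way an animator could "saturate" the emotion as the abstract describes.
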
dc.format pdf en_NZ
dc.language en_NZ
dc.language.iso en_NZ
dc.publisher Te Herenga Waka—Victoria University of Wellington en_NZ
dc.rights Access is restricted to staff and students only. For information please contact the library. en_NZ
dc.subject Emotion en_NZ
dc.subject Facial en_NZ
dc.subject Animation en_NZ
dc.title Enhancing the emotional transfer in performance capture en_NZ
dc.type Text en_NZ
vuwschema.contributor.unit School of Design en_NZ
vuwschema.subject.anzsrcfor 120302 Design Innovation en_NZ
vuwschema.subject.anzsrcfor 120307 Visual Communication Design (incl. Graphic Design) en_NZ
vuwschema.subject.anzsrcseo 970112 Expanding Knowledge in Built Environment and Design en_NZ
vuwschema.type.vuw Awarded Research Masters Thesis en_NZ
thesis.degree.discipline Design en_NZ
thesis.degree.grantor Te Herenga Waka—Victoria University of Wellington en_NZ
thesis.degree.level Masters en_NZ
thesis.degree.name Master of Design en_NZ

