Semi-Automatic Retargeting for Facial Expressions of 3D Characters with Fuzzy Logic Based on Blendshape Interpolation
Abstract
Facial motion capture is considered the most effective technique for transferring a human's natural facial expressions to a 3D virtual character's face, especially in terms of production speed. However, the resulting expressions can still lack expressiveness, particularly when the 3D character's facial features differ from those of the real performer in the target application. In this research, the basic facial expressions produced by the facial motion retargeting process were corrected using a blendshape interpolation method driven by fuzzy logic. Blendshape interpolation combines multiple shapes into a single blend using the concept of interpolation; here the blendshape process uses linear interpolation, in which each vertex of the blendshape moves along a straight line. The blendshape method acts as a corrector on the results of the retargeting process. The blendshape weights are assigned automatically from fuzzy logic calculations that take the marker positions of the facial motion retargeting as input. These weights are then used to refine the result into a more expressive expression. This process is easier and faster than adjusting each vertex point manually, one by one. To avoid the appearance of irregular (haphazard) movement, a weight constraint is applied that limits each weight to the range [0, 1].
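The pipeline described in the abstract can be sketched in a few lines: a fuzzy membership function maps a marker displacement to a blendshape weight, and linear blendshape interpolation moves each vertex along a straight line toward the weighted target, with the weight clamped to [0, 1]. This is a minimal illustration, not the paper's implementation; the triangular membership function and its parameters are assumptions for demonstration.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function (hypothetical parameters a, b, c).
    Returns a degree in [0, 1] that can serve as a blendshape weight."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def blend(neutral, targets, weights):
    """Linear blendshape interpolation with a [0, 1] weight constraint.

    neutral: list of (x, y, z) vertex tuples for the neutral face.
    targets: list of blendshape targets, each with the same vertex count.
    weights: one weight per target, e.g. produced by the fuzzy stage.
    """
    out = []
    for i, (nx, ny, nz) in enumerate(neutral):
        x, y, z = float(nx), float(ny), float(nz)
        for target, w in zip(targets, weights):
            w = max(0.0, min(1.0, w))  # weight constraint: clamp to [0, 1]
            tx, ty, tz = target[i]
            # Each vertex travels on a straight line toward the target shape.
            x += w * (tx - nx)
            y += w * (ty - ny)
            z += w * (tz - nz)
        out.append((x, y, z))
    return out

# Usage: a marker displacement of 0.6 (in some normalized unit) yields
# weight 0.6, which half-opens the (toy, two-vertex) "smile" target.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
w = triangular(0.6, 0.0, 1.0, 2.0)
blended = blend(neutral, [smile], [w])
```

The clamp inside `blend` is the weight constraint mentioned in the abstract: without it, an out-of-range fuzzy output would extrapolate the vertex past the target shape and produce haphazard movement.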
Keywords: Blendshape, retargeting, fuzzy logic, facial motion capture.
The copyright to this article is transferred to Politeknik Elektronika Negeri Surabaya (PENS) if and when the article is accepted for publication. The undersigned hereby transfers any and all rights in and to the paper including without limitation all copyrights to PENS. The undersigned hereby represents and warrants that the paper is original and that he/she is the author of the paper, except for material that is clearly identified as to its original source, with permission notices from the copyright owners where required. The undersigned represents that he/she has the power and authority to make and execute this assignment. The copyright transfer form can be downloaded here.
The corresponding author signs for and accepts responsibility for releasing this material on behalf of any and all co-authors. This agreement is to be signed by at least one of the authors who have obtained the assent of the co-author(s) where applicable. After submission of this agreement signed by the corresponding author, changes of authorship or in the order of the authors listed will not be accepted.
Retained Rights/Terms and Conditions
- Authors retain all proprietary rights in any process, procedure, or article of manufacture described in the Work.
- Authors may reproduce or authorize others to reproduce the work or derivative works for the author’s personal use or company use, provided that the source and the copyright notice of Politeknik Elektronika Negeri Surabaya (PENS) publisher are indicated.
- Authors are allowed to use and reuse their articles under the same CC-BY-NC-SA license as third parties.
- Third-parties are allowed to share and adapt the publication work for all non-commercial purposes and if they remix, transform, or build upon the material, they must distribute under the same license as the original.
Plagiarism Check
To avoid plagiarism, the manuscript will be checked twice by the Editorial Board of the EMITTER International Journal of Engineering Technology (EMITTER Journal) using the iThenticate Plagiarism Checker and the CrossCheck plagiarism screening service. The similarity score of a manuscript should be less than 25%. A manuscript that plagiarizes another author's work, or the author's own, will be rejected by EMITTER Journal.
Authors are expected to comply with EMITTER Journal's plagiarism rules by downloading and signing the plagiarism declaration form here and resubmitting the form, along with the copyright transfer form via online submission.