Title: Automated Facial Animation Using Marker Point for Motion Extraction
Authors: Mehran Syed, Zeeshan Bhatti, Azar Akbar Memon, Zia Ahmed Shaikh, Ahmed Muhammad Sheikh, Nisar Ahmed Memon
Journal: Liaquat Medical Research Journal
Publisher: Liaquat University of Medical and Health Sciences, Jamshoro
Country: Pakistan
Year: 2024
Volume: 6
Issue: 4
Language: en
DOI: 10.38106/LMRJ.2024.6.4-07
Keywords: Algorithm; Covariance; Mahalanobis distance; Tracker; Hough transform
In this research work, an automated technique for generating 3D facial expressions from real-life video of face motion is presented. Facial expressions are extracted from a real human face using the Hough transform to obtain the x and y coordinates of the markers, a covariance matrix to detect the face marker points, and the Mahalanobis distance to measure how far each marker point moves between frames. Markers are placed at key positions over the facial muscles; their positions are located in every frame of a pre-recorded face video, and the distance algorithm measures the movement of each muscle. The markers are detected and tracked by the system using color segmentation: the color of each marker point is detected, and its location and displacement are tracked across frames. The original and translated positions of every marker point are recorded as vector values in a text file. These tracked values are then transferred into 3D animation software such as MAYA and applied to a pre-rigged 3D model of a human face, where the face is rigged with joints that emulate the behavior of the facial muscles.
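The following is a minimal sketch of the marker-tracking stage described in the abstract, assuming green circular markers, OpenCV/NumPy, and an input file named face_video.mp4; the thresholds, file names, and frame-to-frame matching by index are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: color-segmentation marker detection plus
# Mahalanobis distance between marker positions in consecutive frames.
import cv2
import numpy as np

def detect_markers(frame_bgr, lower_hsv=(40, 80, 80), upper_hsv=(80, 255, 255)):
    """Color segmentation: isolate marker-colored pixels (assumed green)
    and return the (x, y) centroid of each connected blob."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    _, _, _, centroids = cv2.connectedComponentsWithStats(mask)
    return centroids[1:]  # drop the background component

def mahalanobis(p, q, cov_inv):
    """Mahalanobis distance between two 2-D marker positions."""
    d = p - q
    return float(np.sqrt(d @ cov_inv @ d))

cap = cv2.VideoCapture("face_video.mp4")  # pre-recorded face video (assumed path)
prev = None
with open("marker_tracks.txt", "w") as out:  # vector values written per frame
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pts = detect_markers(frame)
        if prev is not None and len(pts) == len(prev):
            # Covariance of the observed positions; its inverse scales the distance.
            cov = np.cov(np.vstack([prev, pts]).T) + 1e-6 * np.eye(2)
            cov_inv = np.linalg.inv(cov)
            for i, (p, q) in enumerate(zip(prev, pts)):
                out.write(f"{i} {q[0]:.2f} {q[1]:.2f} {mahalanobis(p, q, cov_inv):.4f}\n")
        prev = pts
cap.release()
```

In practice the markers would need a consistent ordering or nearest-neighbor matching between frames rather than matching by index; the resulting per-marker displacement values in the text file could then be mapped onto the joints of the pre-rigged face model in MAYA.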