
Automatic facial recognition using VGG16 based transfer learning model

Abstract

Detecting and analyzing emotions from human facial movements is a problem that has been studied for many years because of the benefits it brings. Facial expression recognition is a crux of human-computer interaction (HCI) research, with applications being explored in security, medical science, and understanding the behavior of a person or community. In this paper, we propose a deep learning framework that uses transfer learning for facial expression recognition. The approach takes the existing VGG16 model, already trained on the 1000-class ImageNet dataset, and concatenates additional layers on top of it. The resulting model has been verified on the Extended Cohn-Kanade (CK+) and Japanese Female Facial Expression (JAFFE) benchmark datasets, two popular facial expression datasets. The proposed model achieves 94.8% accuracy on CK+ and 93.7% on JAFFE, and is found superior to existing techniques. We implemented the proposed technique on Google Colab with GPU support, which helped us process these data.
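The transfer-learning setup described in the abstract can be sketched in Keras: a frozen VGG16 convolutional backbone with new classification layers concatenated on top. This is a minimal illustration, not the authors' exact architecture; the layer sizes and seven-class output are assumptions, and in practice `weights="imagenet"` would be passed to reuse the pretrained features (`weights=None` here keeps the sketch runnable offline).

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16


def build_expression_model(num_classes=7, input_shape=(224, 224, 3), weights=None):
    """Sketch of a VGG16-based transfer-learning classifier.

    Pass weights="imagenet" to load the pretrained backbone; the
    head sizes (256 units, 0.5 dropout) are illustrative assumptions.
    """
    base = VGG16(include_top=False, weights=weights, input_shape=input_shape)
    base.trainable = False  # freeze the pretrained convolutional layers

    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


model = build_expression_model()
```

Training then amounts to calling `model.fit` on preprocessed face images resized to the VGG16 input resolution, with only the new head's weights being updated.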
