Manifold-enhanced CycleGAN for facial expression synthesis

Yating Huang & Hujun Yin
Pages 181-193 | Received 20 Jan 2022, Accepted 12 Dec 2022, Published online: 20 Feb 2023
 

ABSTRACT

Facial expressions convey a great deal of information in interpersonal communication and social interaction and have a strong influence on face recognition. This paper investigates expression synthesis methods based on deep learning and their performance in expression classification and face recognition. We propose a new synthesis method based on the active appearance model and CycleGAN. The eigendecomposition method is used to obtain the expression shape of a target subject through linear operations, and an expression intensity coefficient is developed to control the dynamics of generated expressions. Subsequently, a generative model is introduced for the synthesis of textures. The identity loss is integrated into the traditional CycleGAN model, alongside the adversarial loss and the cycle-consistency loss, to better preserve the texture attributes and identity information of target subjects. Experiments and comparisons show that the proposed method offers marked improvements in verification of expressions and identities, as well as in generalization across databases.
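The abstract states that an identity loss is combined with the adversarial and cycle-consistency losses. As a point of reference, a minimal sketch of such a combined objective in the standard two-generator CycleGAN setting is given below; the weighting coefficients \lambda_{cyc} and \lambda_{id} and the exact form of each term are assumptions drawn from common CycleGAN formulations, not the paper's own definitions.

% Sketch of a CycleGAN objective augmented with an identity term,
% assuming the standard two-generator setting (G: X -> Y, F: Y -> X)
% with discriminators D_X and D_Y. Weights are illustrative only.
\begin{aligned}
\mathcal{L}(G, F, D_X, D_Y)
  &= \mathcal{L}_{GAN}(G, D_Y, X, Y) + \mathcal{L}_{GAN}(F, D_X, Y, X) \\
  &\quad + \lambda_{cyc}\,\mathcal{L}_{cyc}(G, F) + \lambda_{id}\,\mathcal{L}_{id}(G, F), \\
\mathcal{L}_{cyc}(G, F)
  &= \mathbb{E}_{x}\big[\lVert F(G(x)) - x \rVert_1\big]
   + \mathbb{E}_{y}\big[\lVert G(F(y)) - y \rVert_1\big], \\
\mathcal{L}_{id}(G, F)
  &= \mathbb{E}_{y}\big[\lVert G(y) - y \rVert_1\big]
   + \mathbb{E}_{x}\big[\lVert F(x) - x \rVert_1\big].
\end{aligned}

Here the identity term penalises changes when a generator is fed an image already in its target domain, which is one common way to encourage preservation of texture attributes and subject identity.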

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Yating Huang

Yating Huang received the B.Eng. degree in electronic science and technology from Nanjing University of Science and Technology in 2016, and the MSc degree in communications and signal processing from The University of Manchester in 2021. She is currently pursuing the PhD degree in the Department of Electrical and Electronic Engineering at The University of Manchester. Her current research interests include deep learning, self-supervised learning and medical image analysis.

Hujun Yin

Hujun Yin received his B.Eng. degree in electronic engineering and MSc degree in signal processing from Southeast University, and his PhD degree in neural networks from the University of York. Since 1996, he has been with the Department of Electrical and Electronic Engineering, The University of Manchester, where he is currently a Professor of Artificial Intelligence. He is also the Head of Business Engagement in AI and Data for the Faculty of Science and Engineering. His research areas are AI, machine learning, deep learning, signal/image processing, face recognition, time series modelling, bio-/neuro-informatics, and interdisciplinary applications ranging from medical and plant diagnosis, manufacturing inspection, and power network optimisation to recycling automation. He has published over 200 peer-reviewed articles and supervised twenty-five PhD students. Prof. Yin has received research funding from the UK research councils EPSRC and BBSRC, Innovate UK, and industry for over 25 funded projects, many of which involve industries and SMEs in developing cutting-edge AI solutions. He has served, or is serving, as an Associate Editor for the IEEE Transactions on Neural Networks, the IEEE Transactions on Cybernetics, and the International Journal of Neural Systems, and as general chair or programme chair for a number of international conferences in AI, machine learning and data analytics. He has been a member of the EPSRC Peer Review College since 2006, a senior member of the IEEE since 2003, and a Turing Fellow of The Alan Turing Institute since 2018.
