Abstract
In daily life, there are circumstances in which communication without words is required; emoticons or facial expressions are then used to exchange virtual dialogue. This kind of communication becomes essential in emergency situations, where the only way to interact is by performing some unusual activity, eye blinks, or emotional expressions. This paper proposes a Sagacious Information Recuperation Technique (SIRT) for sensing an emergency by capturing intelligent information from behavioral (emotions) and biological (eye blinks) changes. The proposed technique receives input from real-time human attributes: eye blinks and emotions. One of the parameters of the proposed SIRT is further validated through the implementation of a real-time facial expression recognition (RTFER) system that effectively identifies various emotions in real-time scenarios. Local Binary Pattern (LBP) features extracted from the training dataset (Cohn-Kanade) are classified using a Support Vector Machine (SVM), a Multilayer Perceptron (MP), and a voting system combining the SVM, MP, and AdaBoost (ADAB) classification algorithms. Experiments on the Cohn-Kanade (CK) dataset illustrate that the voting system with multiple classifiers outperforms methods using a single classifier. Receiver Operating Characteristic (ROC) curves, area under the curve, sensitivity, fall-out, precision, and F-measure are used to evaluate the performance of the system qualitatively and quantitatively. Accuracy with the proposed voting system on the reduced-attribute dataset increases to 81.82%, compared with the existing support vector machine (75.38%) and multilayer perceptron (78.41%).
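The classifier combination described above can be sketched as a majority-voting ensemble of SVM, MLP, and AdaBoost. This is a minimal illustration using scikit-learn, not the paper's actual implementation; the feature matrix here is synthetic random data standing in for the LBP histograms extracted from the Cohn-Kanade dataset, and all parameter choices (kernel, iteration counts, feature dimensionality) are assumptions for the sake of a runnable example.

```python
# Hedged sketch: majority voting over SVM, MLP, and AdaBoost classifiers.
# Synthetic data stands in for LBP features from the CK dataset;
# 59 dimensions approximates a uniform-LBP histogram, 7 classes for emotions.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((300, 59))          # stand-in LBP feature vectors
y = rng.integers(0, 7, 300)        # stand-in emotion labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

voter = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="linear")),
        ("mlp", MLPClassifier(max_iter=500, random_state=0)),
        ("ada", AdaBoostClassifier(random_state=0)),
    ],
    voting="hard",                 # each classifier casts one vote per sample
)
voter.fit(X_tr, y_tr)
acc = accuracy_score(y_te, voter.predict(X_te))
print(f"voting accuracy: {acc:.2%}")
```

With real LBP features the ensemble would be compared against the standalone SVM and MP baselines, as reported in the abstract.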
Keywords: