Facial expression plays a key role in non-verbal communication: a person's inner feelings are reflected on the face, and people express their emotional state through facial expressions. Human emotions are mental states of feeling that arise spontaneously and are accompanied by physiological changes in the facial muscles, which produce facial expressions. Among the basic emotions are surprise, happiness, sadness, fear, anger, and disgust. Recognizing a person's emotional state from facial expressions is a very difficult task. A great deal of research has been done on computer modeling of human emotions; nevertheless, facial expression recognition remains a complex and interesting problem in computer vision, still far from matching the human visual system. We will consider an emotion detection system designed to automatically recognize basic emotional states from the expression on a person's face. The system first analyzes the facial image, locates and measures the characteristic deformations of facial features such as the eyes, eyebrows, and mouth, and extracts the relevant facial features. A multilayer neural network is then used to classify the facial expression into the corresponding emotional state.

Human emotions are an inevitable part of any interpersonal communication, and they can be expressed in many different forms. Facial expression gives us an idea of a person's state and allows us to adapt a conversation to another person's mood.

Over the past few years, the need to identify human emotions has grown significantly. Interest in recognizing human emotions has emerged in various fields, including human-computer interfaces, animation, medicine, and security. With the right tools, any signs preceding or accompanying an emotion can be detected and recognized. The main goal is to identify which of several standard emotional states (sadness, happiness, disgust, anger, surprise, and fear) is present on the face; the emotion with the highest score is reported as the recognized emotion for the given face. Emotion recognition can be performed on various modalities, such as the face, speech, and text. Among these, facial expressions are the most popular for a number of reasons: they are visible, they contain many features useful for recognizing emotions, and it is easier to collect a large set of facial data.

Artificial intelligence for facial emotion recognition

Facial emotion recognition enables machines to automatically evaluate the emotional content of a human face. It supports various cognitive tasks and reflects the fact that expressions provide the most natural and powerful means of conveying human emotions, opinions, and intentions. Recognizing facial expressions is difficult because of the similarity between different actions, the variety of head poses, and so on. Achieving such a difficult task and classifying images reliably requires a large amount of robust training data.

The purpose of this work is to classify the emotion on a person's face as sad, angry, happy, surprised, fear, disgust, or neutral. To classify a facial expression into the relevant emotional category, it is necessary to find and highlight the important facial features that contribute to identifying the expressed emotion. A facial emotion recognition system is designed to determine the emotional state from a person's facial expressions. The system first analyzes the face image and locates and measures the characteristic deformations of facial features such as the eyes, eyebrows, and mouth. Each part of the face is then analyzed in more depth and its features are extracted and represented as feature vectors. These feature vectors are classified into the corresponding expression emotions by a multilayer neural network, which is trained and then used to assign facial expressions to the appropriate emotional category.
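The classification stage described above can be sketched as a small multilayer network that maps a facial feature vector to one of the seven emotion labels. This is a minimal illustrative sketch, not the system's actual implementation: the feature extraction step (measuring deformations of the eyes, eyebrows, and mouth) is assumed to have already produced a fixed-length vector, and the weights below are random stand-ins for a trained model.

```python
import numpy as np

EMOTIONS = ["sad", "angry", "happy", "surprised", "fear", "disgust", "neutral"]

rng = np.random.default_rng(0)
n_features, n_hidden, n_classes = 24, 16, len(EMOTIONS)

# Stand-in weights; a real system would learn these from labelled face data.
W1 = rng.normal(0, 0.1, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_classes))
b2 = np.zeros(n_classes)

def classify(feature_vector):
    """Forward pass: hidden layer with ReLU, softmax over emotion classes."""
    h = np.maximum(0, feature_vector @ W1 + b1)   # hidden layer
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())           # numerically stable softmax
    probs = exp / exp.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

features = rng.normal(size=n_features)            # mock measured deformations
label, probs = classify(features)
```

In a trained system, `probs` would give a confidence score for each emotional category, and the emotion of maximum score is reported as the recognized emotion.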

The work flow chart of the facial expression recognition system

A deep learning model is used to classify facial emotions from images. Convolutional networks are naturally suited to processing spatial image data, and given the size and variety of modern data sets, a deep neural network is the most suitable technique for this task. The performance of a neural network depends on many factors, such as the initial random weights, the activation function used, the training data, the number of hidden layers, and the overall network structure. Convolutional neural networks use images directly as input. Instead of hand-crafted intermediate features, convolutional networks learn a hierarchy of discriminative features mechanically from the data, which can later be used for classification. With a convolutional neural network, even a network with only a few layers can achieve a very high degree of accuracy.
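To illustrate why convolutional networks can take images directly as input, here is a minimal sketch of a single convolutional layer in plain NumPy. The filter here is a fixed Sobel edge detector purely for illustration; in a real CNN the filter values are learned from the training data rather than hand-specified.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) of a grayscale image."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 8x8 "image" containing a vertical edge, e.g. a face contour.
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# Fixed vertical-edge filter (Sobel); a CNN would learn such filters.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Convolution followed by a ReLU activation produces a feature map that
# responds strongly where the edge is, preserving spatial structure.
feature_map = np.maximum(0, conv2d(image, sobel_x))
```

Stacking such layers lets the network build up from local patterns (edges, corners) to higher-level facial structures (mouth corners, eyebrow contours), which is the hierarchy of features the paragraph above refers to.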

Scientific papers for further study on applications of artificial intelligence for facial emotion recognition:

Deep-Emotion: Facial Expression Recognition Using Attentional Convolutional Network

Extended deep neural network for facial emotion recognition

Facial Emotion Analysis using Deep Convolution Neural Network

Recognizing Emotions from Facial Expressions Using Neural Network