Facial expressions play a crucial role in human communication by conveying people's emotions and intentions. With advances in computer vision and the use of Deep Neural Networks (DNNs), Facial Expression Recognition (FER) has made significant progress. Given only an image of a face, the emotion being expressed can be inferred from the facial expression. Automatic facial expression recognition has received substantial attention in recent decades, as it holds significant potential for both academic and commercial purposes.
This technology has a wide range of applications. It can support mental health analysis, make video games and films more engaging by adapting how characters react to a player's or viewer's emotions, and inform advertising and marketing research, among other uses.
The dataset used for this project comprises images of individuals exhibiting seven distinct emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise. Each image is labeled with one of these emotions, providing a basis for exploring and developing models for emotion recognition and analysis.
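As a small illustration, the seven classes can be kept as an ordered list that maps prediction indices back to emotion names. This is only a sketch; the index order and the `EMOTIONS` name are assumptions and may differ from the dataset's actual label encoding.

```python
# Assumed index-to-name mapping for the seven emotion classes (order is illustrative).
EMOTIONS = [
    "anger", "contempt", "disgust", "fear",
    "happiness", "sadness", "surprise",
]
NUM_CLASSES = len(EMOTIONS)  # 7
```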
We begin by loading and preprocessing the images, converting them into a format suitable for training our model. With the data prepared, we construct a neural network model. The model comprises layers designed to extract features from the input images and classify them into emotion categories. The model is trained on a portion of the dataset, with the remaining portion reserved for evaluation. Once trained, we evaluate the model's performance on the test dataset to assess its ability to generalize to unseen data, as sketched below.
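The sketch below outlines this pipeline: loading and splitting the images, building a small convolutional network, training it, and evaluating on the held-out split. It uses TensorFlow/Keras and a hypothetical `data/` directory with one sub-folder per emotion; the framework choice, directory layout, image size, and layer configuration are all assumptions for illustration rather than the project's exact implementation.

```python
import tensorflow as tf

# Hypothetical layout: data/anger/..., data/happiness/..., etc.
# Image size and batch size are assumed values.
IMG_SIZE = (48, 48)
BATCH_SIZE = 32
NUM_CLASSES = 7

# Load images, reserving 20% as a held-out split for evaluation.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data", validation_split=0.2, subset="training", seed=42,
    color_mode="grayscale", image_size=IMG_SIZE, batch_size=BATCH_SIZE)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data", validation_split=0.2, subset="validation", seed=42,
    color_mode="grayscale", image_size=IMG_SIZE, batch_size=BATCH_SIZE)

# A small convolutional network: conv/pool blocks extract features,
# and dense layers classify them into the seven emotion categories.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (1,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train on the training split and monitor the held-out split.
model.fit(train_ds, validation_data=test_ds, epochs=15)

# Evaluate generalization on unseen data.
test_loss, test_acc = model.evaluate(test_ds)
print(f"Test accuracy: {test_acc:.3f}")
```

Grayscale input and a 48x48 resolution are chosen here only because they are common in FER datasets; a different preprocessing scheme or a deeper architecture could be substituted without changing the overall workflow.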