Abstract
Real-time emotion detection is a crucial aspect of media, enabling empathetic, tailored responses to users’ emotions; customized interaction analysis examines user data to understand individual needs and adjust interactions accordingly. Real-time emotion detection can suffer from accuracy issues because expressions and contexts vary across individuals, while customized interaction analysis can be limited by data-privacy concerns and data-integration challenges. This research introduces a novel real-time emotion detection model, E-IS, to monitor the emotional changes and interactions of users viewing broadcast hosting services, which deliver live or pre-recorded content to visitors across various platforms. First, audience facial expressions and interaction records are collected to detect audience emotions and analyze personalized interactions. Image resizing, noise removal, stop-word removal, stemming, and emoji and symbol handling are applied during data pre-processing. EN-CNN performs emotion detection, and AIDT analyzes customized interaction using the normalized data. These prediction models are deployed in E-IS using Python, and E-IS integrates with the hardware and software components of the broadcast hosting communication network. The results demonstrate that E-IS achieved the highest F1-score (96.70%), recall (95%), precision (97%), accuracy (99%), user satisfaction score (95%), and engagement level (92%). E-IS enhances broadcast hosting services and effectively reduces emotion-detection latency, response time for interaction adjustment, the false-positive rate in emotion detection, and data-processing time per interaction.
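To illustrate the text side of the pre-processing pipeline the abstract names (stop-word removal, stemming, and emoji/symbol handling), the following is a minimal Python sketch. It is not the paper's implementation: the stop-word list, the `naive_stem` suffix stripper (a crude stand-in for a real stemmer such as Porter), and the regex-based symbol filter are all illustrative assumptions.

```python
import re

# Illustrative subset of stop words; a real pipeline would use a full list.
STOP_WORDS = {"the", "is", "a", "an", "and", "of", "to", "are"}

def strip_emojis_and_symbols(text: str) -> str:
    # Keep only basic Latin letters, digits, and whitespace,
    # dropping emojis, punctuation, and other symbols.
    return re.sub(r"[^A-Za-z0-9\s]", "", text)

def naive_stem(token: str) -> str:
    # Crude suffix stripping; a stand-in for a proper stemmer.
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def preprocess(text: str) -> list[str]:
    cleaned = strip_emojis_and_symbols(text.lower())
    tokens = [t for t in cleaned.split() if t not in STOP_WORDS]
    return [naive_stem(t) for t in tokens]

print(preprocess("The hosts are streaming live! 🎥😀"))
# → ['host', 'stream', 'live']
```

In practice, each normalized token list would then be fed to the interaction-analysis model alongside the resized, denoised facial-image frames.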
