A groundbreaking technology for real-time human emotion recognition has been developed by researchers at the Ulsan National Institute of Science and Technology (UNIST). The system, which utilizes a multi-modal approach that combines verbal and non-verbal expression data, has the potential to revolutionize various industries, particularly in the realm of wearable systems.
The team addressed the challenge of interpreting abstract emotional data by creating a personalized skin-integrated facial interface (PSiFI) powered by friction charging. The PSiFI incorporates a bidirectional triboelectric strain and vibration sensor that simultaneously senses and integrates verbal and non-verbal expression data, enabling wireless real-time emotion recognition even when the wearer is masked.
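The multi-modal idea described above can be sketched in code: features from the two modalities (non-verbal strain from facial motion, verbal vibration from speech) are fused into one vector and matched against per-emotion prototypes. This is a minimal illustrative sketch, not the actual PSiFI pipeline; all feature values, labels, and function names here are invented for demonstration.

```python
# Hypothetical sketch of multi-modal emotion recognition:
# concatenate (early-fuse) the two modality vectors, then assign
# the label of the nearest emotion centroid. Toy data throughout.
from math import dist

# Invented per-emotion centroids over fused [strain..., vibration...] features
CENTROIDS = {
    "happy":   [0.8, 0.6, 0.7, 0.5],
    "sad":     [0.2, 0.3, 0.1, 0.2],
    "neutral": [0.5, 0.5, 0.5, 0.5],
}

def fuse(strain_features, vibration_features):
    """Early fusion: concatenate the non-verbal and verbal feature vectors."""
    return list(strain_features) + list(vibration_features)

def classify(fused):
    """Return the emotion whose centroid is nearest (Euclidean) to the input."""
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], fused))

sample = fuse([0.75, 0.55], [0.65, 0.5])
print(classify(sample))  # → happy
```

A real system would learn the classifier from labeled sensor data rather than use fixed centroids, but the fusion step (combining both modalities before classification) is the core of any multi-modal approach.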
The fully integrated data-processing circuit allows the system to identify human emotions with high accuracy. As demonstrated in a digital concierge application within virtual reality (VR) environments, the technology delivers personalized services based on users' emotions, opening up possibilities for portable emotion-recognition devices and next-generation digital platform services.
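The concierge behavior amounts to mapping a recognized emotion label to a service response. The sketch below shows that dispatch step only; the mapping and response strings are invented for illustration and are not UNIST's actual VR application.

```python
# Toy "digital concierge": choose a service response from a recognized
# emotion label. Mapping is hypothetical, for demonstration only.
RESPONSES = {
    "happy":   "recommend upbeat content",
    "sad":     "offer calming music",
    "neutral": "show the default menu",
}

def concierge(emotion: str) -> str:
    # Fall back to the default menu for any unrecognized label.
    return RESPONSES.get(emotion, RESPONSES["neutral"])

print(concierge("sad"))  # → offer calming music
```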
Collaboration with Nanyang Technological University in Singapore, supported by the National Research Foundation of Korea (NRF) and the Korea Institute of Materials Science (KIMS), underscores the significance of this advance in human-machine interface (HMI) devices. The development paves the way for richer interaction between humans and machines.
In conclusion, UNIST researchers have introduced a cutting-edge technology for real-time human emotion recognition with significant implications across industries, particularly wearable systems. With its multi-modal approach, emotion-driven personalized services, and wireless real-time capabilities, the technology is poised to transform HMI devices and create new opportunities for digital platform services.