Music And Empathetic Speech in Robots Could Combat Loneliness


Insider Brief

  • A study led by researchers at The Hong Kong Polytechnic University found that combining music with empathetic speech in AI-powered robots significantly increased users’ perception of empathy, suggesting a stronger emotional bond between humans and machines.
  • In experiments with Cantonese-speaking participants over multiple sessions, robots that used both music and speech were rated as more lifelike and socially present, though the effect of music diminished over time without personalization.
  • The findings, published in ACM Transactions on Human-Robot Interaction, highlight the importance of adaptive, multimodal design for social robots intended for mental health support, elder care and education.
  • Image: Prof. Johan HOORN, Interfaculty Full Professor of Social Robotics of the School of Design and the Department of Computing at PolyU. (PolyU)

PRESS RELEASE — Loneliness has a critical impact on mental health, particularly among the elderly. Robots capable of perceiving and responding to human emotions can serve as heart-warming companions that help lift people’s spirits. A research team at The Hong Kong Polytechnic University (PolyU) has discovered that the combined power of music and empathetic speech in robots with artificial intelligence (AI) could foster a stronger bond between humans and machines. These findings underscore the importance of a multimodal approach to designing empathetic robots, with significant implications for their application in health support, elder care, education and beyond.

The research project, A Talking Musical Robot over Multiple Interactions, was led by Prof. Johan HOORN, Interfaculty Full Professor of Social Robotics of the School of Design and the Department of Computing at PolyU, in collaboration with Dr Ivy HUANG at The Chinese University of Hong Kong. The study investigated how music and empathetic speech could enhance the emotional resonance of on-screen robots, revealing that music can act as a powerful adjunct to empathetic speech.

As part of the study, the team examined how Cantonese-speaking participants interacted with empathetic robots across three interactive sessions. The findings showed that combining music and speech significantly increased the participants’ perceived empathy of the machines.

“Our data indicate that the presence of music continued to enhance the robot’s resemblance to humans in later sessions,” explained Prof. Hoorn. “One interpretation is that music made the interaction feel more like a real conversation with a personality, something human counsellors might do by playing music to comfort their clients, which in turn made the robot seem more lifelike or socially present.”

However, the research also found that the impact of music could diminish over time as participants became attuned to it across repeated sessions, highlighting the importance of tailoring interaction strategies to individual users’ needs to sustain effective human-robot interaction. The study suggested that empathetic robots should adapt their responses to user feedback and context, for example by adjusting musical elements or gradually personalising dialogue content so that their empathic responses remain relevant.

Prof. Hoorn emphasised: “Our research points to the significance of multimodal communication encompassing music, speech and more through empathetic robots. It holds considerable promise for application in real-world settings, particularly in the fields of mental health support and elderly care. The integration of empathetic robots capable of delivering tailored musical experiences and engaging in sensitive conversation could provide meaningful companionship and emotional support to individuals who may experience loneliness or social isolation.”

Prof. Hoorn is leading another project, “Social Robots with Embedded Large Language Models Releasing Stress among the Hong Kong Population”, which has received funding of over HK$40 million from the Research Grants Council Theme-based Research Scheme.

Concurrently serving as Associate Director of the PolyU Research Institute for Quantum Technology, Prof. Hoorn is set to explore quantum-inspired models of human affect to better capture and respond to the inherent vagueness and ambiguity of emotional experience. Unlike traditional computational systems that struggle with the fluid and context-dependent nature of affective responses, quantum models can represent emotional states as probabilistic superpositions, reflecting the genuine uncertainty and complexity of human feelings.
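The idea of representing an emotional state as a probabilistic superposition can be illustrated with a minimal sketch. The basis emotions, amplitude values and the Born-rule analogy below are illustrative assumptions for exposition, not details of Prof. Hoorn's forthcoming models:

```python
import math

# Illustrative sketch (not from the study): a "quantum-inspired" emotional
# state as a superposition over a small set of basis emotions. The squared
# magnitude of each amplitude gives the probability of observing that
# emotion, by analogy with the Born rule.
basis = ["joy", "sadness", "calm"]

# Hypothetical unnormalised amplitudes for an ambiguous, mixed feeling.
amplitudes = [0.8, 0.5, 0.3]

# Normalise so that the squared magnitudes sum to 1.
norm = math.sqrt(sum(a * a for a in amplitudes))
state = [a / norm for a in amplitudes]

# "Measuring" the state yields each emotion with probability |amplitude|^2,
# preserving genuine ambiguity rather than forcing a single emotion label.
probabilities = {e: round(a * a, 3) for e, a in zip(basis, state)}
print(probabilities)
```

Unlike a classifier that commits to one label, such a state keeps several emotions simultaneously plausible until an interaction "resolves" it, which is the fluidity the passage above describes.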

“What excites me the most is the possibility of developing social robots that not only recognise the complexity of human affect but also embrace it. These robots could offer support that is adaptable, open-ended and compassionate, similar to the individuals they are designed to help,” added Prof. Hoorn.

The study has been published in ACM Transactions on Human-Robot Interaction, a leading peer-reviewed interdisciplinary journal in the field.


