Hungarian study: People understand robots that use emotionally suggestive sounds instead of words

People can understand robots that use emotionally suggestive sounds rather than words, according to a joint study by the HUN-REN-ELTE Comparative Ethology Research Group and Debrecen University.

In the study, published in the journal Scientific Reports, the researchers used artificially generated sounds modelled on vocalisations that animals and people are known to make, and to react to, either as an invitation to approach or as a warning to back off.

Volunteers submitted their reactions in an online game, indicating whether they would approach or withdraw when they heard artificial sounds of varying length, frequency and added acoustic features.

“We generated the sounds modelled on humans and animals expressing emotions, from simple beeps to biologically more complex ones,” the statement quoted Márta Gácsi, leader of the HUN-REN-ELTE research team, as saying.
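To make the kind of stimulus concrete, here is a minimal sketch that generates a plain sine-wave beep of a chosen length and pitch in Python. The durations, frequency and amplitude below are illustrative assumptions, not the parameters used in the study, whose stimuli also included biologically more complex sounds.

```python
import numpy as np
import wave

SAMPLE_RATE = 44100  # samples per second; an assumed, common value

def generate_beep(duration_s, freq_hz, amplitude=0.5):
    """Synthesise a simple sine-wave beep (illustrative only)."""
    t = np.linspace(0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    samples = amplitude * np.sin(2 * np.pi * freq_hz * t)
    return (samples * 32767).astype(np.int16)

def save_wav(path, samples):
    """Write the beep as 16-bit mono PCM so it can be played back."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(samples.tobytes())

# Hypothetical stimuli: a short, quiet beep and a longer, louder one.
save_wav("short_beep.wav", generate_beep(0.2, 440))
save_wav("long_beep.wav", generate_beep(1.5, 440, amplitude=0.9))
```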

The results showed that short, artificially generated sounds prompted approach, while louder sounds signalled avoidance, regardless of their complexity. Variations on this general pattern were also observed.
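As an illustration of how that reported pattern could be turned into a simple decision rule on a robot, the sketch below maps a sound's length and loudness to the expected listener reaction; the threshold values are made-up placeholders, not figures from the paper.

```python
def expected_reaction(duration_s, loudness_db):
    """Toy heuristic based on the reported pattern: short sounds tended to
    invite approach, while louder sounds signalled avoidance.
    The thresholds (0.5 s, 70 dB) are arbitrary placeholders."""
    if loudness_db >= 70:
        return "withdraw"
    if duration_s <= 0.5:
        return "approach"
    return "ambiguous"

print(expected_reaction(0.2, 55))   # approach
print(expected_reaction(1.0, 80))   # withdraw
```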

The researchers concluded that “robots in the human environment can be fitted with a set of acoustic signals that effectively help communication without speech”.

“In situations where linguistic interaction is not required, the operation of social robots can become simpler and independent of culture and language,” said Beáta Korcsok, one of the authors of the study and a researcher at the HUN-REN-ELTE Comparative Ethology Research Group.
