Week 35

Non-verbal communication

This week was finals week, so it was difficult to think about my project. However, one of my finals was a presentation for my human-robot interaction class, so ideas about computers doing things at a human level or alongside humans were definitely on my mind (of course, they're always on my mind). The final project was actually about trying to understand how non-verbal communication (gestures) affects comprehension and retention. More specifically, we placed a NAO robot in front of participants and observed how the NAO robot delivering a story with gestures vs. delivering the same story without gestures affects comprehension of the story. An overview of the project can be viewed here.

This project was particularly interesting because it isn't very clear how non-verbal communication contributes to understanding among humans. For example, you'll find that people with very expressive body language are often considered engaging speakers (whether this engagement actually contributes to understanding is another topic). Perhaps these speakers use gesturing and other forms of non-verbal communication (e.g., facial expressions) to mediate their messages. Similarly, in American Sign Language, facial expressions are used to help convey specific meanings. On the other hand, people who have a strong tendency to talk with their hands or face can be distracting and borderline annoying.

So, children acquiring their native tongue not only have the job of understanding words and sentences, but also body language and other non-verbal cues. Understanding words and sentences comes first, as there is not necessarily a prescriptive grammar for non-verbal communication. Yet somehow, non-verbal communication helps children communicate with their caretakers.

A review of verbal and non-verbal human–robot interactive communication

Children and Communication: Verbal and Nonverbal Language Development

MORE ABOUT SPEECH, LANGUAGE AND COMMUNICATION

Best,
EO