by Spencer D. Kelly; Tara McDevitt; and Megan Esch
Abstract
Recent research in psychology and neuroscience has demonstrated that co-speech gestures are semantically integrated with speech during language comprehension and development. The present study explored whether gestures also play a role in language learning in adults. In Experiment 1, we exposed adults to a brief training session presenting novel Japanese verbs with and without hand gestures. Three sets of memory tests (at five minutes, two days, and one week) showed that the greatest word learning occurred when gestures conveyed imagistic information redundant with speech. Experiment 2 was a preliminary investigation into possible neural correlates of such learning. We exposed participants to similar training sessions over three days and then measured event-related potentials (ERPs) to words learned with and without co-speech gestures. The main finding was that words learned with gesture produced a larger Late Positive Complex (indexing recollection) in bilateral parietal sites than words learned without gesture. However, there was no significant difference between the two conditions for the N400 component (indexing familiarity). The results have implications for pedagogical practices in foreign language instruction and for theories of gesture-speech integration.

This one is from the "Who would've guessed" dept. It turns out that whether an instructor performs certain gesture types while teaching new words can have an effect on the students' retention of those words.
I've said it before and I'll say it again: Gesture is a critical component of language and it belongs in our unification theory.
But I don't really understand what the point of the ERP study was... Looking forward to the follow-up article.