Hearing yourself speak
14/06/2010 by admin.
F. Huettig and R. Hartsuiker have a paper in Language and Cognitive Processes, "Listening to yourself is like listening to others: External, but not internal, verbal self-monitoring is based on speech perception" (here). The abstract is below.
Theories of verbal self-monitoring generally assume an internal (pre-articulatory) monitoring channel, but there is debate about whether this channel relies on speech perception or on production-internal mechanisms. Perception-based theories predict that listening to one’s own inner speech has similar behavioural consequences as listening to someone else’s speech. Our experiment therefore registered eye-movements while speakers named objects accompanied by phonologically related or unrelated written words. The data showed that listening to one’s own speech drives eye-movements to phonologically related words, just as listening to someone else’s speech does in perception experiments. The time-course of these eye-movements was very similar to that in other-perception (starting 300 ms post-articulation), which demonstrates that these eye-movements were driven by the perception of overt speech, not inner speech. We conclude that external, but not internal monitoring, is based on speech perception.
This appears quite complex. The paper distinguishes between our awareness of our speech when it is not actually produced aloud and when it is spoken. The implication is that we produce and monitor our speech but only become consciously aware of it once we hear it; we become conscious of our internal, unspoken speech in a different way. This makes consciousness simpler but language more complicated. Consciousness is again a question of perception. But as BPS Research Digest puts it:
It’s important to clarify: we definitely do monitor our speech internally. For example, speakers can detect their speech errors even when their vocal utterances are masked by noise. What this new research suggests is that this internal monitoring isn’t done perceptually - we don’t ‘hear’ a pre-release copy of our own utterances. What’s the alternative? Huettig and Hartsuiker said error-checking is somehow built into the speech production system, but they admit: ‘there are presently no elaborated theories of [this] alternative viewpoint.’
Posted in language | 1 Comment »