why do people shallow process?
need for system tolerance
the language we encounter every day is so full of ambiguities and contradictions that we need to be able to get the gist of a message even when it's delivered badly
even when speech, fully analysed, doesn't really make sense, we can generally still follow what the speaker is trying to communicate because we have some tolerance in the language processing system
if we always analysed language input fully and deeply during processing, we would struggle to interpret input like this, with all the ambiguity and mistakes that are common in everyday speech
what are the traditional models and assumptions?
based on the assumption that the overarching goal of the language comprehension system is to deliver an accurate and detailed representation of the speaker's or writer's utterance
full lexical retrieval and integration into a fully specified syntactic structure
fully analyse the meaning of every word and integrate it into a fully specified grammatical structure
we process each word as fully and completely as we can before moving on (process language fully and deeply)
what did Just & Carpenter (1980) say on incremental interpretation?
“readers interpret a word while they are fixating on it and they continue to fixate it until they have processed it as far as they can”
what did MacDonald, Pearlmutter & Seidenberg (1994) say?
acknowledged “the communicative goal of the listener can be achieved with only a partial analysis of the sentence”
however viewed these as “degenerate cases”
what is incomplete semantic commitment?
Mary bought a brand-new Hitachi Radio. It was in Selfridge’s window
what does "it" refer to? the particular radio that she bought (so after she bought it, it was gone from the window)? or the type of radio (so after she bought it, it was still there)?
Later, when Joan saw it, she decided too that it would be a good purchase
when reading the sentence, we may not fully commit to exactly what the word refers to - we don't need to in order to get the gist
what are garden path sentences?
lingering incorrect interpretations
while Anna dressed the baby played in the crib - did the baby play in the crib? did Anna dress the baby?
correct answers to the first question suggest that people do successfully reanalyse the sentence, but people also frequently (and incorrectly) answer yes to the second question - although they have successfully reanalysed the sentence, they have not "thrown away" the initial misinterpretation
expect one interpretation
once an interpretation is "good enough", people don't bother clearing up the details
what is pragmatic normalisation?
misinterpretation of passive sentences
Ferreira (2003)
breakdown of local semantic interpretation because of pragmatic override
participants given active or passive sentences and asked to indicate “who is the do-er?” in each case
active sentences - the dog bit the man (99% accurate), the man bit the dog (99% accurate) - simple syntactic structure, so generally answered correctly
passive sentences - the dog was bitten by the man (74% accurate), the man was bitten by the dog (88% accurate) - more complicated sentence structure; people were less likely to respond correctly when the correct answer didn't fit world knowledge
once correct and complete language processing gets more difficult, readers have tendency to rely on their world knowledge in order to interpret likely sentence meaning rather than performing full syntactic analysis
people will use pragmatics (world knowledge) to “normalise” meaning of sentence even if not correct
what is the survivors problem (Barton & Sanford, 1993)?
presented with a story (e.g. an air crash) and asked "where should the survivors be buried?" - anomalous because survivors are alive, so should not be buried
scenario-relevant words - survive, injure, wound, maim
relevant core meaning may aid detection
higher detection rate for survivors, lowest for injured
words that “fit the context” may be processed less deeply
what are some of the non-detectors answers in the survivors problem (Barton & Sanford, 1993)?
ask relatives of deceased where they would prefer them to be buried
ship all bodies home
contact next of kin of dead
what was the influence of the scenario in the survivors problem (Barton & Sanford, 1993)?
air crash - 33% detection rate
bicycle crash - 80% detection rate
what are easy-to-detect anomalies?
anomalous word that does not fit well within the context
what are hard-to-detect anomalies?
anomalous word that does fit well within the context
what is the shallow processing hypothesis explanation for missed anomalies?
anomalies not detected because full meaning of anomalous words not retrieved and/or integrated with representation of the discourse
what is the reduced awareness hypothesis explanation for missed anomalies?
comprehension system retrieves the meaning of the anomalies and attempts to integrate the semantics of the word in question with the rest of the text
however, for some reason, fact of the anomaly may not reach conscious awareness
registered somewhere in the system but does not filter through to full conscious awareness and therefore cannot be reported
what was Bohan & Sanford’s (2008) study into eye tracking anomaly detection?
monitored people’s eye movements as they read sentences containing hard-to-detect anomalies - text divided up for analysis, indicate if noticed anything wrong
no effect in first-pass reading times on hostages
longer reading times on hostages when anomaly detected
no difference in reading times in cases where an anomaly went unreported compared to the non-anomalous condition
suggests no evidence for unconscious registration of the anomaly, but evidence for the shallow processing hypothesis
detection not immediate but slightly delayed, results in severe disruption
disruption only observed when anomalies consciously detected - no evidence for unconscious detection
possible that eye-tracking is not a sensitive enough measure of unconscious registration
what was the study into anomaly detection and ERPs?
hard-to-detect or easy-to-detect
easy-to-detect results - clear N400 effect; readers showed a large N400 (a larger N400 reflects detection of the semantic anomaly)
hard-to-detect results - no N400; a late positive potential effect, but only when the anomaly was actually detected
what factors influence depth of processing?
logical subordination
linguistic focus
discourse focus
attention-grabbing devices
what was Baker & Wagner’s (1987) study into logical subordination?
clearly distinguishes focal information from “extra” information
false information less likely to be detected when presented as logically subordinate rather than as of central importance - i.e. presented in a subordinate clause rather than the main clause
people process information less deeply when it is part of subordinate clause
what is cleft construction?
use of the phrase "it was X that/who..." in order to answer an implied question
what was Bredart & Modolo’s (1988) study into linguistic focus?
focus on Moses - "it was Moses who took two animals of each kind into the Ark"; the cleft construction clearly answers the question "who took animals on the Ark?" (the anomaly: it was Noah, not Moses)
focus on two animals - "it was two animals of each kind that Moses took on the Ark"; the cleft construction answers the question "what was taken on the Ark?" and draws attention away from Moses
what was Sturt, Sanford, Steward & Dawydiak’s (2004) study into linguistic focus?
text-change detection
participants presented with a piece of text to read normally, then presented with the text again containing a word that might have been changed
people should be more likely to spot a change if they have processed the text, or the target word at least, more deeply
word placed in linguistic focus or not
putting a word in focus makes you process it more deeply
what is discourse focus?
question set up by the text
what was Sturt et al’s (2004) study into discourse focus?
manipulated arrangements of words in target sentence
focused - critical information placed in focus by preceding discourse
unfocused - the man and what he was wearing no longer particularly in focus
when the key word was changed to an unrelated word, the change was noticed whether in focus or not
when the change was more subtle (a related word), it was more likely to be spotted if the key word was in focus - so words are processed more deeply when in focus
what is text-change detection in discourse focus?
aims to discover when distinctions are not being made at some level of semantics
based on the "granularity hypothesis" - refers to the fineness of detail in a representation
focus increases the probability of detecting a change to a related word
suggests that information in focus is represented at a finer level of detail