
33 Neural Organization of Language: Clues From Sign Language Aphasia

Gregory Hickok and Ursula Bellugi

A central issue in understanding the neural organization of language is the extent to which this organization is dependent on the sensory and motor modalities through which language is perceived and produced. There are many reasons to think that the neural organization of language should be profoundly influenced by extrinsic factors in development such as sensory and motor experience. The temporal processing demands imposed by the auditory system have been argued to favor left hemisphere systems (Tallal, Miller, & Fitch, 1993), which could, in turn, determine aspects of the lateralization pattern of auditory-mediated language. Superior temporal lobe regions thought to be important for language comprehension are situated in and around auditory cortices—a natural location given the auditory sensory input of language. Likewise, Broca’s area, which is classically thought to play a role in speech production, is situated just anterior to motor cortex controlling the speech articulators. Thus, it would not be unreasonable to hypothesize that the neural organization of language—including its lateralization and within hemisphere organization—is determined in large part by the particular demands imposed by the sensory and motor interface systems.

By studying the functional neuroanatomy of the signed language of the deaf, we can test this hypothesis in a straightforward manner. It has been shown that signed languages share much of the formal linguistic structure found in spoken languages, but differ radically in the sensory and motor systems through which language is transmitted (Bellugi, Poizner, & Klima, 1989; Emmorey, 2002; Klima & Bellugi, 1979). In essence, signed language offers a kind of natural experimental manipulation: central linguistic structure and function are held constant, while peripheral sensory and motor experience is varied. Thus, a comparison of the neural organization of signed versus spoken language will provide clues concerning factors that drive the development of the functional neuroanatomy of language. Here we provide a summary of the findings from studies of the neural organization of the signed languages of the deaf.


HEMISPHERIC ASYMMETRIES FOR SIGN LANGUAGE PROCESSING

Left hemisphere damage (LHD) in hearing/speaking individuals is associated with deficits at sublexical (“phonetic/phonemic”), lexical, and sentence levels, both in production and, to some degree, in comprehension (Damasio, 1992; Goodglass, 1993; Hillis, 2007). Right hemisphere damage (RHD), on the other hand, has been associated with supra-sentential (e.g., discourse) deficits (Brownell, Potter, Bihrle, & Gardner, 1986). Given the radical differences in modality of perception and production of sign language, one might expect sign language to differ dramatically in hemispheric asymmetries. Instead, our studies have found strong evidence that the pattern of hemispheric asymmetries in the deaf signing population closely parallels that in hearing/speaking individuals.

Sublexical-, Lexical-, and Sentence-Level Processes

A variety of sublexical-, lexical-, and sentence-level deficits have been found in individual LHD deaf signers (Bellugi et al., 1989; Hickok & Bellugi, 2001; Hickok, Kritchevsky, Bellugi, & Klima, 1996b; Poizner, Klima, & Bellugi, 1987). These deficits have been noted both in production and, to varying degrees, in comprehension. In production, a range of paraphasic error types has been identified in LHD signers, including “phonemic,” morphological, and semantic subtypes, demonstrating the breakdown of these various levels of computation. Some examples of phonemic paraphasias are shown in Figure 33.1. Disorders in sign language sentence formation in LHD signers have emerged both in the form of agrammatism, a tendency to omit grammatical markers, and in the form of paragrammatism, a tendency toward grammatically rich but error-prone utterances, showing that sentence-level computations can also be disrupted following LHD in deaf signers (Hickok & Bellugi, 2001; Hickok, Bellugi, & Klima, 1998a). Production errors at all these levels are fairly common in LHD signers, but occur relatively rarely in RHD signers. In our sample of over 50 unilateral brain-injured signers, we have found only one RHD signer who was in fact aphasic on standard clinical-type assessment; this individual was left handed and had evidence of a reversed dominance pattern (Pickell et al., 2005). As for comprehension, we have documented deficits at the word (i.e., sign) and sentence levels (Hickok, Klima, & Bellugi, 1996a; Hickok, Love-Geffen, & Klima, 2002). At the word level, comprehension deficits have been observed only following LHD, not RHD. These deficits are relatively mild and appear to result from breakdowns primarily at the postphonemic level (Hickok, Klima, Kritchevsky, & Bellugi, 1995). At the sentence level, the most severe deficits occur following LHD, but, consistent with findings in the hearing/speaking population, some difficulties in sentence comprehension can be found in RHD signers, particularly as sentence complexity increases (Hickok et al., 2002).

Group studies confirm the generalizability of these case study observations. In an analysis of 13 LHD and 10 RHD signers, one study (Hickok et al., 1996a) found that LHD signers performed significantly worse than RHD signers on a range of standard language measures, including production, comprehension, naming, and repetition (Figure 33.2). These differences between LHD and RHD signers could not be attributed to variables such as (a) age at test, (b) onset of deafness, or (c) age of exposure to ASL, as these variables did not correlate with any of the language behaviors tested. Further reinforcing this finding, LHD versus RHD group differences showed the same patterns when only native signers were included in the analysis (Figure 33.2; Hickok et al., 2002). This is not to say that these variables have no impact on sign language organization or language ability, because surely they do at some level of detail (Neville et al., 1997; Newport & Meier, 1985), only that the dominant factor predicting performance on these aphasia assessment measures is whether the left or right hemisphere is damaged.
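The logic of this kind of group analysis (compare lesion groups on a behavioral measure, then check that background variables cannot account for the difference) can be sketched in a few lines of Python. This is a purely illustrative sketch: the scores, the background variable values, and the choice of tests are invented here, not taken from Hickok et al. (1996a).

from scipy import stats

# Invented scores (percent correct) on one aphasia measure.
lhd_scores = [55, 62, 48, 70, 51, 66, 43, 58, 61, 49, 53, 67, 60]  # 13 LHD signers
rhd_scores = [92, 88, 95, 90, 97, 85, 93, 91, 89, 94]              # 10 RHD signers

# Group comparison: do LHD signers perform worse than RHD signers?
t, p = stats.ttest_ind(lhd_scores, rhd_scores)
print(f"LHD vs. RHD: t = {t:.2f}, p = {p:.4f}")

# Confound check: performance should not track a background variable
# such as age at test (the same check applies to onset of deafness and
# age of exposure to ASL).
age_at_test = [61, 55, 70, 48, 66, 59, 72, 50, 63, 57, 68, 52, 64]  # invented
r, p_r = stats.pearsonr(age_at_test, lhd_scores)
print(f"Age at test vs. score (LHD group): r = {r:.2f}, p = {p_r:.4f}")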

FIGURE 33.1 Examples of sign paraphasic errors in left hemisphere damaged deaf signers. (Panels show the signs FINE and FROG, each in its correct form and with a paraphasic error: a handshape error for FINE and a location error for FROG.)


FIGURE 33.2 Performance of LHD and RHD signers on a range of standard aphasia assessments including production, naming, repetition, and comprehension. Circles indicate average values for the subset of the lesion population that includes only native deaf signers. (Panels: ASL production scales [total score], ASL comprehension tests [percent correct], ASL naming tests [number correct], ASL phrase repetition test [percent correct], and ASL paraphasias per minute [errors per minute]; results shown separately for LHD and RHD signers.)


Supra-Sentential (Discourse) Deficits

One linguistic deficit associated with right hemisphere damage in hearing/speaking individuals involves discourse-level processes, for example, the ability to appropriately link (in production and comprehension) discourse referents across multiple sentences (Brownell & Stringfellow, 1999; Brownell et al., 2000; Wapner et al., 1981). These deficits manifest as failures to integrate information across sentences, including impairments in understanding jokes, in making inferences, and in adhering to the story line when producing a narrative. In contrast, phonological and syntactic processes in these hearing/speaking individuals appear to be intact. Using a story narration task with two deaf RHD signers, we documented two distinct types of discourse deficits (Hickok et al., 1999). The first involves a failure to adhere to the story line, evidenced by confabulatory or tangential utterances. The second involves errors in the use of the spatialized discourse of ASL. Discourse organization in ASL is unique in that discourse referents are established, referred to, and manipulated in a plane of signing space, and it was the ability to use this spatial mechanism in discourse that was disrupted in one of the patients studied. These results suggest that (a) the right hemisphere is involved in discourse processing in ASL, as it is in spoken language; and (b) there are dissociable subcomponents of discourse processes in ASL.

Classifier Constructions

American Sign Language contains an interesting type of construction, classifier signs, which is not typically assessed on standard aphasia exams in signed or spoken language. Classifier signs are complex forms that can be used to specify a range of spatial information: relative location, movement path, movement manner, and object size and shape (see Emmorey, 2002, for a review). These forms are typically composed of two parts: (1) a handshape configuration, where different handshapes can correspond to different semantic classes of object referents (e.g., people, vehicles, etc.), or to object shape-related properties (e.g., flat, narrow, etc.); and (2) a specification of the location or movement of the referent, denoted by the location/movement of the hand(s) in signing space. The linguistic status of classifier forms is a matter of debate, but what is clear is that they differ from canonical lexical signs in that they can encode information, for example spatial information, noncategorically. Whereas a lexical sign like DRIVE-TO can indicate that a vehicle was driven from one place to another, a classifier sign can convey more detailed information about the route traversed (e.g., that it was curvy and uphill; Figure 33.3, top).

It has been suggested that the right hemisphere may be more involved in processing classifier signs than in processing canonical lexical signs (Hickok, Bellugi, & Klima, 1998b; Poizner et al., 1987), and a handful of recent functional imaging studies have provided some evidence supporting this idea. For example, a PET study by Emmorey et al. (2002) found that deaf native signers activated parietal regions bilaterally when describing spatial relations using ASL classifier signs (compared to naming objects). See Campbell and Woll (2003) for additional discussion.


A study of ASL production using a narrative task in 21 unilateral brain-injured signers (13 LHD) reported that RHD signers made relatively few lexical errors but a substantial number of classifier errors, whereas LHD signers made lexical and classifier errors in roughly equal proportions (Figure 33.3, bottom; Hickok, Pickell, Klima, & Bellugi, in press). The source of the classifier errors is not clear. For example, it could be that classifier errors in RHD patients are caused by a fundamentally different deficit (e.g., some nonlinguistic spatial deficit) than those in LHD patients (e.g., linguistic form selection). What is clear is that the production of ASL classifier forms is supported to some extent by both the left and right hemispheres, whereas lexical sign production is under the control of predominantly left hemisphere systems.

FIGURE 33.3 An example of classifier form use showing a “vehicle” handshape moving upward along a curvy path (top). Proportion of lexical versus classifier sign errors in RHD and LHD signers (bottom; Hickok, Pickell, Klima, & Bellugi, in press).

HEMISPHERIC ASYMMETRIES FOR SPATIAL COGNITION

It appears that language functions have a similar hemispheric organization in deaf signers and hearing/speaking individuals. But what about nonlinguistic spatial functions? Might these abilities be organized differently in the brains of deaf signers? Available evidence suggests that the answer is no, and that the lateralization pattern of nonlinguistic spatial functions is also similar in deaf and hearing people.

Gross Visuospatial Deficits in RHD Signers

RHD in hearing people often leads to substantial visuospatial deficits evidenced, in the most severe cases, by grossly distorted productions in drawing tasks and block arrangement tasks (Kirk & Kertesz, 1994). Deaf RHD signers can have similar kinds of deficits (Figure 33.4). Despite sometimes severe nonlinguistic visuospatial impairments, none of the RHD signers had aphasia (Bellugi et al., 1989; Hickok et al., 1998a; Hickok et al., 1996a; Poizner et al., 1987).

Local/Global Differences

While gross visuospatial deficits may more commonly occur with RHD (both in deaf and hearing populations), it has been reported that some visuospatial deficits can be reliably observed in LHD hearing individuals (Delis, Kiefner, & Fridlund, 1988; Kirk & Kertesz, 1994). When LHD individuals have visuospatial deficits, they typically involve difficulties in attending to and/or reproducing the local-level details of a visuospatial stimulus, while global-configuration aspects are correctly identified/reproduced. RHD individuals tend to show the opposite pattern. Thus, it has been suggested that the left hemisphere is important for local-level visuospatial processes, whereas the right hemisphere is important for global-level processes (Delis et al., 1988). Does a similar asymmetry hold for the deaf signing population? To answer this question, a group of left- or right-lesioned deaf signers were asked to reproduce (1) two line drawings (a house and an elephant) and (2) four hierarchical figures (e.g., the letter ‘D’ composed of small ‘Y’s). Drawings were scored separately for the presence of local versus global features. Consistent with data from hearing patients, the LHD deaf subjects were significantly better at reproducing global-level features, whereas the RHD deaf subjects were significantly better at reproducing local-level features (Figure 33.4; Hickok, Kirk, & Bellugi, 1998).

FIGURE 33.4 Example copy-from-sample drawings by LHD and RHD signers: (a) line drawings; (b) hierarchical figures (an ‘A’ of M’s, a ‘D’ of Y’s, a ‘G’ of K’s, and an ‘S’ of J’s). Note that LHD signers are able to reproduce the basic configuration of the figures but can omit details, whereas the RHD signers often fail to reproduce the configural structure but include many details. Evidence of hemispatial neglect is also apparent in the RHD drawings (Hickok, Kirk, & Bellugi, 1998).
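The scoring logic for the hierarchical figures can be illustrated with a small hypothetical sketch in Python. The feature inventories and “drawings” below are invented for illustration; the actual scoring criteria are those described in Hickok, Kirk, and Bellugi (1998).

# Each drawing is scored separately for global features (the large-letter
# configuration) and local features (the small component letters).

def hit_rate(features_present, features_expected):
    """Proportion of expected features reproduced in the drawing."""
    hits = sum(1 for f in features_expected if f in features_present)
    return hits / len(features_expected)

# Invented feature inventories for a large 'D' composed of small 'Y's.
expected_global = {"large-D contour", "closed curve", "vertical stroke"}
expected_local = {"small Y 1", "small Y 2", "small Y 3", "small Y 4"}

# An LHD-like drawing: configuration preserved, details omitted.
lhd_drawing = {"large-D contour", "closed curve", "vertical stroke", "small Y 1"}
# An RHD-like drawing: details present, configuration lost.
rhd_drawing = {"small Y 1", "small Y 2", "small Y 3", "small Y 4"}

for name, drawing in [("LHD", lhd_drawing), ("RHD", rhd_drawing)]:
    print(name,
          "global hit rate:", hit_rate(drawing, expected_global),
          "local hit rate:", hit_rate(drawing, expected_local))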

Overall, available evidence suggests a similar pattern of hemispheric asymmetries for nonlinguistic spatial cognitive function in the deaf signing population.

WITHIN HEMISPHERE ORGANIZATION

Functional Aspects: Syndromes and Symptoms

To the extent that the types and patterns of deficits found in sign language aphasia are similar to those found in spoken language aphasia, a common functional organization for the two forms of language is suggested. There are indeed many commonalities in the individual language deficits found: many of the aphasic symptom clusters that have been observed in LHD deaf signers fall within the bounds of classical clinical syndromes defined on the basis of hearing aphasics (Damasio, 1992; Goodglass, 1993; Goodglass & Kaplan, 1983). For example, (a) nonfluent aphasic signers have lesions involving anterior language regions, and (b) fluent aphasic signers have lesions involving posterior language regions. In addition, the range of common deficit types that has been reported in hearing aphasics has been observed regularly in sign language aphasia. Examples include the presence of word (i.e., sign) finding problems in most cases of aphasia, paraphasic errors, and agrammatism, and the tendency for comprehension deficits to be more closely associated with fluent aphasia than with nonfluent aphasia. Moreover, the brain lesions producing these patterns of deficits in LHD signers are roughly consistent with clinical-anatomic correlations in hearing people (Hickok, 1992; Hickok & Bellugi, 2001; Hickok et al., 1998a; Poizner et al., 1987). To a first approximation, the within hemisphere organization of signed and spoken language appears to be remarkably similar.

Role of Broca’s Area

Broca’s area has figured prominently in attempts to determine the anatomy of speech production. We had the opportunity to investigate the role of Broca’s area in sign language production through an in-depth case study of LHD-130, a congenitally deaf, native user of ASL, who suffered an ischemic infarct involving the frontal operculum and the inferior portion of the primary motor cortex (Figure 33.5, top; Hickok et al., 1996b). Acutely, she presented with sign “mutism,” consistent with what one might expect in a hearing/speaking individual. Chronically, she had good comprehension and fluent production with occasional sign finding problems, semantic paraphasias, and what appeared to be a deficit involving the ability to control bimanual movements during sign production. That deficit showed up (a) in her tendency, on one-handed signs, to “shadow” with her nondominant hand the sign-articulatory gestures carried out by her dominant hand; (b) in her tendency, on two-handed signs, to assimilate the handshape and/or movement of the nondominant hand with that of the dominant hand; and (c) in her occasional failure to complete the movement of a two-handed sign when the movement’s endpoint involved contact between the two hands (Figure 33.5, bottom). We were not able to find any evidence of a bimanual control deficit in nonlinguistic tasks. Blumstein (1995) has suggested that speech production errors in anterior aphasia reflect a breakdown at the phonetic (not phonemic) level caused by a loss of the ability to coordinate independent speech articulators (e.g., larynx, tongue, lips). For a signer, the two hands are independent articulators that are often required to perform independent (i.e., nonsymmetric) movements. The deficit observed in LHD-130 may represent the sign analogue of a phonetic-level breakdown. This case suggests that Broca’s area plays an important role in sign production.

FIGURE 33.5 Lesion of case LHD-130 shown at cortical and subcortical levels (top), and two attempts at producing the two-handed target sign MOVE, illustrating her bimanual production errors (bottom; Hickok, Kritchevsky, Bellugi, & Klima, 1996b).

A Case of Sign Blindness

“Pure word blindness,” or “alexia without agraphia,” has been well documented in the literature (Friedman & Albert, 1985). Hearing/speaking patients with this disorder are typically blind in the right visual field (right homonymous hemianopia), have normal auditory-verbal language capacity, are able to write, but cannot read. The lesion typically involves left occipito-temporal cortex (explaining the visual field defect) and the splenium of the corpus callosum. Language areas are thus preserved, allowing normal production, auditory comprehension, and writing, but according to the classical disconnection analysis of the syndrome (Geschwind, 1965), these areas are isolated from visual input (because of cortical blindness in the right visual field and deafferentation of information coming from the left visual field through the splenium). An alternative possibility is that the occipito-temporal lesion damages a visual word form area, directly affecting perceptual reading centers (Beversdorf, Ratcliffe, Rhodes, & Reeves, 1997). Either way, one wonders what effects such a lesion would have on sign language comprehension in a deaf signer.

We had the opportunity to study such a case (Hickok et al., 1995). LHD-111 had a lesion involving all of the left primary visual cortex and most of area 18, with some extension into medial aspects of the temporal lobe (area 37); the lesion also clearly involved white matter fibers lateral to the splenium (Figure 33.6, top). Consistent with the neurological effects of such a lesion in hearing subjects, this deaf patient had a right visual field blindness and was alexic (i.e., she could not read written English). Her comprehension was profoundly impaired; she could not follow even simple one-step (ASL) commands such as “point to the floor” (Figure 33.6, bottom). Her single-sign comprehension was also significantly impaired, although to a lesser extent than her sentence comprehension, and virtually all of her comprehension errors were semantic in nature. Her visual object recognition, however, was unimpaired: she had no problem naming line drawings of objects presented to her visually. Her sign production was fluent and grammatical, although she did make occasional paraphasic errors. This case provides strong evidence favoring the view that the left hemisphere is dominant for the comprehension of ASL sentences in deaf individuals because it demonstrates that the right hemisphere by itself is severely constrained in its ability to process signed language. However, the case also suggests a more bilateral organization for early stages of processing in single sign recognition, in that her comprehension errors seemed to indicate a semantically underspecified representation rather than a disruption of sign phonological information. A similar pattern of bilateral organization at early stages of spoken word recognition has been reported (Hickok & Poeppel, 2000, 2004, 2007). Finally, this case shows a dramatic difference in the within hemisphere organization of signed versus spoken language processing resulting from differences in the input modality between the two language systems.


FIGURE 33.6 A left occipital lesion in a deaf signer, LHD-111 (top), and a graph showing her rating scale profile of sign characteristics (bottom), with 1–7 ratings for melodic line, phrase length, articulatory agility, grammatical form, paraphasia in running sign, sign finding, and sign comprehension. Note that the patient’s production scales are in the normal or near normal range, whereas her sign comprehension is at floor (Hickok, G., Klima, E., Kritchevsky, M., & Bellugi, U., Neuropsychologia, 33, 1597–1606, 1995.)


Neurology of Sign Comprehension

Auditory comprehension deficits in aphasia in hearing/speaking individuals are most closely associated with left temporal lobe damage (Bates et al., 2003; Dronkers, Wilkins, Van Valin, Redfern, & Jaeger, 2004). We investigated the relative roles of the left versus right temporal lobes in the comprehension of ASL (Hickok et al., 2002). Nineteen lifelong signers with unilateral brain lesions (11 LHD, 8 RHD) performed three tasks: an isolated single-sign comprehension task, a sentence-level comprehension task involving one-step commands, and a sentence-level comprehension task involving more complex multiclause/multistep commands. Performance was examined in relation to two factors: whether the lesion was in the right or left hemisphere, and whether or not the temporal lobe was involved. The LHD group performed significantly worse than the RHD group on all three tasks, confirming left hemisphere dominance for sign language comprehension. The group with left temporal lobe involvement was significantly impaired on all tasks, although only minimally so on the single-sign task, whereas the other three groups performed at better than 95% correct on the single-sign and simple sentence comprehension tasks, with performance falling off only on the complex sentence comprehension items. A comparison with previously published data (Swisher & Sarno, 1969) suggests that the degree of difficulty exhibited by the deaf RHD group on the complex sentences is comparable to that observed in hearing RHD subjects. This result suggests that language comprehension, particularly at the lexical-semantic and sentence levels, depends primarily on the integrity of the left temporal lobe, independent of modality.
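The two-factor logic of this analysis can be illustrated with a short hypothetical Python sketch: patients are grouped by lesion hemisphere and by temporal-lobe involvement, and mean accuracy is tabulated per group for each task. All patient records below are invented; the real data are reported in Hickok et al. (2002).

from collections import defaultdict

# (hemisphere, temporal_involved, single_sign, simple_sentence, complex_sentence)
patients = [
    ("L", True,  88, 70, 41), ("L", True,  85, 66, 38),
    ("L", False, 97, 96, 76), ("L", False, 96, 95, 74),
    ("R", True,  98, 96, 80), ("R", True,  99, 97, 82),
    ("R", False, 99, 97, 84), ("R", False, 98, 96, 85),
]

groups = defaultdict(list)
for hemi, temporal, *scores in patients:
    groups[(hemi, temporal)].append(scores)

for (hemi, temporal), rows in sorted(groups.items()):
    means = [sum(col) / len(col) for col in zip(*rows)]  # per-task means
    label = f"{hemi} hemisphere, temporal lobe {'involved' if temporal else 'spared'}"
    print(label, "->", [round(m, 1) for m in means])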

DISSOCIATIONS

The functional divisions within the neural systems supporting language and other cognitive abilities have been highlighted by several dissociations observed in deaf signers.

Dissociations Between Linguistic and Nonlinguistic Spatial Abilities

It was noted above that LHD, but not RHD, frequently produces aphasia in deaf signers, whereas RHD, but not LHD, frequently produces gross visuospatial deficits. This pattern constitutes a double dissociation between sublexical-, lexical-, and sentence-level aspects of spatialized linguistic ability on the one hand, and gross nonlinguistic spatial cognitive ability on the other. Additional dissociations between sign language abilities and nonlinguistic spatial abilities have been demonstrated both within the left hemisphere and within the right hemisphere. Within the left hemisphere, we examined the relation between local-level visuospatial deficits evident on a drawing copy task and several measures of sign language ability, including rate of paraphasias in running sign, single sign comprehension, and sentence-level comprehension (Hickok, Kirk, & Bellugi, 1998). No significant correlations were found between the hit rate for local features in the drawing copy task and any of the sign language performance measures. In fact, cases were identified in which local-level scores were near perfect, yet scores on tests of sign language ability were among the worst in the sample. This suggests that aphasic deficits cannot be reduced to a more general deficit in local-level visuospatial processing. Within the right hemisphere, two case studies have provided evidence that the ability to use the spatialized referential system in ASL discourse does not depend substantially on the nonlinguistic visuospatial abilities of the right hemisphere (Hickok et al., 1999). Case RHD-221 had severe visuospatial deficits following a large right perisylvian stroke, yet was not impaired in his ability to set up and utilize spatial loci for referential purposes. Case RHD-207 showed the reverse pattern: her performance on standard visuospatial tasks was quite good, yet she had difficulty with spatialized aspects of ASL discourse. This finding hints at the possibility that nonidentical neural systems within the right hemisphere support spatialized discourse functions versus nonlinguistic spatial abilities.

Dissociations Between Sign and Gesture

Evidence supporting the view that deficits in sign language are qualitatively different from deficits in the ability to produce and understand pantomimic gesture comes from a case study of an LHD signer (Corina et al., 1992). Following an ischemic infarct involving both anterior and posterior perisylvian regions, LHD-108 became aphasic for sign language. His comprehension was poor, and his sign production was characterized by frequent paraphasias, reduced grammatical structure, and a tendency to substitute pantomime for ASL signs—a tendency not present prior to his stroke. These pantomimic gestures were used even in cases in which the gesture involved similar or more elaborate sequences of movements, arguing against a complexity-based explanation of his performance. LHD-108 showed a similar dissociation in his comprehension of signs versus pantomime: he had more trouble matching a sign to a picture than matching a pantomimed gesture to a picture. This case makes the point that disruptions in sign language ability are not merely the result of more general disruptions in the ability to communicate through symbolic gesture. Since this initial report, we have seen several additional patients who show a similar tendency to use gesture in place of lexical signs.

EVIDENCE FROM FUNCTIONAL NEUROIMAGING

Lesion evidence has indicated clearly that hemispheric asymmetries for signed and spoken language are similar, and has provided some indication that the within hemisphere organization of signed language is similar in some ways to that of spoken language, but perhaps different in others. But the spatial resolution of the lesion method is relatively poor, particularly in a rare population, limiting the amount of information one can derive from lesion studies alone. Recent functional imaging studies have provided additional insights into the within hemisphere organization of sign language processing and have highlighted both similarities and differences.

Neural Systems Underlying Signed Language Production

One of the first questions that neuroimaging studies of signed language sought to address was whether Broca’s area—a putative speech production region—is involved in signed language production. Lesion evidence suggested that it is (see above), but functional imaging promised to shed additional light on the question and, further, to assess whether the same degree of left lateralization might be found in sign versus speech production. This was an open question because of the bimanual nature of signed language production. Several studies of signed language production have now been published using PET as well as fMRI methodologies (Corina, San Jose-Robertson, Guillemin, High, & Braun, 2003; Emmorey, Mehta, & Grabowski, 2007; Kassubek, Hickok, & Erhard, 2004; McGuire et al., 1997; Pa, Wilson, Pickell, Bellugi, & Hickok, 2008; Petitto et al., 2000). A consistent finding is that Broca’s area is indeed involved in signed language production and that this activity is strongly left dominant; this is true even when the sign articulation involves one-handed signs produced with the nondominant hand (Corina et al., 2003). Such a result could be interpreted in one of two ways: either a canonical “speech area,” Broca’s region, has been recruited to support manual language production in deaf signers, or Broca’s area is not a speech-dedicated region. Given that Broca’s area has been implicated in a range of nonspeech, even nonlanguage, functions in the last several years (Fink et al., 2006; Schubotz & von Cramon, 2004), the latter possibility seems likely. More research is needed to sort out the functional organization of this brain region.

Language production is not merely a function of the frontal lobe. Regions of the posterior temporal lobe also play an important role in speech production (Hickok & Poeppel, 2007; Indefrey & Levelt, 2004). Signed language production likewise involves nonfrontal structures, including some areas in the posterior inferior temporal lobe that appear to be shared with spoken language production (Emmorey et al., 2007). Signed language production also seems to recruit structures in the posterior parietal lobe that are unique to sign (Buchsbaum et al., 2005; Corina et al., 2003; Emmorey et al., 2007). The involvement of these regions may reflect sensory-motor functions that are specific to manual gestures (Buchsbaum et al., 2005; Emmorey et al., 2007).

Neural Systems Underlying Sign Language Perception/Comprehension

A number of studies of signed language perception have found activation of superior temporal regions; that is, regions that are also typically implicated in spoken language perception (Binder et al., 2000; Hickok & Poeppel, 2007). One of the early fMRI studies hinted that sign perception may involve a more bilateral network than processing spoken language (studied via written presentation; Neville et al., 1998), but subsequent work has shown similar lateralization patterns when sign perception is compared to audiovisually presented speech (MacSweeney et al., 2002). There are differences, however, in the within hemisphere activation patterns for the perception of signed and spoken language. Pa et al. (2008) presented ASL nonsigns and English nonwords (forms that are phonotactically permissible in their respective languages but do not correspond to a word) to bilingual hearing signers who had native-level proficiency in both languages (they were hearing offspring of deaf signing adults). This allowed the direct comparison of activation patterns for sign and speech perception within the same individual’s brain. Sign perception activated large regions of occipito-temporal cortex, portions of the posterior superior temporal lobe bilaterally, posterior parietal cortex, and posterior frontal regions. Speech perception activated most of the superior temporal lobe and posterior frontal regions. Overlap between the two language forms was found in the posterior superior temporal cortex and in the posterior frontal lobe (Figure 33.7). Thus speech activated auditory-related areas in the superior, middle, and anterior temporal lobe more than sign (consistent with the input modality for speech), and sign activated visual-related areas in the ventral occipito-temporal lobe more than speech (consistent with the input modality for sign). Sign also activated posterior parietal areas whereas speech did not, possibly reflecting visual-manual sensory-motor mechanisms.
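The overlap analysis in such studies can be sketched as a voxelwise conjunction: a voxel counts as shared if it exceeds threshold in both the speech map and the sign map (a minimum-statistic conjunction). The sketch below uses random numbers in place of real statistical maps and an arbitrary threshold; it illustrates the logic only, not the pipeline used by Pa et al. (2008).

import numpy as np

rng = np.random.default_rng(0)
speech_t = rng.normal(size=(64, 64, 40))  # stand-in t-map, speech > baseline
sign_t = rng.normal(size=(64, 64, 40))    # stand-in t-map, sign > baseline
threshold = 3.0                           # arbitrary, for illustration

speech_mask = speech_t > threshold
sign_mask = sign_t > threshold
conjunction = np.minimum(speech_t, sign_t) > threshold  # shared voxels

print("speech-only voxels:", (speech_mask & ~sign_mask).sum())
print("sign-only voxels:  ", (sign_mask & ~speech_mask).sum())
print("overlap voxels:    ", conjunction.sum())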

One fMRI study attempted to identify regions that are specifically activated during sign perception above and beyond the perception of nonlinguistic gestures (MacSweeney et al., 2004). These investigators showed videos of British Sign Language (BSL) or videos of a non-BSL gestural communication system (“TicTac,” a gestural code used by racetrack bookmakers) to hearing nonsigners and deaf native signers of BSL. Deaf BSL signers showed greater activation in the left posterior superior temporal sulcus and at the temporal-parietal boundary during the perception of BSL compared to TicTac. Hearing nonsigners showed no differences in their brain response within this region (although there were differences elsewhere). It is unclear from these findings exactly what this brain region is doing with respect to sign language processing; however, it is interesting that in the study described above a very similar region showed responsivity to both speech and sign in hearing bilingual (English and ASL) signers. This hints that the region supports some type of language-specific function.

Neural Substrate for Working Memory for Signs

The neural basis of sign language working memory has not received extensive attention, but it has the potential to shed light on some fundamental questions regarding short-term memory generally. For example, the nature of the representations maintained in verbal working memory is still open to debate. Some authors argue for sensory-motor based codes (modality-dependent; Buchsbaum & D’Esposito, 2008; Hickok, Buchsbaum, Humphries, & Muftuler, 2003; Wilson, 2001), while others promote a modality-independent model (Baddeley, 1992; Jones & Macken, 1996). Sign language provides a unique perspective on this issue because it is possible to manipulate the sensory-motor modality while keeping the linguistic nature of the stimuli effectively constant. One can then ask: is the neural organization of working memory for an acoustic-vocal language different from that for a visual-manual language?

FIGURE 33.7 fMRI activations during the perception of speech versus sign in hearing bilingual (English and ASL) participants, shown on sagittal slices at x = ±41, ±49, and ±57 for the conditions hear speech, view signs, and the conjunction of speech and sign (Pa, J., Wilson, S. M., Pickell, B., Bellugi, U., & Hickok, G., Journal of Cognitive Neuroscience, 20, 2198–2210, 2008.)

One study examined working memory for sign language in deaf native signers using fMRI (Buchsbaum et al., 2005) and compared its findings to those of a similar published study involving speech in hearing nonsigners (Hickok et al., 2003). The study used a design with a delay period of several seconds between encoding and recall, which allowed for the measurement of storage-related activity. The pattern of activation during the retention phase was substantially different from what had been found in hearing participants performing a similar task with speech. Short-term maintenance of sign language stimuli produced prominent activations in the posterior parietal lobe, which were not found in the speech study. This parietal activation was interpreted as a reflection of visual-motor integration processes: posterior parietal regions have been implicated in visual-motor integration in both human and nonhuman primates (Andersen, 1997; Milner & Goodale, 1995), and studies of gesture imitation in hearing subjects report parietal activation (Chaminade, Meltzoff, & Decety, 2005). It therefore seems likely that parietal activation in a short-term memory task for sign language does not reflect activity of a sensory store, but instead results from sensory-motor processes underlying the interaction of storage and manual articulatory rehearsal. Additional maintenance activity was found in the posterior superior temporal lobe, as well as in posterior frontal regions, both of which have been shown to activate during maintenance of speech information, suggestive of some form of common process. However, because cross-modality comparisons could only be made between subjects and studies, it is difficult to make solid inferences about patterns of overlap and dissociation. Notably, no maintenance activity was found in visual-related areas in that study, such as the ventral temporal-occipital regions that are so strongly activated during sign perception; activation in those regions would have provided more convincing support for sensory-dependent working memory storage systems.

A follow-up study directly compared working memory for signed and spoken language using hearing bilinguals (native in English and ASL; Pa et al., 2008). This study clearly demonstrated sensory modality-specific activity in working memory. Delay activity was found in left ventral occipito-temporal (i.e., visual) areas during sign maintenance but not during speech maintenance, whereas delay activity was found in left superior temporal (i.e., auditory) areas during speech maintenance but not sign maintenance. However, regions of overlap were also noted, both in lateral frontal lobe regions and in a small focus in the posterior superior temporal lobe (Figure 33.8). These findings indicate that there may be both modality-dependent and modality-independent components to working memory for linguistic stimuli.
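The delay-period logic behind these working memory findings can be sketched as a simple region-of-interest (ROI) analysis: average the signal over the maintenance interval in each ROI and compare the sign and speech conditions. The time courses, ROI names, and delay window below are invented for illustration; the actual analysis is described in Pa et al. (2008).

import numpy as np

delay_window = slice(4, 8)  # volumes covering the delay period (assumed)

# Invented percent-signal-change time courses per ROI and condition.
rois = {
    "L occipito-temporal (visual)": {
        "sign":   [0.10, 0.30, 0.60, 0.70, 0.65, 0.60, 0.55, 0.20],
        "speech": [0.10, 0.20, 0.15, 0.10, 0.05, 0.10, 0.05, 0.00],
    },
    "L superior temporal (auditory)": {
        "sign":   [0.10, 0.15, 0.10, 0.05, 0.10, 0.05, 0.10, 0.00],
        "speech": [0.10, 0.30, 0.55, 0.60, 0.60, 0.50, 0.45, 0.20],
    },
}

for roi, conds in rois.items():
    sign_delay = np.mean(conds["sign"][delay_window])
    speech_delay = np.mean(conds["speech"][delay_window])
    print(f"{roi}: sign delay = {sign_delay:.2f}, speech delay = {speech_delay:.2f}")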

SUMMARY

The lateralization pattern of the neural systems supporting sign language processing is remarkably similar to that found for spoken language. Likewise, the patterns of aphasic deficits found among brain-injured deaf signers are quite recognizable in the context of aphasiological research on hearing individuals with acquired language disorders. However, differences in the neural organization of signed and spoken language processing have emerged in recent years. These differences appear to be tied to the unique sensory-motor demands of the two language forms.

FIGURE 33.8 fMRI activations during the short-term maintenance of speech versus sign in hearing bilingual (English and ASL) participants, shown on sagittal slices at x = ±41, ±49, and ±57 for the conditions speech, signs, and the conjunction of speech and sign (Pa, J., Wilson, S. M., Pickell, B., Bellugi, U., & Hickok, G., Journal of Cognitive Neuroscience, 2008.)


REFERENCES

Andersen, R. (1997). Multimodal integration for the representation of space in the posterior parietal cortex. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 352, 1421–1428.

Baddeley, A. D. (1992). Working memory. Science, 255, 556–559.

Bates, E., Wilson, S. M., Saygin, A. P., Dick, F., Sereno, M. I., Knight, R. T., & Dronkers, N. F. (2003). Voxel-based lesion-symptom mapping. Nature Neuroscience, 6(5), 448–450.

Bellugi, U., Poizner, H., & Klima, E. (1989). Language, modality, and the brain. Trends in Neurosciences, 12, 380–388.

Beversdorf, D. Q., Ratcliffe, N. R., Rhodes, C. H., & Reeves, A. G. (1997). Pure alexia: Clinical-pathologic evidence for a lateralized visual language association cortex. Clinical Neuropathology, 16(6), 328–331.

Binder, J. R., Frost, J. A., Hammeke, T. A., Bellgowan, P. S., Springer, J. A., Kaufman, J. N., & Possing, E. T. (2000). Human temporal lobe activation by speech and nonspeech sounds. Cerebral Cortex, 10, 512–528.

Blumstein, S. (1995). The neurobiology of the sound structure of language. In M. S. Gazzaniga (Ed.), The cognitive neurosciences (pp. 913–929). Cambridge, MA: MIT Press.

Brownell, H. H., Potter, H. H., Bihrle, A. M., & Gardner, H. (1986). Inference deficits in right brain-damaged patients. Brain and Language, 27, 310–321.

Buchsbaum, B. R., & D’Esposito, M. (2008). The search for the phonological store: From loop to convolution. Journal of Cognitive Neuroscience, 20(5), 762–778.

Buchsbaum, B., Pickell, B., Love, T., Hatrak, M., Bellugi, U., & Hickok, G. (2005). Neural substrates for verbal working memory in deaf signers: fMRI study and lesion case report. Brain and Language, 95(2), 265–272.

Campbell, R., & Woll, B. (2003). Space is special in sign. Trends in Cognitive Sciences, 7, 5–7.

Chaminade, T., Meltzoff, A. N., & Decety, J. (2005). An fMRI study of imitation: Action representation and body schema. Neuropsychologia, 43(1), 115–127.

Corina, D., Poizner, H., Bellugi, U., Feinberg, T., Dowd, D., & O’Grady-Batch, L. (1992). Dissociation between linguistic and non-linguistic gestural systems: A case for compositionality. Brain and Language, 43, 414–447.

Corina, D. P., San Jose-Robertson, L., Guillemin, A., High, J., & Braun, A. R. (2003). Language lateralization in a bimanual language. Journal of Cognitive Neuroscience, 15(5), 718–730.

Damasio, A. R. (1992). Aphasia. New England Journal of Medicine, 326, 531–539.

Delis, D. C., Kiefner, M. G., & Fridlund, A. J. (1988). Visuospatial dysfunction following unilateral brain damage: Dissociations in hierarchical and hemispatial analysis. Journal of Clinical and Experimental Neuropsychology, 10, 421–431.

Dronkers, N. F., Wilkins, D. P., Van Valin, Jr., R. D., Redfern, B. B., & Jaeger, J. J. (2004). Lesion analysis of brain regions involved in language comprehension. In G. Hickok & D. Poeppel (Eds.), The New Functional Anatomy of Language: A Special Issue of Cognition, 92(1–2), 145–177.

Emmorey, K. (2002). Language, cognition, and the brain: Insights from sign language research. Mahwah, NJ: Lawrence Erlbaum and Associates.

Emmorey, K., Damasio, H., McCullough, S., Grabowski, T., Ponto, L. L., Hichwa, R. D., & Bellugi, U. (2002). Neural systems underlying spatial language in American Sign Language. Neuroimage, 17(2), 812–824.

Emmorey, K., Mehta, S., & Grabowski, T. J. (2007). The neural correlates of sign versus word production. Neuroimage, 36, 202–208.


Fink, G. R., Manjaly, Z. M., Stephan, K. E., Gurd, J. M., Zilles, K., Amunts, K., & Marshall, J. C. (2006). A role for Broca’s area beyond language processing: Evidence from neuropsychology and fMRI. In Y. Grodzinsky & K. Amunts (Eds.), Broca’s region (pp. 254–268). Oxford, UK: Oxford University Press.

Friedman, R. B., & Albert, M. L. (1985). Alexia. In K. M. Heilman & E. Valenstein (Eds.), Clinical neuropsychology. New York, NY: Oxford University Press.

Geschwind, N. (1965). Disconnexion syndromes in animals and man. Brain, 88, 237–294, 585–644.

Goodglass, H. (1993). Understanding aphasia. San Diego, CA: Academic Press.

Goodglass, H., & Kaplan, E. (1983). The assessment of aphasia and related disorders (2nd ed.). Philadelphia, PA: Lea & Febiger.

Hickok, G. (1992). Agrammatic comprehension and the trace-deletion hypothesis (Occasional Paper 45). Cambridge, MA: MIT Center for Cognitive Science.

Hickok, G., & Bellugi, U. (2001). The signs of aphasia. In R. S. Berndt (Ed.), Handbook of neuropsychology (2nd ed., Vol. 3, pp. 31–50). New York, NY: Elsevier.

Hickok, G., Bellugi, U., & Klima, E. S. (1998a). The neural organization of language: Evidence from sign language aphasia. Trends in Cognitive Sciences, 2, 129–136.

Hickok, G., Bellugi, U., & Klima, E. S. (1998b). What’s right about the neural organization of sign language? A perspective on recent neuroimaging results. Trends in Cognitive Sciences, 2, 465–468.

Hickok, G., Buchsbaum, B., Humphries, C., & Muftuler, T. (2003). Auditory-motor interaction revealed by fMRI: Speech, music, and working memory in area Spt. Journal of Cognitive Neuroscience, 15, 673–682.

Hickok, G., Kirk, K., & Bellugi, U. (1998). Hemispheric organization of local- and global-level visuospatial processes in deaf signers and its relation to sign language aphasia. Brain and Language, 65, 276–286.

Hickok, G., Klima, E. S., & Bellugi, U. (1996a). The neurobiology of signed language and its implications for the neural basis of language. Nature, 381, 699–702.

Hickok, G., Klima, E., Kritchevsky, M., & Bellugi, U. (1995). A case of “sign blindness” following left occipital damage in a deaf signer. Neuropsychologia, 33, 1597–1606.

Hickok, G., Kritchevsky, M., Bellugi, U., & Klima, E. S. (1996b). The role of the left frontal operculum in sign language aphasia. Neurocase, 2, 373–380.

Hickok, G., Love-Geffen, T., & Klima, E. S. (2002). Role of the left hemisphere in sign language comprehension. Brain and Language, 82, 167–178.

Hickok, G., Pickell, H., Klima, E. S., & Bellugi, U. (in press). Neural dissociation in the production of lexical versus classifier signs in ASL: Distinct patterns of hemispheric asymmetry. Neuropsychologia.

Hickok, G., & Poeppel, D. (2000). Towards a functional neuroanatomy of speech perception. Trends in Cognitive Sciences, 4, 131–138.

Hickok, G., & Poeppel, D. (2004). Dorsal and ventral streams: A framework for understanding aspects of the functional anatomy of language. Cognition, 92, 67–99.

Hickok, G., & Poeppel, D. (2007). The cortical organization of speech processing. Nature Reviews Neuroscience, 8(5), 393–402.

Hickok, G., Wilson, M., Clark, K., Klima, E. S., Kritchevsky, M., & Bellugi, U. (1999). Discourse deficits following right hemisphere damage in deaf signers. Brain and Language, 66, 233–248.

Hillis, A. E. (2007). Aphasia: Progress in the last quarter of a century. Neurology, 69(2), 200–213.

Indefrey, P., & Levelt, W. J. (2004). The spatial and temporal signatures of word production components. Cognition, 92(1–2), 101–144.


Jones, D. M., & Macken, W. J. (1996). Irrelevant tones produce an irrelevant speech effect: Implications for phonological coding in working memory. Journal of Experimental Psychology: Learning, Memory, and Cognition, 19, 369–381.

Kassubek, J., Hickok, G., & Erhard, P. (2004). Involvement of classical anterior and posterior language areas in sign language production, as investigated by 4 T functional magnetic resonance imaging. Neuroscience Letters, 364(3), 168–172.

Kirk, A., & Kertesz, A. (1994). Localization of lesions in constructional impairment. In A. Kertesz (Ed.), Localization and neuroimaging in neuropsychology (pp. 525–544). San Diego, CA: Academic Press.

Klima, E., & Bellugi, U. (1979). The signs of language. Cambridge, MA: Harvard University Press.

MacSweeney, M., Campbell, R., Woll, B., Giampietro, V., David, A. S., McGuire, P. K., . . . Brammer, M. J. (2004). Dissociating linguistic and nonlinguistic gestural communication in the brain. Neuroimage, 22(4), 1605–1618.

MacSweeney, M., Woll, B., Campbell, R., McGuire, P. K., David, A. S., Williams, S. C., . . . Brammer, M. J. (2002). Neural systems underlying British Sign Language and audio-visual English processing in native users. Brain, 125(Pt 7), 1583–1593.

McGuire, P. K., Robertson, D., Thacker, A., David, A. S., Kitson, N., Frackowiak, R. S., & Frith, C. D. (1997). Neural correlates of thinking in sign language. Neuroreport, 8, 695–698.

Milner, A. D., & Goodale, M. A. (1995). The visual brain in action. Oxford, UK: Oxford University Press.

Neville, H., Bavelier, D., Corina, D., Rauschecker, J., Karni, A., Lalwani, A., . . . Turner, R. (1998). Cerebral organization for language in deaf and hearing subjects: Biological constraints and effects of experience. Proceedings of the National Academy of Sciences, 95, 922–929.

Neville, H. J., Coffey, S. A., Lawson, D. S., Fischer, A., Emmorey, K., & Bellugi, U. (1997). Neural systems mediating American Sign Language: Effects of sensory experience and age of acquisition. Brain and Language, 57, 285–308.

Newport, E., & Meier, R. (1985). The acquisition of American Sign Language. In D. I. Slobin (Ed.), The crosslinguistic study of language acquisition: Volume 1: The data (pp. 881–938). Hillsdale, NJ: Lawrence Erlbaum Associates.

Pa, J., Wilson, S. M., Pickell, B., Bellugi, U., & Hickok, G. (2008). Neural organization of linguistic short-term memory is sensory modality-dependent: Evidence from signed and spoken language. Journal of Cognitive Neuroscience, 20, 2198–2210.

Petitto, L. A., Zatorre, R. J., Gauna, K., Nikelski, E. J., Dostie, D., & Evans, A. C. (2000). Speech-like cerebral activity in profoundly deaf people processing signed languages: Implications for the neural basis of human language. Proceedings of the National Academy of Sciences, 97(25), 13961–13966.

Pickell, H., Klima, E., Love, T., Kritchevsky, M., Bellugi, U., & Hickok, G. (2005). Sign language aphasia following right hemisphere damage in a left-hander: A case of reversed cerebral dominance in a deaf signer? Neurocase, 11(3), 194–203.

Poizner, H., Klima, E. S., & Bellugi, U. (1987). What the hands reveal about the brain. Cambridge, MA: MIT Press.

Schubotz, R. I., & von Cramon, D. Y. (2004). Sequences of abstract nonbiological stimuli share ventral premotor cortex with action observation and imagery. Journal of Neuroscience, 24(24), 5467–5474.

Swisher, L. P., & Sarno, M. T. (1969). Token Test scores of three matched patient groups: Left brain-damaged with aphasia, right brain-damaged without aphasia, non-brain damaged. Cortex, 5, 264–273.


Tallal, P., Miller, S., & Fitch, R. H. (1993). Neurobiological basis of speech: A case for the preeminence of temporal processing. Annals of the New York Academy of Sciences, 682, 27–47.

Wilson, M. (2001). The case for sensorimotor coding in working memory. Psychonomic Bulletin & Review, 8, 44–57.
