liu.se: Search for publications in DiVA
Woll, Bencie
Publications (5 of 5)
Cardin, V., Smittenaar, R. C., Orfanidou, E., Rönnberg, J., Capek, C. M., Rudner, M. & Woll, B. (2016). Differential activity in Heschl's gyrus between deaf and hearing individuals is due to auditory deprivation rather than language modality. NeuroImage, 124, 96-106
2016 (English). In: NeuroImage, ISSN 1053-8119, E-ISSN 1095-9572, Vol. 124, p. 96-106. Article in journal (Refereed). Published.
Abstract [en]

Sensory cortices undergo crossmodal reorganisation as a consequence of sensory deprivation. Congenital deafness in humans represents a particular case compared with other types of sensory deprivation, because cortical reorganisation is not only a consequence of auditory deprivation, but also of language-driven mechanisms. Visual crossmodal plasticity has been found in secondary auditory cortices of deaf individuals, but it is still unclear whether reorganisation also takes place in primary auditory areas, and how this relates to language modality and auditory deprivation.

Here, we dissociated the effects of language modality and auditory deprivation on crossmodal plasticity in Heschl's gyrus as a whole, and in cytoarchitectonic region Te1.0 (likely to contain the core auditory cortex). Using fMRI, we measured the BOLD response to viewing sign language in congenitally or early deaf individuals with and without sign language knowledge, and in hearing controls.

Results show that differences between hearing and deaf individuals are due to a reduction in activation caused by visual stimulation in the hearing group, which is more significant in Te1.0 than in Heschl's gyrus as a whole. Furthermore, differences between deaf and hearing groups are due to auditory deprivation, and there is no evidence that the modality of language used by deaf individuals contributes to crossmodal plasticity in Heschl's gyrus.

Keywords
Heschl's gyrus, Deafness, Sign language, Speech, fMRI
National Category
Neurosciences
Identifiers
urn:nbn:se:liu:diva-123221 (URN)
10.1016/j.neuroimage.2015.08.073 (DOI)
000366646700011 ()
26348556 (PubMedID)
Note

Funding agencies: Riksbankens Jubileumsfond [P2008-0481:1-E]; Swedish Council for Working Life and Social Research [2008-0846]; Swedish Research Council [349-2007-8654]; Economic and Social Research Council of Great Britain [RES-620-28-6001, RES-620-28-0002]

Available from: 2015-12-08 Created: 2015-12-08 Last updated: 2018-01-10
Cardin, V., Orfanidou, E., Kästner, L., Rönnberg, J., Woll, B., Capek, C. & Rudner, M. (2016). Monitoring Different Phonological Parameters of Sign Language Engages the Same Cortical Language Network but Distinctive Perceptual Ones. Journal of cognitive neuroscience, 28(1), 20-40
2016 (English). In: Journal of Cognitive Neuroscience, ISSN 0898-929X, E-ISSN 1530-8898, Vol. 28, no 1, p. 20-40. Article in journal (Refereed). Published.
Abstract [en]

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

National Category
Psychology (excluding Applied Psychology)
Identifiers
urn:nbn:se:liu:diva-123220 (URN)
10.1162/jocn_a_00872 (DOI)
000365750400003 ()
26351993 (PubMedID)
Note

Funding agencies: Riksbankens Jubileumsfond [P2008-0481:1-E]; Swedish Council for Working Life and Social Research [2008-0846]; Swedish Research Council (Linnaeus Centre HEAD); Economic and Social Research Council of Great Britain [RES-620-28-6001, RES-620-28-6002]

Available from: 2015-12-08 Created: 2015-12-08 Last updated: 2017-12-01
Rudner, M., Orfanidou, E., Cardin, V., Capek, C. M., Woll, B. & Rönnberg, J. (2016). Preexisting semantic representation improves working memory performance in the visuospatial domain. Memory & Cognition, 44(4), 608-620
2016 (English). In: Memory & Cognition, ISSN 0090-502X, E-ISSN 1532-5946, Vol. 44, no 4, p. 608-620. Article in journal (Refereed). Published.
Abstract [en]

Working memory (WM) for spoken language improves when the to-be-remembered items correspond to preexisting representations in long-term memory. We investigated whether this effect generalizes to the visuospatial domain by administering a visual n-back WM task to deaf signers and hearing signers, as well as to hearing nonsigners. Four different kinds of stimuli were presented: British Sign Language (BSL; familiar to the signers), Swedish Sign Language (SSL; unfamiliar), nonsigns, and nonlinguistic manual actions. The hearing signers performed better with BSL than with SSL, demonstrating a facilitatory effect of preexisting semantic representation. The deaf signers also performed better with BSL than with SSL, but only when WM load was high. No effect of preexisting phonological representation was detected. The deaf signers performed better than the hearing nonsigners with all sign-based materials, but this effect did not generalize to nonlinguistic manual actions. We argue that deaf signers, who are highly reliant on visual information for communication, develop expertise in processing sign-based items, even when those items do not have preexisting semantic or phonological representations. Preexisting semantic representation, however, enhances the quality of the gesture-based representations temporarily maintained in WM by this group, thereby releasing WM resources to deal with increased load. Hearing signers, on the other hand, may make strategic use of their speech-based representations for mnemonic purposes. The overall pattern of results is in line with flexible-resource models of WM.

Place, publisher, year, edition, pages
Springer, 2016
Keywords
Working memory, Visuospatial, Sign language, Deafness, Semantic
National Category
Psychology (excluding Applied Psychology)
Identifiers
urn:nbn:se:liu:diva-126032 (URN)
10.3758/s13421-016-0585-z (DOI)
000374335500007 ()
26800983 (PubMedID)
Note

Funding agencies: Riksbankens jubileumsfond [P2008-0481:1-E]; Economic and Social Research Council of Great Britain [RES-620-28-6001, RES-620-28-0002]

Available from: 2016-03-11 Created: 2016-03-11 Last updated: 2018-03-21
Cardin, V., Rudner, M., Ferraz De Oliveira, R., Andin, J., Beese, L., Woll, B. & Rönnberg, J. (2015). A working memory role for superior temporal cortex in deaf individuals independently of linguistic content. Paper presented at the Conference on Cognitive Hearing Science for Communication (CHSCOM2015), Linköping, June 14-17, 2015.
2015 (English). Conference paper, oral presentation with published abstract (Refereed).
Abstract [en]

Studies of sign languages have been used to test traditional cognitive models of working memory (WM) that distinguish between verbal and visuospatial WM (e.g. Baddeley, 2003), without considering that sign languages operate in the visuospatial domain. Previous studies have shown that WM mental representations and processes are largely similar for signed and spoken languages (e.g. Rönnberg et al., 2004). However, it is not clear to what extent visual WM processes aid and support sign language WM.

Here we characterise the neural substrates supporting sign language and visual WM, and the mechanisms that subserve differential processing for signers and for deaf individuals. We conducted a functional magnetic resonance imaging (fMRI) experiment with three groups of participants: deaf native signers, hearing native signers and hearing non-signers. Participants performed a 2-back WM task and a control task on two sets of stimuli: signs from British Sign Language or nonsense objects. Stimuli were composed of point-lights to control for differences in visual features.

Our results show activation in a fronto-parietal network for WM processing in all groups, independently of stimulus type, in agreement with previous literature. We also replicate previous findings in deaf signers showing a stronger right posterior superior temporal cortex (STC) activation for visuospatial processing, and stronger bilateral STC activation for sign language stimuli.

Group comparisons further reveal stronger activations in STC for WM in deaf signers, but not for the groups of hearing individuals. This activation is independent of the linguistic content of the stimuli, being observed in both WM conditions: signs and objects. These results suggest a cognitive role for STC in deaf signers, beyond sign language processing.

National Category
General Language Studies and Linguistics
Identifiers
urn:nbn:se:liu:diva-123240 (URN)
Conference
Conference on Cognitive Hearing Science for Communication (CHCCOM2015), Linköping, June 14-17, 2015
Available from: 2015-12-09 Created: 2015-12-09 Last updated: 2018-01-10
Cardin, V., Rudner, M., Ferraz de Oliveira, R., Su, M., Andin, J., Beese, L., . . . Rönnberg, J. (2015). Does the superior temporal cortex have a role in cognitive control as a consequence of cross-modal reorganization? Paper presented at the Seventh Annual Meeting of the Society for the Neurobiology of Language, Chicago, Illinois, October 15-17, 2015.
2015 (English). Conference paper, oral presentation with published abstract (Refereed).
National Category
Basic Medicine
Identifiers
urn:nbn:se:liu:diva-123289 (URN)
Conference
Seventh Annual Meeting of the Society for the Neurobiology of Language, Chicago, Illinois, October 15-17, 2015
Available from: 2015-12-09 Created: 2015-12-09 Last updated: 2018-01-10