Article published in:
Methodological and Analytic Frontiers in Lexical Research (Part I). Edited by Gonia Jarema, Gary Libben and Chris Westbury
[The Mental Lexicon 5:3] 2010
pp. 401–420
Towards a localist-connectionist model of word translation
Ton Dijkstra | Donders Institute for Brain, Cognition, and Behaviour
Steven Rekké | Radboud University Nijmegen
Word translation is among the most sophisticated skills that bilinguals can perform. Brysbaert and Duyck (2010) have argued that the Revised Hierarchical Model (RHM; Kroll & Stewart, 1994), a verbal model for word translation in beginning and proficient bilinguals, should be abandoned in favor of connectionist models such as the Bilingual Interactive Activation Plus model (BIA+; Dijkstra & Van Heuven, 2002). However, the partially implemented BIA+ model for bilingual word recognition has neither been applied to bilinguals of different proficiency levels nor extended to the complex process of word translation. After considering a number of aspects of the RHM, a new localist-connectionist model, called Multilink, is formulated to account for the performance of bilinguals differing in their L2 proficiency in different tasks: lexical decision, language decision, and word translation.
Keywords: bilingual, word recognition, word translation, computational models, semantic priming, cognate processing
Published online: 17 February 2011
https://doi.org/10.1075/ml.5.3.08dij
Cited by 11 other publications
Chaouch-Orozco, Adel, Jorge González Alonso & Jason Rothman
Ferrer-Xipell, Roser
Jouravlev, Olessia & Debra Jared
Kootstra, Gerrit Jan, Ton Dijkstra & Marianne Starren
Marull, Crystal H.
Mulder, Kimberley, Ton Dijkstra, Robert Schreuder & Harald R. Baayen
Tulkens, Stéphan, Dominiek Sandra & Walter Daelemans
Vanlangendonck, Flora, David Peeters, Shirley-Ann Rueschemeyer & Ton Dijkstra
Wen, Yun & Walter J. B. van Heuven
This list is based on CrossRef data as of 7 February 2021. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.