50 years after the perceptron, 25 years after PDP: Neural computation in language sciences

Julien Mayor, Pablo Gomez, Franklin Chang, Gary Lupyan

This Research Topic aims to showcase the state of the art in language research while celebrating the 25th anniversary of the tremendously influential work of the PDP group and the 50th anniversary of the perceptron. Although PDP models are often the gold standard against which new models are compared, the scope of this Research Topic is not constrained to connectionist models. Instead, we aimed to create a landmark forum in which experts in the field define the state of the art and future directions of the psychological processes underlying language learning and use, broadly defined. We thus called for papers involving computational modeling and original research, as well as technical, philosophical, or historical discussions pertaining to models of cognition. We especially encouraged submissions aimed at contrasting different computational frameworks and examining their relationship to imaging and behavioral data.
Contents
legacy and future challenges
Recurrent temporal networks and language acquisition: from corticostriatal neurophysiology to reservoir computing
investigating the continuum from catastrophic forgetting to age-limited learning effects
An amodal shared resource model of language-mediated visual attention
a historical and tutorial review
a tutorial overview
Spoken word recognition without a TRACE
Deep generative learning of location-invariant visual word recognition
A computational model to investigate assumptions in the head-turn preference procedure
Experience and generalization in a connectionist model of Mandarin Chinese relative clause processing
Self-organizing map models of language acquisition
a connectionist developmental approach to verbal analogies
learning nouns over developmental time in atypical populations and individuals