Unless one is prepared to argue that existing, 'classical' formal language and automata theory, together with the natural language linguistics built on them, are fundamentally mistaken about the nature of language, any viable connectionist natural language processing (NLP) model will have to be characterizable, at least approximately, by some generative grammar or by an automaton of the corresponding class. An obvious way of ensuring that a connectionist NLP device is so characterizable is to specify it in classical terms and then to implement it in an artificial neural network, and that is what this paper does. It adopts the deterministic pushdown transducer (DPDT) as an adequate formal model for general NLP and shows how a simple recurrent network (SRN) can be trained to implement a finite state transducer (FST) which simulates the DPDT. A computer simulation of a parser for a small fragment of English is used to study the properties of the model. The conclusion is that such an SRN implementation results in a device which is broadly consistent with its classical specification, but which also has emergent properties relative to that specification which are desirable in an NLP device.
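To make the architecture described in the abstract concrete, the following is a minimal sketch of an Elman-style simple recurrent network used as a sequence transducer, the kind of device that can be trained to approximate a finite state transducer. All layer sizes, symbols, and the forward-pass details are illustrative assumptions, not taken from the paper itself.

```python
# Illustrative sketch only: a minimal Elman-style simple recurrent network (SRN)
# mapping an input symbol sequence to an output symbol sequence, in the spirit
# of training an SRN to behave as a finite state transducer. All dimensions and
# weights here are hypothetical placeholders, not the paper's trained model.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 4          # hypothetical vocabulary and layer sizes
W_xh = rng.normal(0, 0.1, (n_hidden, n_in))
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))
W_hy = rng.normal(0, 0.1, (n_out, n_hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run_srn(inputs):
    """Feed a sequence of one-hot input vectors through the SRN.

    The context (previous hidden state) is fed back at every step, which is
    what lets the network approximate finite-state behaviour over sequences.
    """
    h = np.zeros(n_hidden)               # context layer starts at zero
    outputs = []
    for x in inputs:
        h = sigmoid(W_xh @ x + W_hh @ h) # new hidden state from input + context
        y = sigmoid(W_hy @ h)            # output symbol activations
        outputs.append(y)
    return outputs

# Usage: transduce a toy three-symbol input sequence (one-hot encoded).
sequence = [np.eye(n_in)[i] for i in (0, 2, 1)]
for step, y in enumerate(run_srn(sequence)):
    print(f"step {step}: predicted output symbol = {int(np.argmax(y))}")
```

In the paper's setting the weights would be obtained by training on input/output pairs generated from the DPDT specification; the sketch above shows only the recurrent forward pass that gives the network its finite-state character.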
Author(s): Moisl HL
Publication type: Article
Publication status: Published
Journal: Connection Science
Year: 1992
Volume: 4
Issue: 2
Pages: 67-91
ISSN (print): 0954-0091
ISSN (electronic): 1360-0494
Publisher: Taylor & Francis Ltd.
URL: http://dx.doi.org/10.1080/09540099208946606
DOI: 10.1080/09540099208946606