Abstract
A Hebbian-inspired competitive network is presented that learns to predict the typical semantic features of denoting terms in simple and moderately complex sentences. The network also learns to predict the appearance of syntactically key words, such as prepositions and relative pronouns. Importantly, a strong form of syntactic systematicity emerges as a by-product of the network's semantic training, and it is exhibited even at a novel, deeper level of clausal embedding. All network training is unsupervised with respect to error feedback. A novel variant of competitive learning and an unusual hierarchical architecture are presented. The relationship of this work to issues raised by Marcus and by Phillips is explored.