One of the truisms repeated by cognitivists and proponents of the physical symbol system hypothesis is that a natural system, like a neural network, cannot learn a language without prior encoding. This is why people like Chomsky and Fodor assert that we have innate linguistic structures encoded at birth, and that (therefore) learning is a matter of rule formation and the construction of models and representations. I have never believed this. Gradually, over time, the evidence has been piling up in the opposite direction. Specifically, we are learning that very simple neural networks can do very complex things, like learn languages. This journal article is a case in point. It describes a system "made up of two million interconnected artificial neurons, able to learn to communicate using human language starting from a state of 'tabula rasa', only through communication with a human interlocutor."
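To make the "tabula rasa" idea concrete, here is a minimal sketch in Python (my illustration, not the system from the article): a character-level predictor that begins with random weights, encodes no grammar or rules of any kind, and is shaped entirely by the text it is exposed to. The toy corpus is a hypothetical stand-in for an interlocutor's utterances.

```python
import numpy as np

# Hypothetical stand-in for a human interlocutor's utterances.
corpus = "the cat sat on the mat. the dog sat on the rug. "
chars = sorted(set(corpus))
stoi = {c: i for i, c in enumerate(chars)}
V = len(chars)

rng = np.random.default_rng(0)
# "Tabula rasa": random weights, no linguistic structure encoded at the start.
W = rng.normal(0, 0.01, size=(V, V))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Train: each character predicts the next; the weights change only
# through exposure to the input, via the cross-entropy gradient.
lr = 0.5
for epoch in range(200):
    for a, b in zip(corpus, corpus[1:]):
        i, j = stoi[a], stoi[b]
        p = softmax(W[i])
        grad = p.copy()
        grad[j] -= 1.0  # gradient of cross-entropy with one-hot target
        W[i] -= lr * grad

# Sample: the learned transitions now mirror the statistics of the input.
idx = stoi["t"]
out = "t"
for _ in range(40):
    idx = rng.choice(V, p=softmax(W[idx]))
    out += chars[idx]
print(out)
```

The point of the toy is only that structure emerges from exposure rather than from encoded rules; the article's two-million-neuron system is, on its own description, this same idea scaled up to interactive communication.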