Modern language models refute Chomsky’s approach to language
Steven Piantadosi
November 2023
 

The rise and success of large language models undermines virtually every strong claim for the innateness of language that has been proposed by generative linguistics. Modern machine learning has subverted and bypassed the entire theoretical framework of Chomsky's approach, including its core claims to particular insights, principles, structures, and processes. I describe the sense in which modern language models implement genuine theories of language, including representations of syntactic and semantic structure. I highlight the relationship between contemporary models and prior approaches in linguistics, namely those based on gradient computations and memorized constructions. I also respond to several critiques of large language models, including claims that they can't answer "why" questions, and skepticism that they are informative about real life acquisition. Most notably, large language models have attained remarkable success at discovering grammar without using any of the methods that some in linguistics insisted were necessary for a science of language to progress. (UPDATED: With a postscript on replies to the original draft)
Format: [ pdf ]
Reference: lingbuzz/007180
(please use that when you cite this article)
Published in:
keywords: large language model, minimalism, chomsky, generative syntax, emergent, computational modeling, statistical learning, cognitive science, syntax
previous versions: v6 [October 2023]
v5 [September 2023]
v4 [March 2023]
v3 [March 2023]
v2 [March 2023]
v1 [March 2023]
Downloaded: 25128 times

 
