Modern language models refute Chomsky’s approach to language
Steven Piantadosi
July 2024
 

Modern machine learning has subverted and bypassed the theoretical framework of Chomsky’s generative approach to linguistics, including its core claims to particular insights, principles, structures, and processes. I describe the sense in which modern language models implement genuine theories of language, and I highlight the links between these models and approaches to linguistics that are based on gradient computations and memorized constructions. I also describe why these models undermine strong claims for the innateness of language, and I respond to several critiques of large language models, including arguments that they can’t answer “why” questions and skepticism that they are informative about real-life acquisition. Most notably, large language models have attained remarkable success at discovering grammar without using any of the methods that some in linguistics insisted were necessary for a science of language to progress. (UPDATED: With a postscript on replies to the original draft)
Format: [ pdf ]
Reference: lingbuzz/007180
(please use that when you cite this article)
keywords: large language model, minimalism, chomsky, generative syntax, emergent, computational modeling, statistical learning, cognitive science, syntax
previous versions: v7 [November 2023]
v6 [October 2023]
v5 [September 2023]
v4 [March 2023]
v3 [March 2023]
v2 [March 2023]
v1 [March 2023]
Downloaded: 29685 times

 
