The Antecedents of Transformer Models
Current Directions in Psychological Science (IF 7.4) | Pub Date: 2024-11-18 | DOI: 10.1177/09637214241279504 | Simon Dennis, Kevin Shabahang, Hyungwook Yim
Transformer models of language represent a step change in our ability to account for cognitive phenomena. Although the specific architecture that has garnered recent interest is quite young, many of its components have antecedents in the cognitive science literature. In this article, we begin with an introduction to large language models aimed at a general psychological audience. We then highlight some of those antecedents, including the importance of scale, instance-based memory models, paradigmatic association and systematicity, positional encodings of serial order, and the learning of control processes. By tracing the relationship between transformer models and their precursors, the article shows how these models can be understood as the next phase in our understanding of cognitive processes.
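To make two of the antecedents named in the abstract concrete, the following is a minimal sketch, not drawn from the paper itself: it assumes standard scaled dot-product attention and the sinusoidal positional encoding of Vaswani et al. (2017), and shows how attention can be read as similarity-weighted retrieval over stored instances (the parallel to instance-based memory models), and how serial order can be carried by positional encodings. All function names and parameters are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_as_instance_retrieval(query, keys, values):
    """Blend stored values, weighting each stored instance (key-value
    pair) by its similarity to the query. This similarity-weighted
    averaging is the core of scaled dot-product attention and mirrors
    the graded retrieval of instance-based memory models."""
    d = keys.shape[-1]
    sims = query @ keys.T / np.sqrt(d)   # similarity of query to each stored instance
    weights = softmax(sims)              # soft, graded retrieval over instances
    return weights @ values              # similarity-weighted blend of stored values

def sinusoidal_positions(n_positions, d_model):
    """Sinusoidal positional encoding: each serial position receives a
    unique, fixed pattern of sines and cosines (d_model assumed even)."""
    pos = np.arange(n_positions)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d_model))
    enc = np.zeros((n_positions, d_model))
    enc[:, 0::2] = np.sin(angles)
    enc[:, 1::2] = np.cos(angles)
    return enc

# Toy example: three stored "episodes" and a probe resembling the second one.
rng = np.random.default_rng(0)
keys = rng.normal(size=(3, 8))
values = rng.normal(size=(3, 8))
query = keys[1] + 0.1 * rng.normal(size=8)
print(attention_as_instance_retrieval(query, keys, values))
print(sinusoidal_positions(5, 8).shape)  # (5, 8): one encoding per position
```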
Updated: 2024-11-18