Entity Tracking in Language Models

Najoung Kim, Sebastian Schuster


Abstract
Keeping track of how the states of entities change as a text or dialog unfolds is a key prerequisite to discourse understanding. Yet, there have been few systematic investigations into the ability of large language models (LLMs) to track discourse entities. In this work, we present a task probing to what extent a language model can infer the final state of an entity given an English description of the initial state and a series of state-changing operations. We use this task to first investigate whether Flan-T5, GPT-3, and GPT-3.5 can track the state of entities, and find that only GPT-3.5 models, which have been pretrained on large amounts of code, exhibit this ability. We then investigate whether smaller models pretrained primarily on text can learn to track entities, by finetuning T5 on several training/evaluation splits. While performance degrades on more complex splits, we find that even when evaluated on a different set of entities from training or on longer operation sequences, a finetuned model can perform non-trivial entity tracking. Taken together, these results suggest that language models can learn to track entities, but that pretraining on text corpora alone does not make this capacity surface.
Anthology ID:
2023.acl-long.213
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3835–3855
URL:
https://aclanthology.org/2023.acl-long.213
DOI:
10.18653/v1/2023.acl-long.213
Award:
Area Chair Award (Interpretability and Analysis of Models for NLP)
Cite (ACL):
Najoung Kim and Sebastian Schuster. 2023. Entity Tracking in Language Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3835–3855, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Entity Tracking in Language Models (Kim & Schuster, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.213.pdf
Video:
https://aclanthology.org/2023.acl-long.213.mp4