Deficient Executive Control in Transformer Attention

preprint
posted on 2025-03-07, 02:10 authored by Suketu Chandrakant Patel, Hongbin Wang, Jin Fan

Although artificial intelligence (AI) has advanced rapidly with transformer-based large language models (LLMs), whose attention mechanisms were inspired by human attention, it remains unclear whether artificial attention implements fundamental human attentional functions. Using the classic color Stroop task, we found that state-of-the-art LLMs failed to perform this task as the word-list length increased. This study highlights that executive control is lacking in the artificial attention architecture, which may limit AI's ability to acquire the adaptive behavior essential for coping with conflict.
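To illustrate the paradigm, the sketch below (not the authors' code; the function names and prompt wording are illustrative assumptions) shows how a text-based color Stroop stimulus list for an LLM might be constructed: each item is a color word paired with an incongruent "ink" color, and the model would be asked to report the ink colors rather than read the words.

```python
import random

# Candidate color terms for the Stroop stimuli.
COLORS = ["red", "green", "blue", "yellow"]

def make_stroop_list(length, seed=0):
    """Return (words, inks) where every word/ink pair is incongruent."""
    rng = random.Random(seed)
    words, inks = [], []
    for _ in range(length):
        word = rng.choice(COLORS)
        # Force incongruence: the ink color never matches the word.
        ink = rng.choice([c for c in COLORS if c != word])
        words.append(word)
        inks.append(ink)
    return words, inks

def make_prompt(words, inks):
    # Since an LLM sees tokens rather than pixels, the "ink" color
    # is conveyed in text form alongside each word.
    items = ", ".join(f'"{w}" written in {i}' for w, i in zip(words, inks))
    return ("For each item below, respond with the ink color, "
            "not the word itself: " + items)

words, inks = make_stroop_list(8)
prompt = make_prompt(words, inks)
```

The conflict the task induces is between the automatic response (reading the word) and the instructed one (naming the ink color); the abstract's finding is that this conflict overwhelms LLMs as the list grows longer.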