This paper examines the structural parallels between transformer-based attention mechanisms and the cognitive processes underlying poetic attention. We argue that multi-head attention offers a computational metaphor for how poets simultaneously attend to sound, meaning, and form. Through analysis of both neural network architectures and close readings of contemporary poetry, we propose a framework for understanding creative attention as a form of weighted relevance.
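For readers unfamiliar with the underlying mechanism, a minimal sketch of scaled dot-product attention illustrates what "weighted relevance" means computationally: each query is compared against all keys, and the resulting softmax weights determine how much each value contributes to the output. The function and variable names here are illustrative, not drawn from any particular implementation.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: each query scores all keys,
    and the softmax of those scores weights a blend of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise relevance scores
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # rows sum to 1
    return weights @ V, weights

# Toy example: three "tokens" with 4-dimensional embeddings,
# using the same matrix as queries, keys, and values (self-attention).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
output, attn_weights = attention(x, x, x)
```

Each row of `attn_weights` is a probability distribution over the input tokens, which is the sense in which attention assigns graded relevance rather than a single hard selection.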