ZurichNLP #17
Tue 17 Jun
Zürich
Aleksander Ficek (NVIDIA) on synthetic generators & verifiers for coding and Matteo Saponati (ETH Zurich) on the structures of self-attention beyond keys and queries.


Time & Location
17 Jun 2025, 18:00 – 20:00
Zürich, OAT ETH Zurich (14th floor), Andreasstrasse 5, 8050 Zürich, Switzerland
About the Event
Aleksander Ficek (NVIDIA): Synthetic Generators and Verifiers for Coding (abstract TBD)

Matteo Saponati (ETH Zurich): Structure of Self-Attention Beyond Queries and Keys
Self-attention is essential to Transformer architectures, yet how information is embedded in the self-attention matrices, and how different objective functions impact this process, remain unclear. We present a mathematical framework to analyze self-attention matrices by deriving the structures governing their weight updates. Using this framework, we demonstrate that bidirectional training induces symmetry in the weight matrices, while autoregressive training results in directionality and column dominance. Our theoretical findings are validated across multiple Transformer models - including ModernBERT, GPT, LLaMA3, and Mistral - and input modalities such as text, vision, and audio. Finally, we apply these insights by showing that symmetric initialization improves the performance of encoder-only models on language tasks. This mathematical analysis offers a novel theoretical perspective on how information is embedded through self-attention, thereby improving the interpretability of Transformer…
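The symmetry property described in the abstract can be illustrated with a short sketch. The talk's actual analysis derives this structure from training dynamics; the snippet below is only a hedged illustration of one plausible way to quantify it: decompose a combined query-key matrix M = W_Q W_K^T into its symmetric and antisymmetric parts and report the fraction of Frobenius energy in the symmetric part (the function name and metric here are illustrative, not taken from the talk).

```python
import numpy as np

def symmetry_score(M):
    """Fraction of M's Frobenius energy in its symmetric part.
    1.0 means perfectly symmetric; random unstructured matrices
    land near 0.5, since symmetric and antisymmetric parts are
    orthogonal and carry roughly equal energy."""
    sym = 0.5 * (M + M.T)
    return np.linalg.norm(sym) ** 2 / np.linalg.norm(M) ** 2

# Illustrative stand-ins for trained query/key weights.
rng = np.random.default_rng(0)
d = 64
W_Q = rng.standard_normal((d, d)) / np.sqrt(d)
W_K = rng.standard_normal((d, d)) / np.sqrt(d)

M = W_Q @ W_K.T                 # combined self-attention matrix
print(symmetry_score(M))        # near 0.5 for random weights
print(symmetry_score(M + M.T))  # exactly 1.0: the sum is symmetric
```

For real checkpoints one would substitute the actual per-head W_Q and W_K; the claim in the abstract is that bidirectionally trained models score notably higher on such a symmetry measure than autoregressive ones.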