AAAI Conference 2025 System Paper
TRANSFORMER EXPLAINER: Interactive Learning of Text-Generative Models
- Aeree Cho
- Grace C. Kim
- Alexander Karpekov
- Alec Helbling
- Zijie J. Wang
- Seongmin Lee
- Benjamin Hoover
- Duen Horng (Polo) Chau
Transformers have revolutionized machine learning, yet their inner workings remain opaque to many. We present TRANSFORMER EXPLAINER, an interactive visualization tool designed for non-experts to learn about Transformers through the GPT-2 model. Our tool helps users understand complex Transformer concepts by integrating a model overview and smooth transitions across abstraction levels of math operations and model structures. It runs a live GPT-2 model locally in the user’s browser, empowering users to experiment with their own input and observe in real-time how the internal components and parameters of the Transformer work together to predict the next tokens. 125,000 users have used our open-source tool at https://poloclub.github.io/transformer-explainer/.
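To make the abstract's final step concrete, the sketch below shows how a language model like GPT-2 turns logits into next-token probabilities via a temperature-scaled softmax, the operation a user can watch and adjust in the tool. This is an illustrative sketch only, not code from the system; the vocabulary and logit values are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature scales logits before normalization: lower temperature
    # sharpens the distribution, higher temperature flattens it.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for four candidate next tokens.
vocab = ["learners", "experts", "users", "models"]
logits = [2.0, 1.0, 3.0, 0.5]

probs = softmax(logits, temperature=1.0)
best = vocab[probs.index(max(probs))]  # greedy choice of next token
```

Lowering the temperature concentrates probability mass on the top-scoring token, while raising it spreads mass more evenly, which is why the tool exposes temperature as an interactive control.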