This repository contains the slides for the "Introduction to LLMs" talk. The goal of the talk was to explain the meaning behind the attention equation.
In addition to the slides, I prepared a video explaining the attention mechanism: https://youtu.be/vsN7lxIzrxY?si=0lWvmYlD64taj002
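
For reference, and assuming the talk covers the standard scaled dot-product formulation from "Attention Is All You Need", the equation is Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal NumPy sketch of that equation; the function name and the toy inputs are my own illustration, not code taken from the slides.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Illustrative sketch: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                            # attention-weighted sum of values

# Toy example (hypothetical data): 3 tokens with 4-dimensional vectors
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V))  # shape (3, 4): one output vector per token
```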