Slides and video explaining the attention mechanism in Transformers

Notifications You must be signed in to change notification settings

weissfennek/LLM

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

4 Commits
 
 
 
 
 
 

Repository files navigation

LLM

In this repository, you can find the slides for the talk "Introduction to LLMs". The goal of the talk was to explain the meaning behind the attention equation.

In addition to the slides, I prepared a video explaining the attention mechanism: https://youtu.be/vsN7lxIzrxY?si=0lWvmYlD64taj002
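The attention equation the talk unpacks, softmax(QKᵀ/√d_k)·V, can be sketched in a few lines of NumPy. This is a minimal illustration for readers skimming the repo, not code taken from the slides or video:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted average of value rows

# toy example: 3 tokens, head dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Because the softmax weights in each row sum to 1, every output token is a convex combination of the value vectors, which is the intuition the slides build on.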
