
Abstract: Biological sequences can be viewed as text and therefore analyzed with natural language processing techniques. In this presentation, I will discuss text generation via the attention mechanism that underlies large language models such as ChatGPT, and compare it with the more traditional Markov chain and LSTM techniques. I will concentrate on deriving and coding the attention mechanism from scratch and explaining its principles.
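The core of the attention mechanism discussed above can be sketched in a few lines of NumPy. This is a minimal, illustrative sketch of scaled dot-product attention, not the code from the attached notebook; all names and shapes here are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each row is a probability distribution
    return weights @ V                  # weighted sum of value vectors

# Toy example: 4 query tokens attending over 6 key/value tokens, dim 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8): one output vector per query token
```

Each output row is a mixture of the value vectors, weighted by how strongly the corresponding query matches each key; the 1/sqrt(d_k) scaling keeps the softmax from saturating as the dimension grows.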

Presentation: TBA (after the session). Please download the files and open them on your computer; the Confluence wiki does not render HTML files well.

View file: Attention.ipynb
View file: Attention.html