Abstract: Biological sequences can be viewed as text and therefore analyzed using natural language processing techniques. In this presentation, I will discuss text generation via the attention mechanism that underlies large language models such as ChatGPT, and compare it with more traditional Markov chain and LSTM techniques. I will focus on deriving and coding the attention mechanism from scratch and explaining its principles.
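As a minimal illustration of the attention mechanism mentioned in the abstract (this is my own sketch, not the presenter's code; it assumes NumPy and toy dimensions), scaled dot-product self-attention can be written as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)  # each query's weights sum to 1
    return weights @ V                  # weighted average of the values

# Toy example: 3 tokens with embedding dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)  # self-attention: Q = K = V = X
print(out.shape)          # (3, 4)
```

Each output row is a convex combination of the value rows, with weights determined by query-key similarity; in a full transformer, Q, K, and V would be learned linear projections of X.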
Presentation: TBA (after session)