
Local AI Papers - Stockholm

Explain the concept paragraph here

Attention

Multi-Head Attention

by Vaswani et al. in Attention Is All You Need (2017)

Multi-head attention is a module that runs an attention mechanism several times in parallel. The independent attention outputs are then concatenated and linearly projected back to the expected dimension. Intuitively, multiple attention heads allow the model to attend to parts of the sequence in different ways (e.g., longer-term versus shorter-term dependencies).
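As a rough illustration of the description above, here is a minimal NumPy sketch of multi-head self-attention, assuming a single unmasked layer with randomly initialised projection matrices standing in for learned parameters; the function names, shapes, and head count are illustrative and not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d_head)
    d_head = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_head)    # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)    # one attention distribution per query
    return weights @ v                    # (seq_len, d_head)

def multi_head_attention(x, num_heads, rng):
    # x: (seq_len, d_model); each head uses its own Q/K/V projections.
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    head_outputs = []
    for _ in range(num_heads):
        # Random stand-ins for the learned per-head projection matrices.
        w_q = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        w_k = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        w_v = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        head_outputs.append(
            scaled_dot_product_attention(x @ w_q, x @ w_k, x @ w_v)
        )

    # Concatenate the independent heads and project back to d_model.
    concat = np.concatenate(head_outputs, axis=-1)    # (seq_len, d_model)
    w_o = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
    return concat @ w_o

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 16))                     # 5 tokens, d_model = 16
out = multi_head_attention(tokens, num_heads=4, rng=rng)
print(out.shape)                                      # (5, 16)
```

Each head attends over the full sequence independently, which is what lets different heads specialise in different dependency patterns before the final output projection mixes them back together.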

Semi-Supervised Learning

Natural Language Processing
