# attention
Attention, Self-Attention, Multi-Head Attention
This document introduces the attention mechanism and its variants (self-attention and multi-head attention), along with some related papers.
PAYING MORE ATTENTION TO ATTENTION.pdf
Attention plays a critical role in human visual experience.
attention code
An attention model, mainly used in seq2seq models for text: it generates a weight for each word in the input according to that word's importance. Used for generating summaries and sentence sequences.
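The weighting step described above can be sketched in a few lines of plain NumPy. This is a generic illustration with dot-product scoring, not code from the listed resource (real seq2seq models often use learned additive or bilinear scoring instead):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_attention(decoder_state, encoder_states):
    """Weight each encoder state by its relevance to the decoder state.

    decoder_state: (d,)   encoder_states: (T, d)
    Returns the weights over the T source words and the context vector.
    """
    scores = encoder_states @ decoder_state   # (T,) relevance scores
    weights = softmax(scores)                 # importance of each word
    context = weights @ encoder_states        # (d,) weighted summary
    return weights, context

# toy example: 3 source words, 4-dim hidden states
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
s = rng.normal(size=4)
w, c = dot_attention(s, H)
print(w.sum())  # ≈ 1.0
```

The decoder would then use the context vector `c` when predicting the next output token.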
Attention Flows Analyzing and Comparing Attention Mechanisms in Language Models
triplet_attention
Official PyTorch implementation for "Rotate to Attend: Convolutional Triplet Attention Module"
channel_attention
Gluon implementation of channel-attention modules: SE, ECA, GCT
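As a reference point for what such a channel-attention module computes, here is a minimal NumPy sketch of an SE (squeeze-and-excitation) block; the weight shapes and reduction ratio are illustrative assumptions, not taken from the listed Gluon code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, w2):
    """Squeeze-and-Excitation channel attention on a (C, H, W) feature map.

    w1: (C//r, C) and w2: (C, C//r) are the two FC layers of the
    bottleneck (r is the reduction ratio; both weights are hypothetical).
    """
    z = x.mean(axis=(1, 2))           # squeeze: global average pool -> (C,)
    a = np.maximum(w1 @ z, 0.0)       # excitation: FC + ReLU
    s = sigmoid(w2 @ a)               # per-channel gate in (0, 1)
    return x * s[:, None, None]       # rescale each channel

# toy feature map: 8 channels, 4x4 spatial, reduction ratio r = 4
rng = np.random.default_rng(1)
x = rng.normal(size=(8, 4, 4))
w1 = rng.normal(size=(2, 8))
w2 = rng.normal(size=(8, 2))
y = se_block(x, w1, w2)
```

ECA and GCT differ mainly in how the per-channel gate is produced; the squeeze-then-rescale pattern is the same.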
Attention.zip
Slides from Hung-yi Lee's machine learning course covering the Transformer model, the attention model, and sequence models; the slides lay out the ideas very clearly.
Reasoning, Attention and Memory
A talk on reasoning, attention and memory in deep learning.
Sumit Chopra
Facebook
structural attention source code
Structural attention: a novel neural network architecture designed to model structural rules explicitly, rather than leaving self-attention to discover them as in the Transformer; this can improve extrapolation to sequences of different lengths.
attention explained (.ppt)
A survey of the attention mechanism, mainly covering the encoder, the decoder, and the Transformer.
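The Transformer building block that several of the resources above revolve around can be sketched as single-head scaled dot-product self-attention. This is a generic NumPy illustration under assumed shapes, not code from any of the listed materials:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention (the Transformer core).

    X: (T, d_model); Wq, Wk, Wv: (d_model, d_k) projection matrices.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (T, T) token similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)             # row-wise softmax
    return w @ V                                   # each output mixes all tokens

# toy sequence: 5 tokens, model dim 8, head dim 4
rng = np.random.default_rng(2)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
```

Multi-head attention runs several such heads in parallel with separate projections and concatenates their outputs.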