iamkissg
  • PaperHighlights
  • 2019
  • 2018
    • 11
    • 6
    • 5
      • Text Understanding with the Attention Sum Reader Network
      • Effective Approaches to Attention-based Neural Machine Translation
      • Distance-based Self-Attention Network for Natural Language Inference
      • Deep Residual Learning for Image Recognition
      • U-Net: Convolutional Networks for Biomedical Image Segmentation
      • Memory Networks
      • Neural Machine Translation by Jointly Learning to Align and Translate
      • Convolutional Sequence to Sequence Learning
      • An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
      • Graph Attention Networks
      • Attention is All You Need
      • DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding
      • A Structured Self-attentive Sentence Embedding
      • Hierarchical Attention Networks for Document Classification
      • Grammar as a Foreign Language
      • Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
      • Transforming Auto-encoders
      • Self-Attention with Relative Position Representations
    • 1
  • 2017
  • Paper Title as Note Title

Last updated 5 years ago
