I read papers, and here are my highlights.
Learning Word Meta-Embeddings
Frustratingly Easy Meta-Embedding – Computing Meta-Embeddings by Averaging Source Word Embeddings
Teaching Machines to Read and Comprehend
Attention-over-Attention Neural Networks for Reading Comprehension
Consensus Attention-based Neural Networks for Chinese Reading Comprehension
Convolutional Neural Networks for Sentence Classification
Deep contextualized word representations
A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification
Attention Is All You Need
Neural Machine Translation by Jointly Learning to Align and Translate
U-Net: Convolutional Networks for Biomedical Image Segmentation
Transforming Auto-encoders
Text Understanding with the Attention Sum Reader Network
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
Self-Attention with Relative Position Representations
Deep Residual Learning for Image Recognition
Memory Networks
Hierarchical Attention Networks for Document Classification
Graph Attention Networks
Grammar as a Foreign Language
Effective Approaches to Attention-based Neural Machine Translation
Distance-based Self-Attention Network for Natural Language Inference
Convolutional Sequence to Sequence Learning
A Structured Self-attentive Sentence Embedding
DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding