iamkissg
PaperHighlights
2019

03

• Not All Contexts Are Created Equal: Better Word Representations with Variable Attention
• Learning Context-Sensitive Word Embeddings with Neural Tensor Skip-Gram Model
• Approximating CNNs with Bag-of-local-Features models works surprisingly well on ImageNet
• pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference
• Contextual Word Representations: A Contextual Introduction
• Not All Neural Embeddings are Born Equal
• High-risk learning: acquiring new word vectors from tiny data
• Learning word embeddings from dictionary definitions only
• Dependency-Based Word Embeddings
