iamkissg
2018

6

• Universal Language Model Fine-tuning for Text Classification
• Semi-supervised sequence tagging with bidirectional language models
• Consensus Attention-based Neural Networks for Chinese Reading Comprehension
• Attention-over-Attention Neural Networks for Reading Comprehension
• Baseline Needs More Love: On Simple Word-Embedding-Based Models and Associated Pooling Mechanisms
• Convolutional Neural Networks for Sentence Classification
• Deep contextualized word representations
• Neural Architectures for Named Entity Recognition
• Improving Language Understanding by Generative Pre-Training
• A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification
• Teaching Machines to Read and Comprehend

Last updated 5 years ago
