Papers I Read - Notes and Summaries

Learning to Compute Word Embeddings On the Fly

Introduction

  • Word-based language models suffer from the problem of rare or out-of-vocabulary (OOV) words.

    ... Continue reading

R-NET - Machine Reading Comprehension with Self-matching Networks

Introduction

  • R-NET is an end-to-end trained neural network model for machine comprehension.

  • It starts by... Continue reading


ReasoNet - Learning to Stop Reading in Machine Comprehension

Introduction

  • In the domain of machine comprehension, making multiple passes over the given document is an effective... Continue reading


Principled Detection of Out-of-Distribution Examples in Neural Networks

Problem Statement

  • Given a pre-trained neural network, which is trained using data from some distribution P (referred... Continue reading


Ask Me Anything - Dynamic Memory Networks for Natural Language Processing

Introduction

  • The Dynamic Memory Network (DMN) is a neural-network-based general framework that can be used for... Continue reading


One Model To Learn Them All

  • The current trend in deep learning is to design, train, and fine-tune a separate model for each... Continue reading


Two/Too Simple Adaptations of Word2Vec for Syntax Problems

  • The paper proposes two variants of the Word2Vec model so that it may account for syntactic properties of words and... Continue reading

A Decomposable Attention Model for Natural Language Inference

Introduction

  • The paper proposes an attention-based mechanism to decompose the problem of Natural Language Inference (NLI) into... Continue reading

A Fast and Accurate Dependency Parser using Neural Networks

Introduction

  • The paper proposes a neural network classifier to perform transition-based dependency parsing using dense vector representations for... Continue reading

Neural Module Networks

Introduction