Papers I Read: Notes and Summaries

Kronecker Recurrent Units

Introduction

  • Recurrent Neural Networks have two key issues:


Learning Independent Causal Mechanisms

Introduction

  • The paper presents an interesting approach for learning independent (inverse) data transformation from a set... Continue reading


Memory-based Parameter Adaptation

Introduction

  • Standard Deep Learning networks are not suitable for the continual learning setting, as the change in the... Continue reading


Born Again Neural Networks

Introduction

  • The paper explores knowledge distillation (KD) from the perspective of transferring knowledge between two networks of... Continue reading


Net2Net: Accelerating Learning via Knowledge Transfer

Notes

  • The paper presents a simple yet effective approach for transferring knowledge from a trained neural network... Continue reading
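The transfer described above is built on function-preserving transformations such as Net2WiderNet, where new units copy randomly chosen existing units and outgoing weights are rescaled so the network's output is unchanged. A minimal sketch for a pair of fully connected weight matrices (names and shapes here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def net2wider(W1, W2, new_width, rng=np.random.default_rng(0)):
    """Widen the hidden layer between W1 (in x h) and W2 (h x out).

    Each new unit is a copy of a randomly chosen existing unit; the
    outgoing weights of every replicated unit are divided by its
    replication count, so the overall function is preserved.
    """
    old_width = W1.shape[1]
    # First old_width entries map to themselves; extras are random copies.
    mapping = np.concatenate([
        np.arange(old_width),
        rng.integers(0, old_width, new_width - old_width),
    ])
    counts = np.bincount(mapping, minlength=old_width)
    W1_new = W1[:, mapping]                              # duplicate incoming weights
    W2_new = W2[mapping, :] / counts[mapping][:, None]   # rescale outgoing weights
    return W1_new, W2_new
```

Because duplicated units produce identical activations, the rescaling makes the widened network compute exactly the same function as the original, which is what lets training continue from the transferred weights.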


Learning to Count Objects in Natural Images for Visual Question Answering

Introduction

  • Most visual question-answering (VQA) models perform poorly on the task of counting objects in an... Continue reading


Neural Message Passing for Quantum Chemistry

Introduction

  • The paper presents a general message-passing architecture called Message Passing Neural Networks (MPNNs) that... Continue reading


Unsupervised Learning by Predicting Noise

Introduction

  • Convolutional Neural Networks are extremely good feature extractors in the sense that features extracted for one... Continue reading


The Lottery Ticket Hypothesis - Training Pruned Neural Networks

Introduction

  • Empirical evidence indicates that, at training time, neural networks need to be of significantly larger... Continue reading
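The pruning that the lottery-ticket experiments build on is one-shot magnitude pruning: keep the fraction of weights with the largest magnitude and zero the rest. A minimal sketch (the function name and interface are assumptions for illustration):

```python
import numpy as np

def magnitude_prune(weights, keep_fraction):
    """Zero out all but the top keep_fraction of weights by magnitude.

    Returns the pruned weights and the binary mask that was applied.
    Ties at the threshold may keep slightly more than the target count.
    """
    flat = np.abs(weights).ravel()
    k = int(len(flat) * keep_fraction)
    # Magnitude of the k-th largest weight is the survival threshold.
    threshold = np.sort(flat)[::-1][k - 1] if k > 0 else np.inf
    mask = np.abs(weights) >= threshold
    return weights * mask, mask
```

In the lottery-ticket setup, the surviving mask is then applied to the network's *original initialization* and the sparse subnetwork is retrained from there.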


Cyclical Learning Rates for Training Neural Networks

Introduction

  • Conventional wisdom says that when training neural networks, the learning rate should monotonically decrease. This insight forms... Continue reading