NLP - Attention Is All You Need

In this paper, the authors proposed a new, simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. The Transformer is the first transduction model relying entirely on self-attention to compute representations of its input and output without using sequence-aligned RNNs or convolution.
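
The core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal PyTorch sketch of that single operation; the tensor shapes and toy inputs are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5    # query-key similarities
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)              # attention distribution over keys
    return weights @ v                               # weighted sum of values

# toy usage: one batch, two heads, sequence length 5, d_k = 8
q = k = v = torch.randn(1, 2, 5, 8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 2, 5, 8])
```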

NLP - Neural Machine Translation by Jointly Learning to Align and Translate

In this paper, the authors conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of the basic encoder-decoder architecture. Therefore, they extended the basic encoder-decoder by allowing the model to automatically (soft-)search for the parts of a source sentence that are relevant to predicting a target word, rather than forcing the encoder to compress the whole sentence into a single vector.
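
The soft search is an additive (Bahdanau-style) alignment model, score(s, h_j) = v^T tanh(W s + U h_j), whose softmax-normalised scores weight the encoder states into a context vector. A rough PyTorch sketch of that scoring step follows; the layer names and dimensions are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style alignment: score(s, h_j) = v^T tanh(W s + U h_j)."""
    def __init__(self, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.W = nn.Linear(dec_dim, attn_dim, bias=False)
        self.U = nn.Linear(enc_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(self.W(dec_state).unsqueeze(1) + self.U(enc_outputs)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)          # (batch, src_len)
        context = (weights.unsqueeze(-1) * enc_outputs).sum(1)   # (batch, enc_dim)
        return context, weights
```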

NLP - Summary of Sequence to Sequence Learning with Neural Networks

The authors developed a straightforward application of the Long Short-Term Memory (LSTM) architecture that performs English-to-French translation.
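
Concretely, one LSTM reads the source sentence into a fixed-dimensional vector and a second LSTM generates the translation conditioned on that vector. The sketch below is a single-layer PyTorch approximation; the paper's deep (4-layer) LSTMs and source-sentence reversal are omitted, and all sizes are assumptions.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Encoder LSTM compresses the source; the decoder LSTM is initialised
    from its final (h, c) state and predicts the target token by token."""
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        _, state = self.encoder(self.src_emb(src_ids))     # state = (h_n, c_n)
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)                           # logits over target vocab
```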

Action Recognition - Summary of Two-Stream CNN Paper

The authors developed a two-stream convolutional network model for action recognition in videos. The architecture is based on two separate recognition streams: a spatial stream that recognizes actions from still video frames and a temporal stream that recognizes actions from motion in the form of dense optical flow, with the two predictions combined by late fusion.
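
A rough PyTorch sketch of the idea, using ResNet-18 backbones as stand-ins for the paper's original CNNs and simple score averaging as the late fusion; the flow-channel count and fusion rule are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class TwoStreamNet(nn.Module):
    """Spatial stream sees a single RGB frame; temporal stream sees a stack
    of optical-flow fields; class scores are fused by averaging (late fusion)."""
    def __init__(self, num_classes, flow_channels=20):   # e.g. 10 flow frames x (x, y)
        super().__init__()
        self.spatial = models.resnet18(weights=None, num_classes=num_classes)
        self.temporal = models.resnet18(weights=None, num_classes=num_classes)
        # adapt the temporal stream's first conv to the stacked-flow input
        self.temporal.conv1 = nn.Conv2d(flow_channels, 64, kernel_size=7,
                                        stride=2, padding=3, bias=False)

    def forward(self, rgb, flow):
        return (self.spatial(rgb) + self.temporal(flow)) / 2
```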

Object Detection - Summary of Faster R-CNN Paper

The authors developed the Faster R-CNN network model for object detection. This network introduces a Region Proposal Network (RPN) that shares full-image convolutional features with the detection network (Fast R-CNN), thus enabling nearly cost-free region proposals.
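
The key component is the RPN head: a small network slid over the shared convolutional feature map that, at each spatial location, classifies k anchor boxes as object or background and regresses their coordinates. A minimal PyTorch sketch follows; the channel counts, anchor count, and toy feature-map size are assumptions.

```python
import torch
import torch.nn as nn

class RPNHead(nn.Module):
    """Slides a small network over the shared feature map and, for each of
    k anchors per location, predicts an objectness score and 4 box offsets."""
    def __init__(self, in_channels=512, num_anchors=9):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 512, kernel_size=3, padding=1)
        self.cls = nn.Conv2d(512, num_anchors * 2, kernel_size=1)   # object vs background
        self.reg = nn.Conv2d(512, num_anchors * 4, kernel_size=1)   # box regression deltas

    def forward(self, feature_map):
        x = torch.relu(self.conv(feature_map))
        return self.cls(x), self.reg(x)

# toy usage on a fake backbone feature map (shapes are assumptions)
feats = torch.randn(1, 512, 38, 50)
scores, deltas = RPNHead()(feats)
print(scores.shape, deltas.shape)  # (1, 18, 38, 50) (1, 36, 38, 50)
```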