
Homework1 - Introduce a New NN with Memory

The paper we present is "CNN-RNN: A Unified Framework for Multi-label Image Classification", published by Baidu Research (China) at CVPR 2016.

Team members: 賴筱婷, 鄭乃嘉, 周育潤, 翁慶年

Responsibilities:

Introduction: 賴筱婷

Proposed Method: 翁慶年

Experiment and Conclusion: 周育潤, 鄭乃嘉

Motivation

This paper addresses the multi-label image classification problem by exploiting an RNN (LSTM) to model label co-occurrence dependencies; a minimal sketch of the idea follows.
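The sketch below is a toy PyTorch illustration of that idea, not the authors' implementation: a (pretrained) CNN is stood in by a linear projection over precomputed image features, and an LSTM emits one label per step conditioned on the image feature and the labels predicted so far, so label co-occurrence is carried in the recurrent state. All layer sizes and names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CNNRNNSketch(nn.Module):
    """Toy sketch of the CNN-RNN idea (not the paper's code):
    a CNN encodes the image, an LSTM predicts labels one step at a time,
    so the recurrent state captures label co-occurrence dependencies."""

    def __init__(self, num_labels, img_dim=2048, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.img_proj = nn.Linear(img_dim, embed_dim)                 # stand-in for a pretrained CNN
        self.label_embed = nn.Embedding(num_labels + 1, embed_dim)    # +1 for a START token
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim + embed_dim, num_labels)      # image feature + RNN state -> label scores

    def forward(self, img_feat, label_seq):
        # img_feat: (B, img_dim) image features; label_seq: (B, T) ids of labels emitted so far
        img = self.img_proj(img_feat)                        # (B, embed_dim)
        h, _ = self.lstm(self.label_embed(label_seq))        # (B, T, hidden_dim)
        img = img.unsqueeze(1).expand(-1, h.size(1), -1)     # broadcast image feature over time steps
        return self.out(torch.cat([h, img], dim=-1))         # (B, T, num_labels)

# Shape check with random inputs: batch of 4 images, 3 labels already predicted.
model = CNNRNNSketch(num_labels=80)
scores = model(torch.randn(4, 2048), torch.randint(0, 81, (4, 3)))
print(scores.shape)  # torch.Size([4, 3, 80])
```

The paper additionally learns a joint image/label embedding space in which predictions are scored against label embeddings; the sketch collapses that into a single output layer for brevity.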

  • Sumit Chopra from Facebook. Reasoning, Attention and Memory slides
  • Edward Grefenstette from Google DeepMind. Beyond Seq2Seq with Augmented RNNs slides

To-Do

  • [+10] Please find a recent paper (2014-2015) which introduced a NN with memory.
  • [+50] Write a report to briefly introduce the paper;
  • [+40] then, focus on discussing the unique properties of the new NN and where it can be applied to take advantage of the properties.

Candidates

  • Search RNN on Arxiv-sanity link
  • Jianpeng Cheng et al. Long Short-Term Memory-Networks for Machine Reading. arXiv ’16.
  • Nal Kalchbrenner et al. Grid Long Short-Term Memory. arXiv ’16. (From DeepMind, Alex)
  • Kaisheng Yao et al. Depth-Gated LSTM. arXiv ’15.
  • Shuohang Wang et al. Learning Natural Language Inference with LSTM. arXiv ’15.
  • Junyoung Chung et al. Gated Feedback Recurrent Neural Networks. arXiv ’15.

Other

  • Due on Oct. 3rd before class.
