Since this session focuses on DL / NLP, we assume participants have a solid machine learning background and know the basics of deep learning. The DL field contains a huge number of research topics. It is not practical to pick topics at random, nor to present the ABCs of DL step by step; otherwise even one semester would not be enough.
We encourage people who are actively doing DL research to join the discussion. We will discuss several interesting papers every week, after a brief review of the basics.
- warm-up - two or three sessions
  - briefly review DL concepts to make sure everybody is on the same page
  - topics: logistic regression, softmax, SVM, neural network architectures, activation functions, initialization methods, optimization methods, and other tricks
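As a quick refresher for the warm-up topics above, here is a minimal sketch of the softmax function in NumPy; the max-subtraction trick for numerical stability and the example logits are illustrative choices, not taken from any particular course material.

```python
import numpy as np

def softmax(z):
    # Subtract the max logit before exponentiating: this leaves the
    # result unchanged but avoids overflow for large logits.
    z = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)        # non-negative values that sum to 1
print(probs.sum())  # 1.0 (up to floating-point rounding)
```

The same function works on a batch of logit vectors, since the reductions are taken over the last axis.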
- paper discussion
- We encourage everybody to finish reading the selected papers within one week and comment on them in the following session.
  - We pick papers every Wednesday. Each participant proposes papers by sending a PR that modifies papers.md. We vote and select the next week's papers by midnight every Wednesday.
Here are several curated summaries of DL materials:
- https://github.com/priyaank/deep-learning
- https://github.com/kjw0612/awesome-deep-vision#image-captioning
- https://github.com/kjw0612/awesome-rnn
- https://github.com/ChristosChristofidis/awesome-deep-learning
DL research moves fast, so it may be necessary to follow arXiv:
- http://arxiv.org/list/cs.CL/recent
- http://arxiv.org/list/cs.CV/recent
- http://arxiv.org/list/cs.LG/recent
- basic machine learning concepts: classification, KNN, SVM, softmax
- DLBook Chapter 5
- CS231n classification notes, linear classifier notes
- CS231n Lecture 1, 2, 3
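To make the KNN item above concrete, here is a minimal nearest-neighbor classifier sketch in NumPy (in the spirit of the CS231n classification notes, though the function name, toy data, and majority-vote tie-breaking here are our own illustrative choices):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the query point x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over the labels of those neighbors
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2D dataset: two points per class
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.10])))  # -> 0
print(knn_predict(X, y, np.array([0.95, 1.00])))  # -> 1
```

Note that KNN has no training step: all work happens at prediction time, which is one of the trade-offs the CS231n notes discuss.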