This lecture considers the importance of representations and how deep learning intersects with structured prediction. We first review neural networks, then discuss several deep neural network models that have recently proven successful across a range of applications.
Lectures and Readings
- Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.
- Salakhutdinov, Ruslan, and Geoffrey Hinton. Deep Boltzmann Machines. AISTATS 2009.
- Socher, Richard, John Bauer, Christopher D. Manning, and Andrew Y. Ng. Parsing with Compositional Vector Grammars. ACL 2013.
- Sutskever, Ilya, Oriol Vinyals, and Quoc V. Le. Sequence to Sequence Learning with Neural Networks. NIPS 2014.
- Dyer, Chris, Adhiguna Kuncoro, Miguel Ballesteros, and Noah A. Smith. Recurrent Neural Network Grammars. In Proceedings of NAACL-HLT, pp. 199-209, 2016.
- Kim, Yoon, Carl Denton, Luong Hoang, and Alexander M. Rush. Structured Attention Networks. arXiv preprint arXiv:1702.00887, 2017.