Deep Unsupervised Learning
NUS SoC, 2019/2020, Semester I
CS 6101 - Exploration of Computer Science Research
Thu 18:00-20:00 @ i3 #03-44 (STMI Executive Classroom)
This course is taken almost verbatim from CS 294-158-SP19 – Pieter Abbeel’s course at UC Berkeley. We are following his course’s formulation and selection of papers, with the permission of Pieter.
This is a section of the CS 6101 Exploration of Computer Science Research at NUS. CS 6101 is a 4 modular credit pass/fail module for new incoming graduate programme students to obtain background in an area with an instructor’s support. It is designed as a “lab rotation” to familiarize students with the methods and ways of research in a particular research area.
This course is also offered as a Do-it-Yourself Module (DYOM) for NUS undergraduates. Please see the Slack channel #dyom for details.
Our section will be conducted as a group seminar, with class participants nominating themselves to present the materials and lead the discussion. It is not a lecture-oriented course and is not as in-depth as Abbeel's original course at UC Berkeley; hence it is not a replacement, but rather a class to spur local interest in Deep Unsupervised Learning.
This course is offered in Session I (Weeks 3-7) and Session II (Weeks 8-13), although it is logically a single course whose second half builds on the first. Nevertheless, the material is introductory and should be understandable given some prior study.
A mandatory discussion group is on Slack. Students and guests, please log in when you are free. If you have a @comp.nus.edu.sg, @u.nus.edu, @nus.edu.sg, @a-star.edu.sg, @dsi.a-star.edu.sg or @i2r.a-star.edu.sg email address, you can create your Slack account for the group discussion without needing an invite.
For interested public participants, please send Min an email at kanmy@comp.nus.edu.sg if you need an invite to the Slack group. The Slack group is reused from previous semesters. Once you are in the Slack group, you can consider yourself registered for the course.
Details
Registration FAQ
- What topics are covered?
Generative adversarial networks, variational autoencoders, autoregressive models, flow models, energy-based models, compression, self-supervised learning, and semi-supervised learning. (For a taste, a minimal autoregressive-model sketch appears after this FAQ.)
- What are the pre-requisites?
From the original course: significant experience with probability, optimization, and deep learning.
For our NUS course iteration, you should also meet the above pre-requisites where possible. As with many machine learning courses, it is useful to have a basic understanding of linear algebra, machine learning, and probability and statistics. Taking open online courses on these subjects before or concurrently with this course is definitely advisable if you do not have the requisite understanding. You might try to follow the preflight video lectures; if these are understandable to you, then you're all good.
- Is the course chargeable? No (but see the caveat for NUS undergraduate students below); the course is free (as in no fee). NUS allows us to teach this course for free, as it is not "taught", per se: students in the class take charge of the lectures and complete a project, while the teaching staff facilitate the experience.
- Can I get course credit for taking this? Yes, if you are a first-year School of Computing doctoral student. In this case, you need to formally enrol in the course as CS6101, and you will receive one half of the 4-MC pass/fail credit that you would receive for the course, which is a lab rotation course. Even though the lab rotation is only for half the semester, such students are encouraged and welcome to complete the entire course.
Yes, also for NUS undergraduate students (in any faculty). You can enrol in this class through the Do-It-Yourself Module (Group initiated) scheme for 4 MCs. Undergraduate students must attend (physically or virtually) all 13 sessions of the course and complete the video lectures from Prof. Abbeel (in addition to the requirements below). For undergraduate students, we believe that the class counts towards your maximum workload and is chargeable as a regular module.
No, for everyone else. By this we mean that no credits, certificate, or any other formal documentation for completing the course will be given to any other participants, inclusive of external registrants and NUS students (both internal and external to the School of Computing). Such participants get the experience of learning deep learning together in a formal study group, developing camaraderie and a network of fellow peer students and the teaching staff.
- What are the requirements for completing the course? Each student must achieve 2 objectives to be deemed to have completed the course:
- Work with peers to assist in teaching two lecture sessions of the course: one lecture by co-lecturing the subject from new slides that you have prepared as a team; and another lecture as a scribe, moderating the Slack channel to add materials for discussion and taking public class notes. All lecture materials by co-lecturers and scribes will be made public.
- Complete a deep unsupervised learning project. For the project, you need only use a deep learning framework of your choice to tackle a problem on a dataset. You may choose to replicate previous work from others in scientific papers or data science challenges, or, more challengingly, you may decide to use data from your own context. (A minimal starter sketch appears after this FAQ.)
- How do external participants take this course? You may come to NUS to participate in the lectures in person, concurrently with all of our local participants. You are also welcome to participate online through Google Hangouts; we typically run a synchronous broadcast on Google Hangouts that is streamed and archived to YouTube.
During the sessions where you are responsible for co-lecturing, you will be expected to come to the class in person.
As an external participant, you are obligated to complete the course to the best of your ability. We do not encourage students who are not committed to completing the course to enrol.
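To give a flavor of the first listed topic, here is a minimal sketch of an autoregressive model: it factorizes p(x) as a product of per-step conditionals p(x_t | x_<t) and is trained by next-token prediction. This is not course material; the model, toy data, and hyperparameters are our own illustrative assumptions, written in PyTorch.

```python
# A minimal autoregressive model sketch (illustrative, not course material):
# p(x) = prod_t p(x_t | x_<t), trained by next-token prediction.
import torch
import torch.nn as nn

class TinyARModel(nn.Module):
    def __init__(self, vocab_size=2, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)  # logits for p(x_{t+1} | x_{<=t})

model = TinyARModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy data: random binary sequences; inputs and targets are offset by
# one step so the model predicts each token from its prefix.
x = torch.randint(0, 2, (64, 20))
opt.zero_grad()
logits = model(x[:, :-1])
loss = loss_fn(logits.reshape(-1, 2), x[:, 1:].reshape(-1))
loss.backward()
opt.step()
print(f"per-token negative log-likelihood: {loss.item():.3f}")
```

The flow and latent variable models covered in later weeks optimize the same log-likelihood objective, but via invertible transforms or latent variables rather than an explicit chain-rule factorization.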
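And for the project requirement, here is a minimal sketch of the kind of baseline a project could start from: a small variational autoencoder over flattened 28x28 images. Again, the architecture, hyperparameters, and stand-in data are our own illustrative assumptions, not a prescribed starting point.

```python
# A minimal VAE baseline sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, latent=16):
        super().__init__()
        self.enc = nn.Linear(784, 256)
        self.mu = nn.Linear(256, latent)
        self.logvar = nn.Linear(256, latent)
        self.dec = nn.Sequential(nn.Linear(latent, 256), nn.ReLU(),
                                 nn.Linear(256, 784))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec(z), mu, logvar

def neg_elbo(x, recon_logits, mu, logvar):
    # Negative ELBO = reconstruction loss + KL(q(z|x) || N(0, I)), per example.
    rec = F.binary_cross_entropy_with_logits(recon_logits, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return (rec + kl) / x.size(0)

model = VAE()
x = torch.rand(32, 784)  # stand-in for a batch of flattened 28x28 images
recon_logits, mu, logvar = model(x)
print(f"negative ELBO on random data: {neg_elbo(x, recon_logits, mu, logvar).item():.2f}")
```

Swapping in a real dataset (e.g., torchvision's MNIST loader) and an optimizer loop turns this into a replication-style project baseline.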
Meeting Venue and Time
For both Sessions (I and II): 18:00-20:00, Thursdays, at the STMI Executive Classroom (i3 #03-44)
For directions to NUS School of Computing (SoC) and i3: please read the directions here on how to park in CP12B and/or take the bus to SoC, and use the floorplan to locate the classroom.
People
Welcome. If you are an external visitor and would like to join us, please email Kan Min-Yen to be added to the class roll. Guests from industry, schools and other far-reaching places in SG are welcome, pending space and logistical limitations. The more, the merrier.
External guests will be listed here in due course once the course has started. Please refer to our Slack after you have been invited for the most up-to-date information.
NUS (Postgraduate, as CS6101): Session I (Weeks 3-7): CAI Qingpeng, SONG Kai, ZHU Fengbin
NUS (Postgraduate, as CS6101): Session II (Weeks 8-13): Dogukan Yigit POLAT, LIANG Yuxuan, Rishav CHOURASIA, WANG Wenjie, WANG Yiwei, WU Zhaomin, XU Cai
NUS (Undergraduates, as DYC1401): ANG Yi Zhe, Eloise LIM, Eugene LIM, Terence NEO, NEW Jun Jie, SHAO Yang
WING: Ibrahim Taha AKSU, HU Hengchang, Min-Yen Kan
External Guests: ANG Shen Ting, Takanori AOKI, Martin KODYŠ, LEE Xin Jie, Joni NGO Thuy Hang, Tram Anh NGUYEN, Harsh SHRIVASTAVA, Chenglei SI, Pardha VISWANADHA, Sunil Kumar YADAV, David YAM
Schedule
Date | Description | Deadlines |
---|---|---|
Week 1, 15 Aug | Motivation / Likelihood-based Models Part I: Autoregressive Models. Lectured by: Min. Scribed by: Ang Shen Ting, Sunil Kumar. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 2, 22 Aug | Likelihood-based Models: Autoregressive Models / Flow Models. Lectured by: Ang Mingliang, Eugene Lim. Scribed by: Shao Yang Hong, Kan Min-Yen, Joni Ngo. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 3, 29 Aug | Lossless Compression / Flow Models. Lectured by: Daniel Pyone Maung, Terence Neo, Eloise Lim, Amit Prusty. Scribed by: Eugene Lim, Ang Ming Liang, Ng Wen Xian. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 4, 5 Sep | Lecture 3a: Likelihood-based Models Part II: Flow Models (ctd., same slides as Week 2) / Lecture 3b: Latent Variable Models, Part 1. Lectured by: New Jun Jie, Eloise Lim, Shao Yang, Terence Neo. Scribed by: Taha Aksu, Shen Ting Ang, Song Kai, Daniel Maung. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 5, 12 Sep | Lecture 4a: Latent Variable Models, Part 2 / Lecture 4b: Bits-Back Coding. Lectured by: Song Kai, Hitoshi Iwasaki, David Yam. Scribed by: Terence Neo, Eloise Lim, Xueqi, Rishav Chourasia. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 6, 19 Sep | Lecture 5a: Latent Variable Models, wrap-up (same slides as Part 2) / Lecture 5b: ANS Coding (same slides as Bits-Back Coding) / Lecture 5c: Implicit Models / Generative Adversarial Networks. Lectured by: Ang Shen Ting, Sunil Kumar Yadav, Takanori Aoki, Qingpeng Cai. Scribed by: Amit Prusty, Harsh Shrivastava, Ang Yi Zhe. [Scribe Notes (.pdf)] [Additional Lecture Slides (Ang Shenting)] [Recording @ YouTube] | Preliminary project titles and team members due on Slack's #projects |
Recess Week, 26 Sep | Lecture 6a: Generative Adversarial Networks. Lectured by: Takanori Aoki, Qingpeng Cai. Scribed by: Kevin Ling (see consolidated scribe notes from Week 7). [Additional Lecture Slides (Takanori Aoki)] [Recording @ YouTube] | |
Week 7, 3 Oct | Lecture 6a: Generative Adversarial Networks (ctd.). Lectured by: Ang Yi Zhe, Wong Cheng Heng, Rishav Chourasia, Wu Zhaomin. Scribed by: Kevin Ling. [Scribe Notes (.pdf)] [Additional Lecture Slides (Ang Yi Zhe)] [Recording @ YouTube] | Preliminary abstracts due to #projects |
Week 8, 10 Oct | Lecture 7: Non-Generative Representation Learning (same slides as 6b). Lectured by: Zhaomin Wu, Tram Anh Nguyen, Pardha Viswanadha, Liling Tan. Scribed by: Fengbin Zhu, Yizhuo Zhou, Hengchang Hu. [Scribe Notes (.pdf)] [Additional Lecture Slides (Tomaso Poggio via Pardha Viswanadha)] [Recording @ YouTube] | |
Week 9, 17 Oct | Lecture 8a: Strengths/Weaknesses of Unsupervised Learning Methods Covered Thus Far / Lecture 8b: Semi-Supervised Learning / Lecture 8c: Guest Lecture by Ilya Sutskever. Lectured by: Joni Ngo, Eloise Lim, Xueqi Li, Fengbin Zhu, Eugene Lim. Scribed by: David Yam, Tram Anh Nguyen, Liling Tan, Yuxuan Liang. [Scribe Notes (.pdf)] [Additional Lecture Slides (Eugene Lim)] [Recording @ YouTube] | |
Week 10, 24 Oct | Lecture 9a: Unsupervised Distribution Alignment / Lecture 9b: Guest Lecture by Alyosha Efros. Lectured by: Hengchang Hu, New Jun Jie, Shao Yang. Scribed by: Takanori Aoki, Liu Ziyang, Wong Cheng Heng. [Scribe Notes (.pdf)] [Recording @ YouTube] | |
Week 11, 31 Oct | No lecture due to the Singapore Symposium on Natural Language Processing (SSNLP '19). | |
Week 12, 7 Nov | Lecture 10: Language Models (Alec Radford). Lectured by: Lee Xin Jie, Si Chenglei, Li Xueqi, Liu Ziyang, Wang Wenjie. Scribed by: Hitoshi Iwasaki, Wu Zhaomin. [Scribe Notes (.pdf)] [Additional Lecture Slides (Si Chenglei and Wang Wenjie)] [Recording @ YouTube] | |
Week 13, 14 Nov | No lecture due to conflicts. | Participation on the evening of the 15th STePS |
Reading Week, 21 Nov | Lecture 11: Representation Learning in Reinforcement Learning | |
Projects
Student Projects
Stay tuned!
Other Links
General Texts:
- Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning - an MIT Press book - http://www.deeplearningbook.org/
- Michael A. Nielsen, Neural Networks and Deep Learning - free, general e-book - http://neuralnetworksanddeeplearning.com/
Previous CS6101 versions run by Min:
- Deep Reinforcement Learning - http://www.comp.nus.edu.sg/~kanmy/courses/6101_1820
- Deep Learning for NLP (reprise) - http://www.comp.nus.edu.sg/~kanmy/courses/6101_1810
- Deep Learning via Fast.AI - http://www.comp.nus.edu.sg/~kanmy/courses/6101_2017_2/
- Deep Learning for Vision - http://www.comp.nus.edu.sg/~kanmy/courses/6101_2017/
- Deep Learning for NLP - http://www.comp.nus.edu.sg/~kanmy/courses/6101_2016_2/
- MOOC Research - http://www.comp.nus.edu.sg/~kanmy/courses/6101_2016/