I will be teaching a graduate course at Washington University in St. Louis for the
fall 2016 semester in the Master of Science in Information Systems (MSIS) program. You can see the course description here.
T81 INFO 558 Applications of Deep Neural Networks
Deep learning is a group of exciting new technologies for neural networks. By using a
combination of advanced training techniques and neural network architectural components, it
is now possible to train neural networks of much greater complexity. This course will
introduce the student to deep belief neural networks, rectified linear units (ReLU),
convolutional neural networks, and recurrent neural networks. High-performance
computing (HPC) aspects will demonstrate how deep learning can be leveraged on
graphics processing units (GPUs) as well as on grids. Deep learning allows a model to
learn hierarchies of information in a way that is similar to the function of the human
brain. Focus will be primarily upon the application of deep learning, with some
introduction to its mathematical foundations. Students will use the
Python programming language to architect deep learning models for several real-world
data sets and interpret the results of these networks.
I will make use of my book on deep learning, as well as some supplementary information.
This will be a hands-on technical course, so we will use the Python programming language
to work with deep neural networks. At this point I believe that I will make use of
both Google's TensorFlow and Microsoft's CNTK.
I feel that both of these will give the students an easy install path, depending on their system.
Currently, CNTK has an easy install path for Linux and Windows (but not Mac); TensorFlow has
an easy install path for Mac and Linux (but not Windows). I am fond of the Caffe package as well, but it does not have an easy install path for Windows.
For my own work I use Theano, but it does not have a particularly easy install path for anything! Whatever path I end up taking, I will update my book's examples to include it.
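To give a sense of what the hands-on work will look like, here is a minimal sketch of the kind of TensorFlow code students might write, using the 1.x-era graph API: one small fully connected layer with a ReLU activation, run inside a session. The shapes and names are placeholders for illustration only, not material from the course.

```python
# Minimal TensorFlow (1.x graph API) sketch: a single dense layer with a
# ReLU activation, evaluated in a session. Shapes/names are illustrative.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 4])   # a batch of 4-feature rows
W = tf.Variable(tf.random_normal([4, 8]))         # layer weights
b = tf.Variable(tf.zeros([8]))                    # layer bias
hidden = tf.nn.relu(tf.matmul(x, W) + b)          # rectified linear units

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(hidden, feed_dict={x: np.random.rand(2, 4)})
    print(out.shape)  # (2, 8)
```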
My schedule, subject to change, is:
- 08/29/2016: Class 01: Python for Machine Learning
- 09/05/2016: Labor Day, No class
- 09/12/2016: Class 02: Neural Network Basics, Ch1
- 09/19/2016: Class 03: Training a Neural Network, Ch 4, Ch 5
- 09/26/2016: Class 04: Introduction to TensorFlow
- 10/03/2016: Class 05: Modeling and Kaggle
- 10/10/2016: Class 06: Backpropagation, Ch 6
- 10/17/2016: Class 07: Neural Networks for Classification
- 10/24/2016: Class 08: Neural Networks for Regression
- 10/31/2016: Class 09: Preprocessing
- 11/07/2016: Class 10: Regularization and Dropout, Ch 12
- 11/14/2016: Class 11: Timeseries and Recurrent, Ch 13
- 11/21/2016: Class 12: Convolutional Neural Networks, Ch 10
- 11/28/2016: Class 13: Architecting Neural Networks, Ch 14
- 12/05/2016: Class 14: Special Applications of Neural Networks
- 12/12/2016: Class 15: GPU, HPC and Cloud
- 12/19/2016: Final Exam
At this point I am planning for two of the class assignments to use Kaggle. I think it
will be a great way to introduce the students to Kaggle.
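As a rough illustration of the workflow such an assignment would involve (the file names, columns, and model below are hypothetical placeholders, not an actual assignment), the basic Kaggle pattern is: load the competition's train and test CSVs, fit a model, and write a submission file.

```python
# Hypothetical Kaggle workflow sketch: load provided CSVs, fit a simple
# model, and produce a submission file. File and column names are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

train = pd.read_csv("train.csv")
test = pd.read_csv("test.csv")

features = [c for c in train.columns if c not in ("id", "target")]
model = LogisticRegression()
model.fit(train[features], train["target"])

submission = pd.DataFrame({
    "id": test["id"],
    "target": model.predict(test[features]),
})
submission.to_csv("submission.csv", index=False)
```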
This will be my first time teaching a graduate course. I’ve taught undergrad at Maryville University and St. Louis Community College,
but it’s been a few years. I’ve always taught technical courses, such as various levels of Java, C++, C# and SQL. WUSTL is the
university where I got my master’s degree, so I am somewhat familiar with the school.
I will have a website that will contain all of my course information; I will post more about it once the course starts.