STATS/DATASCI 315 Fall 2022

Statistics and Artificial Intelligence



Overview

Statistical concepts are increasingly integrated into artificial intelligence applications, which often draw on a large amount of data received, transmitted, and generated by computers or networks of computers. This course introduces students to statistics and machine learning techniques such as deep neural networks, with applications to text and image data.

By the end of this course, students will be familiar with the deep learning paradigm and will be able to analyze data using different classes of deep learning models. The course introduces the basics of deep neural networks and their applications to various AI tasks.

Syllabus

For course policies, course requirements, and grading policies, please see the syllabus [link].

Piazza

Students should sign up for Piazza [link] to join course discussions.

All communication with the teaching team (the instructor and the GSIs) should be conducted over Piazza; please do not email. If you'd like to reach the instructor or the GSIs with private questions, please post a private note on Piazza that is visible only to the instructor and the GSIs. See here for detailed instructions. The GSIs and the instructor will monitor Piazza, endorse correct student answers, and answer questions that remain after discussion.

As a bonus, up to 3 percentage points will be added to your final course grade based on Piazza participation. You will receive 3x/100 bonus percentage points if your total number of Piazza contributions lies in the top x-th quantile among all students. The number of Piazza contributions will be determined by Piazza class statistics.
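For concreteness, below is a minimal Python sketch of how such a bonus could be computed. It assumes that x is a student's percentile rank among all contribution counts; the function name, the percentile convention, and the toy data are illustrative assumptions, not the teaching team's actual procedure.

    # Minimal sketch of the Piazza participation bonus (illustrative only).
    # Assumes x is a student's percentile rank: the percentage of students
    # whose contribution counts do not exceed theirs. The actual rule is
    # determined by the teaching team from Piazza class statistics.

    def piazza_bonus(my_contributions, all_contributions):
        """Return bonus percentage points (between 0 and 3) for one student."""
        n = len(all_contributions)
        # Percentile rank: share of students at or below this contribution count.
        x = 100 * sum(c <= my_contributions for c in all_contributions) / n
        return 3 * x / 100

    # Example: a student near the top of the class earns close to 3 points.
    counts = [0, 2, 3, 5, 8, 13, 21, 40]
    print(round(piazza_bonus(21, counts), 2))  # 2.62 with this toy data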

Teaching Team, Office Hours, and Labs

  • Instructor: Yixin Wang
  • GSIs:
    • Yash Patel (Office hours: Tue 2-4pm in Angell Hall Rm G219 and Fri 2-4pm on Zoom)
    • Easton Huch (Office hours: Thurs 4-5pm in Angell Hall Rm G219 and Mon 2-4pm on Zoom)
  • The website for the labs is here.

Course Calendar

  • Lecture: Tue/Thur 10:00-11:20am
  • Location: East Hall 1360
  • Google Calendar: The Google Calendar below ideally contains all events and deadlines for students' convenience. Please feel free to add this calendar to your own Google Calendar by clicking the plus (+) button in the bottom right corner of the calendar below. Any ad hoc changes to the schedule will appear on the calendar first.

Lecture Schedule

The schedule is subject to change.

IDL = Introduction to Deep Learning by Charniak
DLPy = Deep Learning with Python (2nd edition) by Chollet
DL = Deep Learning by Goodfellow, Bengio and Courville [link]
NNDL = Neural Networks and Deep Learning by Nielsen [link]
D2L = Dive into Deep Learning by Zhang, Lipton, Li and Smola [link]

In the Readings column, '' indicates the same readings as the previous lecture.

Lecture | Date | Topic | Readings
Lecture 1 | 08/30 | Introduction | DLPy, Chap. 1; DL, Chap. 1; D2L, Chap. 1
Lecture 2 | 09/01 | Neural Nets as Universal Approximators | D2L, Sec. 5.1; DL, Sec. 6.1-4
Lecture 3 | 09/06 | Logistic Regression as a Neural Network I | D2L, Sec. 3.4, 4.1-5
Lecture 4 | 09/08 | Logistic Regression as a Neural Network II | DLPy, Sec. 2.4.1, 2.4.3
Lecture 5 | 09/13 | First steps with TensorFlow | DLPy, Sec. 3.1-4, 3.5.1-2; DLPy, Sec. 2.4.4
Lecture 6 | 09/15 | First steps with TensorFlow | DLPy, Sec. 3.5.3-4
Lecture 7 | 09/20 | Vectorization and Linear Algebra Bootcamp I | D2L, Sec. 19.1.1-2
Lecture 8 | 09/22 | Vectorization and Linear Algebra Bootcamp II | D2L, Sec. 19.1.3-7, 19.1.9
Lecture 9 | 09/27 | Neural Networks I | D2L, Sec. 5.1-2
Lecture 10 | 09/29 | Neural Networks II | D2L, Sec. 5.3
Lecture 11 | 10/04 | Getting started with NNs: Open the Black Box of Keras | D2L, Sec. 4.4
Lecture 12 | 10/06 | Getting started with NNs: Classification and Regression | D2L, Sec. 5.7
Lecture 13 | 10/11 | Generalization; Evaluating ML models | D2L, Sec. 5.5
Lecture 14 | 10/13 | Improving model fit | ''
Fall break | 10/18 | -- | --
Lecture 15 | 10/20 | Regularizing your model | ''
Lecture 16 | 10/25 | Convolutional Neural Networks I | D2L, Chap. 7-8
Lecture 17 | 10/27 | Convolutional Neural Networks II | ''
Lecture 18 | 11/01 | Convolutional Neural Networks III | ''
Lecture 19 | 11/03 | Convolutional Neural Networks IV | ''
Lecture 20 | 11/08 | Convolutional Neural Networks V | ''
Lecture 21 | 11/10 | Deep Learning for Time Series I | D2L, Chap. 9-10
Lecture 22 | 11/15 | Deep Learning for Time Series II | ''
Lecture 23 | 11/17 | Deep Learning for Time Series III | ''
Lecture 24 | 11/22 | Deep Learning for Time Series IV | ''
Thanksgiving break | 11/24 | -- | --
Lecture 25 | 11/29 | Deep Generative Modeling I | D2L, Chap. 18
Lecture 26 | 12/01 | Deep Generative Modeling II |
Lecture 27 | 12/06 | Deep Generative Modeling III | ''
Lecture 28 | 12/08 | Summary (and wiggle room) | --


Final Project

The final project is an individual project. For the requirements of the final project, please see the final project guidelines. The LaTeX template for the project report is here.

Acknowledgements

The course materials are adapted from related courses offered by Alexander Amini, Alfredo Canziani, Justin Johnson, Andrew Ng, Bhiksha Raj, Grant Sanderson, Rita Singh, Ava Soleimany, and Ambuj Tewari.