COMP 5212: Machine Learning [Fall 2023]

Wednesday, Friday 16:30-17:50 @ Room 6591 (Lift 31-32)

Overview

Announcements

[8 Sep 2023] Today's lecture will be held online over Zoom due to the heavy rain.
[1 Sep 2023] Welcome to COMP5212!

Description

In this course, we will cover classical and advanced algorithms in machine learning. Topics include: linear models (linear/logistic regression, support vector machines), non-linear models (tree-based methods, kernel methods, neural networks), and learning theory (hypothesis space, bias/variance tradeoff, VC dimension). The course will also discuss advanced topics in machine learning such as test-time integrity in trustworthy machine learning and neural architecture search in AutoML.

Prerequisites

Basic knowledge of numerical linear algebra, probability, and calculus.

Grading Policy

Late submission policy:

Late submissions are accepted up to 2 days after the due date, with a penalty of 10% of the item's total grade per day.

Term projects

Students will work on an open-topic research project in groups. Each group may have at most 4 members (<=4). Feel free to discuss your topic choice with me offline.

Tentative Schedule and Material

Date | Topic | Slides | Readings & links | Assignments
Wed 6/9 | Overview of Machine Learning | lecture_0 | |
Fri 8/9 | Math Basics | lecture_1 | Matrix Calculus: Derivation and Simple Application (HU Pili); DL Chapters 2.1, 2.2 & 2.3 |
Wed 13/9 | Linear models | lecture_2 | |
Fri 15/9 | Optimization | lecture_3 | Convex Optimization (Boyd and Vandenberghe) Chapter 3.1; Numerical Optimization (Nocedal and Wright) Chapter 3.1 |
Wed 20/9 | Stochastic gradient descent and its variants | lecture_4 | | Written_hw1 out
Fri 22/9 | Support Vector Machine, polynomial nonlinear mapping, kernel method | lecture_5 | Stanford CS 229 notes |
Wed 27/9 | Polynomial nonlinear mapping, kernel method | lecture_6 | Stanford CS 229 notes |
Fri 29/9 | Learning theory | lecture_7 | Symmetrization |
Wed 4/10 | Uniform convergence, growth function | lecture_8 | Bias/variance tradeoff | Programming_HW1 out
Fri 6/10 | VC dimension | lecture_9 | |
Wed 11/10 | Regularization | lecture_10 | |
Fri 13/10 | Tree-based methods | lecture_11 | XGBoost |
Wed 18/10 | Neural networks | lecture_12 | |
Fri 20/10 | Neural networks for computer vision, Dropout, Batch Norm, ResNet | lecture_13 | | Written_hw2 out
Wed 25/10 | Word embedding, RNN, LSTM | lecture_14 | |
Fri 27/10 | Transformer | lecture_15 | |
Wed 1/11 | NLP pretraining, prompting | lecture_16 | |
Fri 3/11 | Clustering | lecture_17 | | Programming_HW2 out
Wed 8/11 | Limitations of deep learning: adversarial machine learning | lecture_18 | |
Fri 10/11 | Semi-supervised learning, graph convolutional network | lecture_19 | Graph Laplacians |
Wed 15/11 | Reinforcement learning | lecture_20 | David Silver's lectures |
Fri 17/11 | AutoML (neural architecture search) | lecture_21 | | Homework3 out
Wed 22/11 | Review | lecture_22 | |
Fri 24/11 | Final project presentation, part 1 | | |
Wed 29/11 | Final project presentation, part 2 | | |

References

There is no required textbook for this course. Some recommended readings are: