Course Overview
This course provides an introduction to machine learning with a special focus on engineering applications. The course starts with the mathematical background required for machine learning and covers approaches for supervised learning (linear models, kernel methods, decision trees, neural networks) and unsupervised learning (clustering, dimensionality reduction), as well as theoretical foundations of machine learning (learning theory, optimization). Evaluation will consist of mathematical problem sets and programming projects targeting real-world engineering applications.
Prerequisites
This course is intended for graduate students and qualified undergraduate students with a strong mathematical and programming background. Undergraduate-level coursework in algorithms, linear algebra, calculus, probability, and statistics is suggested. A programming background will also be necessary for the problem sets; students are expected to be familiar with Python or to learn it during the course. At CMU, this course is most similar to MLD's 10-701, though this course is meant specifically for students in engineering.
Textbooks
There will be no required textbooks, though we suggest the following to help you study (all available online):
 (KM): Machine Learning: A Probabilistic Perspective, Kevin Murphy. Online access is free through CMU's library; note that to access the library, you may need to be on CMU's network or VPN.
 (ESL): The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome Friedman.
 (TM): Machine Learning, Tom Mitchell.
 (CIML): A Course in Machine Learning, Hal Daumé III.
Piazza
We will use Piazza for class discussions. Please go to this Piazza website to join the course forum (note: you must use a cmu.edu email account to join the forum). We strongly encourage students to post on this forum rather than emailing the course staff directly (this will be more efficient for both students and staff). Students should use Piazza to:
 Ask clarifying questions about the course material.
 Share useful resources with classmates (so long as they do not contain homework solutions).
 Look for students to form study groups.
 Answer questions posted by other students to solidify your own understanding of the material.
Grading Policy
Grades will be based on the following components:
 Homework (50%): There will be 7 homeworks.
 Late submissions will not be accepted.
 There is one exception to this rule: you are given 3 "late days" (self-granted 24-hour extensions) which you can use to give yourself extra time without penalty. At most one late day can be used per assignment. This will be monitored automatically via Gradescope.
 Solutions will be graded on both correctness and clarity. If you cannot solve a problem completely, you will get more partial credit by identifying the gaps in your argument than by attempting to cover them up.
 Midterm (20%)
 Final (30%)
Staff Contact
TAs:
Jacob Hoffman (Pitt)  jhoffma1@andrew.cmu.edu  
Samarth Gupta (Pitt)  samarthg@andrew.cmu.edu  
Ritwick Chaudhry (Pitt)  rchaudhr@andrew.cmu.edu  
Shreyas Chaudhari (Pitt)  schaudh2@andrew.cmu.edu  
Soham Deshmukh (Pitt)  sdeshmuk@andrew.cmu.edu  
Mike Weber (Pitt)  mweber2@andrew.cmu.edu  
TJ Kim (Pitt)  nrangara@andrew.cmu.edu  
Sweta Hari Kumar (Pitt)  sharikum@andrew.cmu.edu 
Collaboration Policy
Group studying and collaborating on problem sets are encouraged, as working together is a great way to understand new material. Students are free to discuss the homework problems with anyone under the following conditions:
 Students must write their own solutions and understand the solutions that they write down.
 Students must list the names of their collaborators (i.e., anyone with whom the assignment was discussed).
 Students may not use old solution sets from other classes under any circumstances, unless the instructor grants special permission.
Acknowledgments
This course is based in part on material developed by Fei Sha, Ameet Talwalkar, Matt Gormley, and Emily Fox. We also thank Anit Sahu and Joao Saude for their help with course development.
Tentative Schedule
Date | Lecture | Readings | Announcements
Mon, 13th Jan | Lecture 1: Intro & Math Quiz [Slides] | |
Wed, 15th Jan | Lecture 2: Probability Review, MLE/MAP [Slides] | |
Fri, 17th Jan | Recitation | |
Mon, 20th Jan | No class (MLK Day) | |
Wed, 22nd Jan | Lecture 3: Linear Algebra Review, Least Squares | |
Mon, 27th Jan | Lecture 4: Linear Regression, part I | | HW 1 due
Wed, 29th Jan | Lecture 5: Linear Regression, part II | | HW 2 released
Mon, 3rd Feb | Lecture 6: Overfitting, Bias/Variance Tradeoff, Evaluation | |
Wed, 5th Feb | Lecture 7: Naive Bayes / Logistic Regression, part I | |
Mon, 10th Feb | Lecture 8: Naive Bayes / Logistic Regression, part II | | HW 3 released; HW 2 due
Wed, 12th Feb | Lecture 9: Multiclass Classification / Perceptron | |
Mon, 17th Feb | Lecture 10: SVM, part I | | HW 4 released; HW 3 due
Wed, 19th Feb | Lecture 11: SVM, part II | |
Mon, 24th Feb | Lecture 12: Nearest Neighbors | | HW 4 due
Wed, 26th Feb | Midterm exam | |
Mon, 2nd Mar | Lecture 14: Decision Trees | | HW 5 released
Wed, 4th Mar | Lecture 15: Boosting, Random Forest | |
Mon, 9th Mar | No class (Spring Break) | |
Wed, 11th Mar | No class (Spring Break) | |
Mon, 16th Mar | Lecture 16: Neural Networks, part I | | HW 5 due
Wed, 18th Mar | Lecture 17: Neural Networks, part II | | HW 6 released
Mon, 23rd Mar | Lecture 18: Neural Networks, part III | |
Wed, 25th Mar | Lecture 19: Clustering, part I | |
Mon, 30th Mar | Lecture 20: Clustering, part II | | HW 7 released
Wed, 1st Apr | PyTorch | | HW 6 due
Mon, 6th Apr | Lecture 21: EM | |
Wed, 8th Apr | Lecture 22: Dimensionality Reduction | |
Mon, 13th Apr | Lecture 23: Online Learning | |
Wed, 15th Apr | Lecture 24: Reinforcement Learning | | HW 7: one notebook due
Mon, 20th Apr | Guest Lecture: Special Topics | |
Wed, 22nd Apr | Guest Lecture: Special Topics | |
Mon, 27th Apr | Lecture 25: Last Lecture / Final Exam Review | | HW 7: remaining notebooks due
Wed, 29th Apr | Final Exam | |