Carnegie Mellon University
95-828 Machine Learning for Problem Solving
Spring 2026


Course Syllabus


LECTURES:

I will provide slides for the lectures; these will be uploaded to Canvas before each lecture. Feel free to print them and bring them to class for annotation.

I will also assign optional readings from various book chapters for each lecture. While optional, they are highly recommended for deepening your understanding. The recommended books are listed under Resources. To stay on track, you are advised to attend all lectures, read the assigned book chapters, and follow up with questions in lectures, office hours, recitations, and/or Piazza.

RECITATIONS:

A recitation session will be held by one of the TAs every Friday (unless you are informed otherwise). The recitation will review the week's material and answer any questions you might have, including about the homework.

PART I. ML Foundations & Classical Supervised Models

Week 1: Intro to Machine Learning & Linear Regression

  • Definition, types, applications of ML
  • ML workflow
  • Data preprocessing & feature engineering
  • Linear regression

Weeks 2 - 3: Data Processing & Logistic Regression

  • EDA and feature engineering
  • Logistic regression
  • Regression vs. classification

Weeks 4 - 5: Model Selection & Non-parametric Models

  • Bias-variance tradeoff
  • Metrics, train/test split, cross-validation
  • Hyperparameter optimization
  • kNN and variants

Weeks 6 - 7: Decision Trees & Ensemble Methods

  • Decision trees (classification & regression)
  • Random forests
  • Boosted trees (XGBoost, LightGBM)

Spring Break (no classes, no recitation)

PART II. Deep Learning, Model Concepts & Learning Paradigms

Weeks 8 - 9: Neural Networks & Modern Architectures

  • Neural networks (tabular)
  • Regularization
  • CNNs, RNNs

Week 10: Memory, Attention & Generative Models

  • LSTM
  • Attention & Transformers
  • Diffusion models

Week 11: Important Model Concepts & Learning Paradigms

  • Uncertainty estimation & calibration
  • Interpretability, robustness, fairness
  • Learning paradigms: supervised, semi/self-supervised
  • Active, reinforcement, meta, continual, federated, transfer learning

Spring Carnival (no Thu class, no Fri recitation)

Week 12: Unsupervised Learning

  • Clustering
  • Anomaly detection
  • Dimensionality reduction

PART III. ML on Specialized Data Modalities

Week 13: Tabular & Graph ML

  • Tabular ML & foundation models (TabPFN)
  • Outlier detection: kNN, Isolation Forest, DeepSVDD, Autoencoders
  • Graph embeddings, GNNs, transformers
  • Heterogeneous graphs & graph partitioning

Week 14: Temporal, Text & Image ML

  • Temporal models: ARIMA, LSTM, TFT
  • Text ML: LLMs
  • Image ML: classification, detection, segmentation
  • Multimodal learning: CLIP, BLIP-2

Final Exam