## Overview

1. Crash courses on graph signal processing
2. ... and on information theory
3. Graph entropy as a global and local measure
4. GRAPH UNCERTAINTY PRINCIPLE!
5. Applications and future work

### What is Graph Signal Processing?

What's wrong with discrete time signal processing?

1. Only works on regular domains
2. Dichotomy: functional vs probabilistic
3. Understand jokes like "Fourier only plays with his imaginary friend." "Don't worry - it's just a phase."

## GSP 101

Shift! On a graph, shifting a signal means multiplying it by the adjacency matrix A, the analogue of the unit delay in classical DSP.

Graph Fourier transform: expand signals in an eigenbasis of the shift operator (when the adjacency matrix is not diagonalizable, use its Jordan decomposition instead).

Classical discrete-time signal processing is the special case where the graph is the cycle graph: the shift is the circular delay, and its eigenvectors are the DFT basis.
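A minimal sketch (plain Python, hypothetical helper names) of the cycle-graph correspondence above: applying the graph shift on the directed n-cycle is exactly the classical circular time delay.

```python
def cycle_adjacency(n):
    """Adjacency matrix of the directed n-cycle: edge i -> (i+1) mod n."""
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        A[(i + 1) % n][i] = 1  # column i feeds row (i+1) mod n
    return A

def graph_shift(A, x):
    """Apply the shift operator: (A x)[i] = sum_j A[i][j] * x[j]."""
    n = len(x)
    return [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]

x = [1, 2, 3, 4]
A = cycle_adjacency(4)
print(graph_shift(A, x))  # circular delay: [4, 1, 2, 3]
```

On any other graph, the same matrix-vector product still defines the shift; only the adjacency pattern changes.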

## IT 101

We are running short on time

Uniform distribution → maximal entropy (every outcome is equally surprising)

Gaussian-shaped (concentrated) distribution → lower entropy (mass piles up near the mean)
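A minimal sketch (plain Python, made-up 8-symbol alphabet) comparing the Shannon entropy of a uniform distribution with a concentrated, bell-shaped one:

```python
import math

def entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [1 / 8] * 8
# Roughly Gaussian-shaped weights over the same 8 symbols, normalized to sum to 1.
raw = [math.exp(-((i - 3.5) ** 2) / 2) for i in range(8)]
peaked = [w / sum(raw) for w in raw]

print(entropy(uniform))  # 3.0 bits -- the maximum for 8 symbols
print(entropy(peaked))   # strictly less than 3.0 bits
```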

## Graph Entropy (Motivation)

• A relaxed prefix-free coding problem
• Each vertex represents a symbol in the alphabet
• Two vertices are distinguishable iff there exists a path from one to the other
• (Signal perspective: whether or not one node can influence another)

NP-hard to compute in general!
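A minimal sketch (plain Python, hypothetical graph) of the distinguishability test above: two vertices are distinguishable iff a directed path runs from one to the other, checked here with BFS reachability.

```python
from collections import deque

def reachable(adj, src, dst):
    """BFS: is there a directed path from src to dst?"""
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def distinguishable(adj, u, v):
    """Distinguishable iff a path exists in at least one direction."""
    return reachable(adj, u, v) or reachable(adj, v, u)

adj = {"a": ["b"], "b": ["c"], "c": [], "d": []}
print(distinguishable(adj, "a", "c"))  # True: path a -> b -> c
print(distinguishable(adj, "a", "d"))  # False: no path in either direction
```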

## Graph Entropy (Definition 3)

• X ranges over the vertices of G
• Y ranges over the power set of the vertices of G
• X ~ p
• P(X ∈ Y) = 1

## PTIME Graph Entropy

1. Separate the graph into connected components.
2. For each connected component, find its strongly connected components.
3. Take the determinant of the cofactor of each vertex of the root strongly connected component.
4. Compute the entropy of each connected component.
5. Sum to get the overall entropy.
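A minimal sketch (plain Python, hypothetical example graph) of the first two steps above: split the graph into weakly connected components, then find the strongly connected components inside it via Kosaraju's algorithm.

```python
def connected_components(adj):
    """Weakly connected components: ignore edge direction."""
    undirected = {u: set() for u in adj}
    for u, nbrs in adj.items():
        for v in nbrs:
            undirected[u].add(v)
            undirected[v].add(u)
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, stack = [], [start]
        seen.add(start)
        while stack:
            u = stack.pop()
            comp.append(u)
            for v in undirected[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        comps.append(comp)
    return comps

def strongly_connected_components(adj):
    """Kosaraju: order vertices by finish time, then sweep the reverse graph."""
    seen = set()

    def dfs(u, graph, out):
        # Iterative DFS that appends vertices in postorder.
        stack = [(u, iter(graph.get(u, [])))]
        seen.add(u)
        while stack:
            node, it = stack[-1]
            advanced = False
            for v in it:
                if v not in seen:
                    seen.add(v)
                    stack.append((v, iter(graph.get(v, []))))
                    advanced = True
                    break
            if not advanced:
                stack.pop()
                out.append(node)

    order = []
    for u in adj:
        if u not in seen:
            dfs(u, adj, order)
    reverse = {u: [] for u in adj}
    for u, nbrs in adj.items():
        for v in nbrs:
            reverse[v].append(u)
    seen, sccs = set(), []
    for u in reversed(order):
        if u not in seen:
            scc = []
            dfs(u, reverse, scc)
            sccs.append(scc)
    return sccs

adj = {1: [2], 2: [3], 3: [1], 4: [3], 5: []}
print(connected_components(adj))           # vertices 1-4 in one component, 5 alone
print(strongly_connected_components(adj))  # {1,2,3} is one SCC; 4 and 5 are singletons
```

Steps 3-5 (the cofactor determinants and the per-component entropies) would then run independently on each component.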

## Potential Applications

• Better network design
• Better filter bank design
• Signal recovery from nodal frequency observation
• Finding the counterpart of Gaussian distribution in other graphs
• Generalizing the connection between Hilbert space and signal processing

That was a lot of information at once.

Here's a fluffy dog.

Good luck with finals :)