Instructor: Cem Subakan
This class is about learning to build machine learning algorithms for signals. Unlike a standard machine learning class, it has a bit more of an EE flavor: we will often work with sequential data such as speech, audio, and other signals. We will provide the background needed to propose and carry out a research or applied project in machine learning for signal processing. In the end, our goal is to teach you how to fish (for MLSP projects)!
This class is influenced by classes of the same title at UIUC, CMU, and Indiana University.
Week 1: Linear Algebra Refresher slides
Matrix multiplication
Index, Matrix, Tensor notations
Eigenvalues, Eigenvectors
Building the reflexes to avoid for loops (see the vectorization sketch after this list)
Signal representations
Tensors, Funky Tensor Mathematics
Linear Algebraic Matrix Decompositions
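As a taste of the "avoid for loops" reflex, here is a minimal NumPy sketch (the data and sizes are made up for illustration) that computes all pairwise squared distances between signals, first with nested loops and then with broadcasting:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 64))  # 500 signals, 64 samples each

    # Loop version: 500 x 500 Python-level iterations
    D_loop = np.zeros((500, 500))
    for i in range(500):
        for j in range(500):
            D_loop[i, j] = np.sum((X[i] - X[j]) ** 2)

    # Vectorized version: ||x_i||^2 + ||x_j||^2 - 2 x_i . x_j via broadcasting
    sq = np.sum(X ** 2, axis=1)
    D_vec = sq[:, None] + sq[None, :] - 2.0 * X @ X.T

    assert np.allclose(D_loop, D_vec)

The vectorized version pushes the loops into optimized compiled code and is orders of magnitude faster.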
Week 2: Probability Refresher slides
Probability Calculus, Bayes Rule (a worked example follows this list)
Continuous and Discrete Random Variables
Multidimensional Random Variables
Probabilistic Graphical Model Conventions
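To make Bayes rule concrete, here is a tiny worked example (the probabilities are made-up numbers) that computes a posterior over a binary state from a noisy binary observation:

    import numpy as np

    p_s = np.array([0.8, 0.2])            # prior: P(S=0), P(S=1)
    p_x_given_s = np.array([[0.9, 0.1],   # P(X=0|S=0), P(X=1|S=0)
                            [0.3, 0.7]])  # P(X=0|S=1), P(X=1|S=1)

    x = 1                                 # observed value
    joint = p_s * p_x_given_s[:, x]       # P(S=s) * P(X=x|S=s)
    posterior = joint / joint.sum()       # Bayes rule: divide by the evidence P(X=x)
    print(posterior)                      # approximately [0.36, 0.64]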
Week 3: Signal Processing Refresher slides
Continuous and Discrete Signals
Sampling, Analog to Digital Conversion
Fourier Transform, Discrete Cosine Transform, Short-Time Fourier Transform
Filtering
Mechanics of Convolution in the Time Domain, Convolution as a Matrix Multiply (sketched below)
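To illustrate convolution as a matrix multiply, here is a short sketch (the filter and signal are arbitrary) that builds the Toeplitz convolution matrix and checks it against np.convolve:

    import numpy as np
    from scipy.linalg import toeplitz

    h = np.array([1.0, -2.0, 1.0])                    # a short FIR filter
    x = np.random.default_rng(0).standard_normal(8)   # input signal

    # H[i, j] = h[i - j]: each column is a shifted copy of the filter,
    # so H @ x equals the full linear convolution h * x.
    n_out = len(x) + len(h) - 1
    first_col = np.concatenate([h, np.zeros(n_out - len(h))])
    first_row = np.concatenate([[h[0]], np.zeros(len(x) - 1)])
    H = toeplitz(first_col, first_row)

    assert np.allclose(H @ x, np.convolve(h, x))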
Week 4: Machine Learning 1: Decompositions slides
Linear Regression
Linear Regression connections with Fourier Transform
Dimensionality Reduction, PCA and its variants, ICA, NMF (PCA sketched below)
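As a preview of the decomposition viewpoint, here is a minimal PCA sketch via the SVD of the centered data matrix (the data and the choice k=3 are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 10))

    Xc = X - X.mean(axis=0)                        # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    k = 3
    Z = Xc @ Vt[:k].T                              # scores: projection onto top-k directions
    X_hat = Z @ Vt[:k] + X.mean(axis=0)            # rank-k reconstruction

    print((s[:k] ** 2).sum() / (s ** 2).sum())     # fraction of variance kept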
Week 5: Machine Learning 2: Non-linear Dimensionality Reduction slides
Kernel PCA (sketched after this list)
Multidimensional Scaling
Manifold Learning Methods
ISOMAP
Locally Linear Embeddings
Laplacian Eigenmaps
t-SNE
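As a taste of non-linear dimensionality reduction, here is a minimal kernel PCA sketch (RBF kernel; gamma and the 2-D target dimension are assumed hyperparameters):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))

    # RBF (Gaussian) kernel matrix
    gamma = 0.5
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    # Double-center the kernel matrix, then eigendecompose it
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    w, V = np.linalg.eigh(J @ K @ J)                # ascending eigenvalues

    idx = np.argsort(w)[::-1][:2]                   # top-2 components
    Z = V[:, idx] * np.sqrt(np.maximum(w[idx], 0))  # 2-D embedding of the points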
Week 6: Machine Learning 3: Classification slides
Generative Classification
Discriminative Classification
Perceptron Algorithm
Logistic Regression (sketched after this list)
Kernel Logistic Regression
Neural Network Classifier
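As a taste of discriminative classification, here is a minimal logistic regression sketch trained by gradient descent on synthetic 2-D Gaussian classes (the sizes and learning rate are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (100, 2)),   # class 0
                   rng.normal(+1, 1, (100, 2))])  # class 1
    y = np.repeat([0.0, 1.0], 100)
    Xb = np.hstack([X, np.ones((200, 1))])        # append a bias feature

    w = np.zeros(3)
    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))         # sigmoid probabilities
        w -= 0.1 * Xb.T @ (p - y) / len(y)        # gradient of the mean cross-entropy

    print(np.mean((Xb @ w > 0) == y))             # training accuracy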
Week 7: Deep Learning Primer slides
Feedforward Networks
Skip Connections
Convolutional Layers
Recurrent Layers
Attention Layers
Gradient Descent and its variants (momentum sketched below)
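To preview why the variants matter, here is a small sketch comparing plain gradient descent with heavy-ball momentum on an ill-conditioned quadratic (the matrix, step size, and momentum value are arbitrary):

    import numpy as np

    A = np.diag([1.0, 100.0])                 # condition number 100
    grad = lambda w: A @ w                    # gradient of f(w) = 0.5 w^T A w

    def run(momentum, steps=100, lr=0.009):
        w, v = np.array([1.0, 1.0]), np.zeros(2)
        for _ in range(steps):
            v = momentum * v - lr * grad(w)   # velocity update
            w = w + v
        return np.linalg.norm(w)              # distance to the optimum at 0

    print(run(0.0), run(0.9))                 # momentum gets much closer to 0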
Week 8: Machine Learning 4: Clustering slides
K-means Clustering (sketched after this list)
Mixture Models
Expectation Maximization, Iterative Conditional Modes
Spectral Clustering
Hierarchical Clustering
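As a taste of clustering, here is a minimal k-means sketch on three synthetic Gaussian blobs (k, the blob locations, and the iteration count are assumed):

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, (100, 2)) for m in (-2, 0, 2)])

    k = 3
    C = X[rng.choice(len(X), k, replace=False)]          # random initial centroids
    for _ in range(50):
        # Assignment step: nearest centroid for every point
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
        z = d.argmin(axis=1)
        # Update step: each centroid moves to the mean of its points
        C = np.array([X[z == j].mean(axis=0) if np.any(z == j) else C[j]
                      for j in range(k)])

    print(C)   # centroids should sit near (-2,-2), (0,0), (2,2)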
Week 9: Time Series Models slides
Dynamic Time Warping
Hidden Markov Models, Forward-Backward Algorithm
EM for HMMs
Viterbi Decoding (sketched after this list)
HMM Variants (Mixture of HMMs, Factorial HMMs,…)
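As a taste of HMM inference, here is a minimal Viterbi decoder in log-space (the two-state HMM parameters and the observation sequence are made up):

    import numpy as np

    A = np.array([[0.9, 0.1],     # A[i, j] = P(z_t = j | z_{t-1} = i)
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],     # B[i, k] = P(x_t = k | z_t = i)
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])     # initial state distribution
    x = [0, 0, 1, 1, 1]           # observed symbol sequence

    # delta[t, i]: best log-probability of any state path ending in state i at time t
    T, S = len(x), len(pi)
    delta = np.zeros((T, S))
    psi = np.zeros((T, S), dtype=int)
    delta[0] = np.log(pi) + np.log(B[:, x[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)   # (from-state, to-state)
        psi[t] = scores.argmax(axis=0)               # best predecessor per state
        delta[t] = scores.max(axis=0) + np.log(B[:, x[t]])

    # Backtrack the most likely state sequence
    z = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        z.append(int(psi[t][z[-1]]))
    print(z[::-1])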
Week 10: Graph Signal Processing / Graph ML slides
Signals as Graphs
Graph Fourier Transform (sketched after this list)
Graph Methods for Signal Processing
Graph Convolutions
Graph Neural Networks
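As a taste of graph signal processing, here is a minimal graph Fourier transform sketch on a toy ring graph (the graph and signal are made up; the Laplacian eigenvectors play the role of Fourier basis vectors):

    import numpy as np

    # Toy graph: a ring of 8 nodes, encoded by its adjacency matrix W
    n = 8
    W = np.zeros((n, n))
    for i in range(n):
        W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0

    # Combinatorial Laplacian L = D - W; its eigenvalues act as graph frequencies
    L = np.diag(W.sum(axis=1)) - W
    lam, U = np.linalg.eigh(L)

    # Forward / inverse GFT of a signal living on the nodes
    f = np.sin(2 * np.pi * np.arange(n) / n)
    f_hat = U.T @ f                    # graph Fourier coefficients
    assert np.allclose(U @ f_hat, f)   # perfect reconstruction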
Week 11: Speech / Audio slides
Automatic Speech Recognition (ASR)
Text-to-Speech
Speech Separation / Enhancement
Interpretability in the Audio Domain
Text-Audio Multi-Modal Representations
Week 12-13: Project Presentations
There will be three homeworks (45%), labs (10%), and a project (45%) carried out by teams of 2-3 students. It is preferable that students propose their own projects, but we will also suggest several project ideas.