Introduction to Machine Learning

This is the repository for the code files of the Introduction to Machine Learning lecture at the Learning and Adaptive Systems Group at ETH Zurich, taught by Prof. Dr. Andreas Krause and Prof. Dr. Fan Yang. The lecture covered the following topics:

  • Linear regression (overfitting, cross-validation/bootstrap, model selection, regularization, [stochastic] gradient descent); see the sketch below the list
  • Linear classification: Logistic regression (feature selection, sparsity, multi-class)
  • Kernels and the kernel trick (properties of kernels; applications to linear and logistic regression); k-nearest neighbor
  • Neural networks (backpropagation, regularization, convolutional neural networks)
  • Unsupervised learning (k-means, PCA, neural network autoencoders)
  • The statistical perspective (regularization as prior; loss as likelihood; learning as MAP inference)
  • Statistical decision theory (decision making based on statistical models and utility functions)
  • Discriminative vs. generative modeling (benefits and challenges in modeling joint vs. conditional distributions)
  • Bayes' classifiers (Naive Bayes, Gaussian Bayes; MLE)
  • Bayesian approaches to unsupervised learning (Gaussian mixtures, EM)

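As a flavor of the first topic, here is a minimal, self-contained sketch of ridge regression fitted with gradient descent and evaluated with k-fold cross-validation for model selection, using only NumPy. It was written for this README and is not taken from the project code; all function names, variable names, and hyperparameter values are illustrative.

```python
# Minimal sketch (illustrative, not from the project code): ridge regression
# fitted with gradient descent, with k-fold cross-validation for choosing
# the regularization strength.
import numpy as np

def fit_ridge_gd(X, y, lam=1.0, lr=0.01, n_iter=1000):
    """Minimize (1/n) * ||Xw - y||^2 + lam * ||w||^2 via gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ w - y) / n + 2.0 * lam * w
        w -= lr * grad
    return w

def cross_val_mse(X, y, lam, k=5, seed=0):
    """Average validation mean squared error over k folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = fit_ridge_gd(X[train], y[train], lam=lam)
        errors.append(np.mean((X[val] @ w - y[val]) ** 2))
    return float(np.mean(errors))

if __name__ == "__main__":
    # Synthetic data: y = X @ w_true + noise
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ w_true + 0.1 * rng.normal(size=200)

    # Model selection: pick the regularization strength with the lowest CV error
    for lam in [0.0, 0.01, 0.1, 1.0]:
        print(f"lambda={lam:>5}: CV MSE = {cross_val_mse(X, y, lam):.4f}")
```

Ridge regression also has a closed-form solution, but gradient descent is shown here because the lecture treats (stochastic) gradient descent as the general-purpose optimizer that carries over to the later topics.
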
The repository contains a folder with the code for the compulsory capstone projects, in which the theoretical knowledge from the lecture had to be applied. The code for all projects passed both the public and the private performance baselines and earned the maximum grade of 6.0/6.0.