Welcome to our Machine Learning repository! Here you will find a variety of projects, from polynomial regression to fully-connected neural networks from scratch, SVMs, and Gaussian Processes. In Part II, we analyze Independent Component Analysis, Graphical Models, EM, and VAEs.
Polynomial regressions of various orders used as prediction functions, shown together with the data and the original sine function.
Bayesian linear regression model. Left: plot of the predictive distribution. Right: 100 polynomials sampled from the parameter posterior distribution.
MNIST. Left: visualization of the first 8 digits of the training set. Right: visualization of the learned weights.
Left: easiest digits to classify. Right: hardest digits to classify.
Weights of the hidden layer at epochs 0, 4, and 9.
Left: multiclass logistic regression. Right: multilayer perceptron.
Gaussian Processes.
Support Vector Machines.
In this assignment, we implement the Independent Component Analysis algorithm, as described in chapter 34 of David MacKay's book "Information Theory, Inference, and Learning Algorithms".
Results of signal reconstruction using different priors and different initializations of the unmixing matrix W.
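As a sketch of the kind of update involved, below is a minimal natural-gradient ascent step for maximum-likelihood ICA in the spirit of MacKay's Chapter 34. The 1/cosh source prior (giving phi(a) = -tanh(a)), the function name, and the learning rate are illustrative assumptions, not the assignment's exact code.

```python
import numpy as np

def ica_natural_gradient_step(W, X, lr=0.05):
    """One natural-gradient ascent step for maximum-likelihood ICA.

    Assumes a heavy-tailed source prior p(s) proportional to 1/cosh(s),
    for which phi(a) = d/da log p(a) = -tanh(a). Hypothetical helper,
    not the assignment's exact implementation.

    W: (D, D) unmixing matrix, X: (D, N) observed mixtures.
    """
    N = X.shape[1]
    A = W @ X                      # putative source activations a = W x
    Z = -np.tanh(A)                # phi(a) for the 1/cosh prior
    # Natural-gradient form: (I + E[phi(a) a^T]) W
    grad = W + (Z @ A.T / N) @ W
    return W + lr * grad
```

Iterating this update from some initial W (e.g. the identity) drives E[phi(a) a^T] toward -I, at which point the estimated sources decorrelate under the chosen prior.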
In this assignment, we implement the sum-product and max-sum algorithms for factor graphs over discrete variables. We apply these algorithms to a medical graph in order to infer the most probable disease.
Medical Directed Graph.
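As a minimal illustration of a sum-product computation, the snippet below computes a marginal on a two-variable chain factor graph. The factor tables are made-up placeholders, not the assignment's medical graph.

```python
import numpy as np

# Chain factor graph: p(x1, x2) ∝ f1(x1) * f12(x1, x2) * f2(x2),
# with both variables binary. All tables are illustrative values.
f1 = np.array([0.6, 0.4])                  # unary factor on x1
f2 = np.array([0.3, 0.7])                  # unary factor on x2
f12 = np.array([[0.9, 0.1], [0.2, 0.8]])   # pairwise factor f12[x1, x2]

# Sum-product message from x1's side into x2:
# m(x2) = sum over x1 of f1(x1) * f12(x1, x2)
m_to_x2 = f12.T @ f1

# Marginal of x2: incoming message times its own unary factor, normalized
p_x2 = m_to_x2 * f2
p_x2 /= p_x2.sum()
```

On a tree-structured graph the same idea generalizes: each node combines the messages from its neighbors, and the product of all incoming messages at a variable gives its marginal; replacing the sums with maxima yields max-sum.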
In this assignment, we implement the Expectation Maximization (EM) algorithm and a Variational Autoencoder (VAE) on the MNIST dataset of handwritten digits.
The VAE's learned manifold of the MNIST dataset of handwritten digits.
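To give a flavor of the EM part, below is one EM step for a mixture of Bernoullis, a common model choice for binarized MNIST. The mixture form, function name, and smoothing constant are assumptions for this sketch, not necessarily the assignment's exact setup.

```python
import numpy as np

def em_step(X, mu, pi, eps=1e-10):
    """One EM step for a Bernoulli mixture model (illustrative sketch).

    X:  (N, D) binary data matrix
    mu: (K, D) Bernoulli means per component
    pi: (K,)   mixing weights
    """
    # E-step: responsibilities r[n, k] ∝ pi_k * prod_d mu_kd^x_nd (1-mu_kd)^(1-x_nd),
    # computed in log space for numerical stability.
    log_p = X @ np.log(mu + eps).T + (1 - X) @ np.log(1 - mu + eps).T
    log_r = np.log(pi + eps) + log_p
    log_r -= log_r.max(axis=1, keepdims=True)
    r = np.exp(log_r)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the soft counts.
    Nk = r.sum(axis=0)
    mu_new = (r.T @ X) / Nk[:, None]
    pi_new = Nk / X.shape[0]
    return mu_new, pi_new
```

Alternating these steps monotonically increases the data log-likelihood; the VAE part of the assignment instead optimizes a variational lower bound (the ELBO) with gradient descent.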
The majority of the projects come from the lab assignments of the Machine Learning 1 and Machine Learning 2 courses of the MSc in Artificial Intelligence at the University of Amsterdam.