This repo consists of statistics algorithms. (Updated Apr 21, 2017)
Basic GANs with a variety of loss functions (KL, reverse-KL, JS, and Wasserstein), built as an exercise for my thesis with Prof. Randy Paffenroth.
Non-Negative Matrix Factorization for Gene Expression Clustering
🐍 🔬 Fast Python implementation of various Kullback-Leibler divergences for 1D and 2D parametric distributions. Also provides optimized code for kl-UCB indexes
💫 Fast Julia implementation of various Kullback-Leibler divergences for 1D parametric distributions. 🏋 Also provides optimized code for kl-UCB indexes
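As a hedged illustration of what a closed-form KL divergence between 1D parametric distributions (as in the two repos above) looks like, here is the standard formula for two univariate Gaussians; the function name is illustrative, not taken from either repo:

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ), in nats."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * sigma2 ** 2)
            - 0.5)
```

Note the asymmetry: swapping the two distributions generally changes the value, which is why "reverse-KL" appears as a separate loss above.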
Basic study of information theoretic measures and stochastic processes.
Particle Filter tracker and square-shape detection
Sequential KMeans algorithm implementation
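A minimal sketch of the sequential (online) k-means update the title refers to — each incoming point nudges its nearest centroid by a decaying 1/count step (a MacQueen-style update); the initialization and names here are illustrative assumptions, not the repo's code:

```python
def sequential_kmeans(points, k):
    """One-pass online k-means over a list of tuples (points)."""
    centroids = [list(p) for p in points[:k]]  # naive init: first k points
    counts = [1] * k
    for p in points:
        # index of the nearest centroid (squared Euclidean distance)
        j = min(range(k),
                key=lambda i: sum((c - x) ** 2 for c, x in zip(centroids[i], p)))
        counts[j] += 1
        eta = 1.0 / counts[j]  # decaying step: centroid = running mean of its points
        centroids[j] = [c + eta * (x - c) for c, x in zip(centroids[j], p)]
    return centroids
```

Unlike batch k-means, this needs only one pass and constant memory, at the cost of sensitivity to point order.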
A machine learning model to classify workouts performed in videos and predict an effectiveness metric for evaluating the performed workout
Giant Language Model Test Room, most up-to-date version.
Code, data, and tutorials for "Sense organ control in moths to moles is a gamble on information through motion"
Code for Variable Selection in Black Box Methods with RelATive cEntrality (RATE) Measures
NLP implementations including information-theoretic measures of distributional similarity, text preprocessing with shell commands, a Naive Bayes text categorization model, and Cocke-Younger-Kasami parsing.
Building a corpus whose unit distribution is approximately the same as a given target distribution, using a greedy algorithm driven by the Kullback-Leibler divergence. Can be used for text-to-speech synthesis applications.
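A sketch of that kind of greedy selection, under my own simplifying assumptions (sentences are iterables of units, the target is a dict mapping each unit to a positive probability; none of these names come from the repo): at each step, add the candidate whose inclusion minimizes KL(corpus distribution || target).

```python
import math
from collections import Counter

def kl(p, q, units):
    """KL(p || q) over the given units; zero-probability p terms contribute 0."""
    return sum(p[u] * math.log(p[u] / q[u]) for u in units if p[u] > 0)

def greedy_select(candidates, target, n_pick):
    """Greedily pick sentences that move the corpus unit distribution toward target."""
    units = list(target)
    chosen, counts, total = [], Counter(), 0
    remaining = list(candidates)
    for _ in range(min(n_pick, len(remaining))):
        best, best_kl = None, None
        for i, sent in enumerate(remaining):
            c = counts + Counter(sent)          # counts if we added this sentence
            t = total + len(sent)
            p = {u: c[u] / t for u in units}    # resulting empirical distribution
            d = kl(p, target, units)
            if best_kl is None or d < best_kl:
                best, best_kl = i, d
        sent = remaining.pop(best)
        chosen.append(sent)
        counts += Counter(sent)
        total += len(sent)
    return chosen
```

This is O(picks × candidates) per the re-evaluation at each step; real corpus-design tools typically use incremental KL updates to avoid recomputing the full distribution.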
Trending algorithm based on the article "Trending at Instagram"
Using entities from NER on GOV.UK content to power personalisation.
Neural Networks, Deep Learning, Computer Vision, Natural Language Processing, Python
Experiments with the three PPO algorithms (vanilla PPO, clipped PPO, and PPO with KL penalty) proposed by John Schulman et al., run on the 'CartPole-v1' environment.
This project implements in Python some common statistical analysis methods used in data analysis, including entropy, mutual information, the Kolmogorov–Smirnov test, Kullback-Leibler divergence (KLD), and A/B tests (Mann-Whitney U and t-tests).
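For the discrete case those first measures reduce to a few lines; a minimal sketch (base-2 logs, so results are in bits — function names are mine, not that project's API):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Mutual information then follows as I(X; Y) = KL of the joint distribution from the product of its marginals.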
PyTorch implementations of the beta divergence loss.
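The element-wise beta divergence behind such a loss has a standard form that interpolates between familiar losses; a plain-Python sketch (not the repo's PyTorch code):

```python
import math

def beta_divergence(x, y, beta):
    """Element-wise beta divergence d_beta(x, y) for x, y > 0.
    beta=0 -> Itakura-Saito, beta=1 -> generalized KL, beta=2 -> squared error / 2."""
    if beta == 0:
        return x / y - math.log(x / y) - 1.0
    if beta == 1:
        return x * math.log(x / y) - x + y
    return (x ** beta + (beta - 1.0) * y ** beta
            - beta * x * y ** (beta - 1.0)) / (beta * (beta - 1.0))
```

The beta=0 and beta=1 branches are the limits of the general formula, which is undefined at those two values.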