Basic GANs with a variety of loss functions (KL, reverse-KL, JS, and Wasserstein), built as an exercise for my thesis with Prof. Randy Paffenroth.
Updated Mar 3, 2018 - Jupyter Notebook
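The four losses named above correspond to standard divergences between probability distributions. As a minimal pure-Python sketch (not the repo's PyTorch code), the discrete forms of KL, reverse-KL, and JS, plus the Wasserstein-1 distance computed from CDF differences on a shared 1D support, might look like:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def reverse_kl(p, q):
    """Reverse KL: KL(q || p) -- penalizes p putting mass where q has little."""
    return kl(q, p)

def js(p, q):
    """Jensen-Shannon divergence: symmetric, bounded above by log 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def wasserstein_1d(p, q, support):
    """Wasserstein-1 distance on a sorted 1D support, via |CDF_p - CDF_q|."""
    cdf_p = cdf_q = total = 0.0
    for i in range(len(support) - 1):
        cdf_p += p[i]
        cdf_q += q[i]
        total += abs(cdf_p - cdf_q) * (support[i + 1] - support[i])
    return total
```

In GAN training these appear as the divergence the generator implicitly minimizes; the Wasserstein variant is the one that remains meaningful when the two distributions have disjoint support.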
Sequential KMeans algorithm implementation
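Sequential (MacQueen-style) k-means updates one centroid per incoming point rather than recomputing means over the whole dataset. A rough sketch, assuming points arrive as a stream and each centroid's step size is the reciprocal of the points it has absorbed:

```python
def nearest(x, centroids):
    """Index of the centroid closest to x in squared Euclidean distance."""
    return min(range(len(centroids)),
               key=lambda j: sum((xi - ci) ** 2 for xi, ci in zip(x, centroids[j])))

def sequential_kmeans(points, init):
    """MacQueen-style online k-means: each point moves its nearest centroid
    by a step of size 1 / (number of points that centroid has seen)."""
    centroids = [list(c) for c in init]
    counts = [0] * len(centroids)
    for x in points:
        j = nearest(x, centroids)
        counts[j] += 1
        eta = 1.0 / counts[j]  # shrinking step: centroid becomes a running mean
        centroids[j] = [c + eta * (xi - c) for c, xi in zip(centroids[j], x)]
    return centroids
```

With this step schedule, each centroid ends up at the exact mean of the points assigned to it so far, without storing them.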
PyTorch implementations of the beta divergence loss.
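The beta divergence is a one-parameter family that interpolates between familiar losses: beta = 2 gives (half) the squared Euclidean distance, and the limits beta -> 1 and beta -> 0 recover KL and Itakura-Saito. A plain-Python illustration of the formula (not the repo's PyTorch code), with the two limits handled as special cases:

```python
import math

def beta_divergence(x, y, beta):
    """Beta divergence between two positive sequences, summed elementwise.
    beta=2: squared Euclidean / 2; beta->1: generalized KL; beta->0: Itakura-Saito."""
    if beta == 1:  # KL limit
        return sum(a * math.log(a / b) - a + b for a, b in zip(x, y))
    if beta == 0:  # Itakura-Saito limit
        return sum(a / b - math.log(a / b) - 1 for a, b in zip(x, y))
    return sum((a ** beta + (beta - 1) * b ** beta - beta * a * b ** (beta - 1))
               / (beta * (beta - 1)) for a, b in zip(x, y))
```

Tuning beta lets a single loss trade off sensitivity to large versus small magnitudes, which is why it is popular in NMF and audio modeling.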
A collection of statistics algorithms.
TATTER (Two-sAmple TesT EstimatoR) is a tool for performing two-sample hypothesis tests.
[Python] Comparison of empirical probability distributions. Integral probability metrics (e.g. Kantorovich metric). f-divergences (e.g. Kullback-Leibler). Application to the Choquet integral.
Giant Language Model Test Room, most up-to-date version.
Kullback-Leibler divergence in Python
Builds a corpus whose unit distribution approximately matches a given target distribution, using a greedy algorithm with the Kullback-Leibler divergence. Can be used for text-to-speech synthesis applications.
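A greedy KL-driven corpus builder can be sketched as: repeatedly add the candidate sentence whose inclusion most reduces the KL divergence from the pooled unit counts to the target distribution. The sketch below assumes sentences are given as lists of units (e.g. phonemes) and that the target assigns positive probability to every unit that occurs; both function names are illustrative, not from the repo:

```python
import math
from collections import Counter

def kl_to_target(counts, total, target):
    """KL(empirical || target) of the current selection's unit distribution.
    Units outside the target's support are ignored in this sketch."""
    if total == 0:
        return float("inf")
    d = 0.0
    for unit, t in target.items():
        p = counts.get(unit, 0) / total
        if p > 0:
            d += p * math.log(p / t)
    return d

def greedy_select(sentences, target, k):
    """Greedily pick k sentences whose pooled unit distribution
    best approximates `target` under KL divergence."""
    chosen, counts, total = [], Counter(), 0
    remaining = list(sentences)
    for _ in range(min(k, len(remaining))):
        best_i, best_d = None, float("inf")
        for i, s in enumerate(remaining):
            d = kl_to_target(counts + Counter(s), total + len(s), target)
            if d < best_d:
                best_i, best_d = i, d
        s = remaining.pop(best_i)
        chosen.append(s)
        counts += Counter(s)
        total += len(s)
    return chosen
```

Each round costs one KL evaluation per remaining candidate, so the greedy pass is quadratic in corpus size; in practice incremental score updates make it cheaper.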
Can we identify key events in a war by analyzing raw text from news stories?
A machine learning model that classifies workouts performed in videos and predicts an effectiveness metric for evaluating the performed workout.
Neural Networks, Deep Learning, Computer Vision, Natural Language Processing, Python
💫 Fast Julia implementation of various Kullback-Leibler divergences for 1D parametric distributions. 🏋 Also provides optimized code for kl-UCB indexes
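The kl-UCB index for a Bernoulli bandit arm is the largest plausible mean q such that the observed mean is still KL-consistent with q given the exploration budget. A rough Python illustration (the repo itself provides optimized Julia code), solving for the index by bisection:

```python
import math

def kl_bern(p, q):
    """Bernoulli KL divergence kl(p, q), with clamping for numerical safety."""
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def klucb_index(mean, pulls, t):
    """kl-UCB index: largest q >= mean with pulls * kl(mean, q) <= log(t),
    found by bisection over [mean, 1]."""
    bound = math.log(t) / pulls
    lo, hi = mean, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if kl_bern(mean, mid) <= bound:
            lo = mid  # mid is still plausible; push the index up
        else:
            hi = mid
    return lo
```

As an arm is pulled more, the budget log(t)/pulls shrinks and the index tightens toward the empirical mean, which is what drives exploitation.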
Basic study of information theoretic measures and stochastic processes.
Social choice mechanisms for recommender systems. A report describing the project is included.
Particle Filter tracker and square-shape detection
This repository contains supplementary files for the paper "Sample Size Determination: Posterior Distributions Proximity"
[CVPR 2023] Modeling Inter-Class and Intra-Class Constraints in Novel Class Discovery