mobeets/neural-synth
# Neural music pad

A variational autoencoder (VAE) generates outputs (X) from a continuous latent space (Z) by learning a generative model p(X | Z). Here, we want to generate music controlled by the position of the user's cursor. So we let X(t) be an 88-d binary vector specifying which of the 88 piano notes to play at time t, while Z(t) is set by the mouse position.
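A minimal numpy sketch of this generative step, assuming a 2-D latent space (one dimension per mouse axis) and a single-layer decoder with stand-in random weights (the real model's architecture and trained weights are not shown here):

```python
import numpy as np

N_NOTES = 88    # one entry per piano key
LATENT_DIM = 2  # assumption: one latent dimension per mouse axis

rng = np.random.default_rng(0)
W = rng.normal(size=(LATENT_DIM, N_NOTES))  # stand-in decoder weights
b = rng.normal(size=N_NOTES)

def decode(z, threshold=0.5):
    """Map a latent point z to an 88-d binary note vector."""
    probs = 1.0 / (1.0 + np.exp(-(z @ W + b)))  # sigmoid note probabilities
    return (probs > threshold).astype(int)

z = np.array([0.3, -1.2])  # e.g. derived from the cursor position
notes = decode(z)          # 88-d binary vector: which keys to play
```

The thresholding step turns the decoder's per-note probabilities into a concrete on/off decision for each key.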

## How it works

This variational autoencoder was trained on 382 of Bach's four-part chorales (source), each transposed to C major or A minor. The model was built and trained using Keras. Code for this process can be found here.
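Each time step of a chorale can be encoded as the 88-d binary vector described above. A sketch of that encoding, assuming notes arrive as MIDI pitch numbers (21 = A0 through 108 = C8 span the piano; the repo's actual preprocessing may differ):

```python
# Encode one time step of a chorale as an 88-d binary vector.
LOWEST_KEY = 21  # MIDI pitch of A0, the lowest piano key

def notes_to_vector(midi_pitches):
    """Return an 88-d binary vector with a 1 for each sounding key."""
    vec = [0] * 88
    for p in midi_pitches:
        if LOWEST_KEY <= p <= LOWEST_KEY + 87:  # ignore out-of-range pitches
            vec[p - LOWEST_KEY] = 1
    return vec

# A four-part C-major voicing (C3, G3, E4, C5 as MIDI pitches):
chord_vec = notes_to_vector([48, 55, 64, 72])
```

Stacking these vectors over time gives the piano-roll-style training matrix the VAE learns to reconstruct.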

In the browser, I use p5.js to play sound and handle mouse clicks. In the backend, the model loaded in Keras generates notes from the position clicked on the pad; I then use music21 to detect which chord those notes correspond to.
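The click-to-latent mapping can be as simple as rescaling pixel coordinates into the latent range. A sketch, assuming the pad spans roughly [-2, 2] in each latent dimension (a plausible range for a standard-normal prior; the range the app actually uses is an assumption here):

```python
def click_to_latent(x, y, width, height, z_range=2.0):
    """Map a click at pixel (x, y) on a width x height pad to a 2-D
    latent point, linearly rescaling each axis to [-z_range, z_range]."""
    zx = (x / width) * 2 * z_range - z_range
    zy = (y / height) * 2 * z_range - z_range
    return (zx, zy)

# The center of a 400x400 pad maps to the latent origin:
center = click_to_latent(200, 200, 400, 400)  # (0.0, 0.0)
```

The resulting point is what gets fed to the decoder to produce the 88-d note vector.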

## Running locally

First, install requirements with `pip install -r requirements.txt`. Then run `python app.py` and navigate in your browser to http://0.0.0.0:8080.