TemporalMemoryResearch

A repository of research aimed at understanding the fundamental mechanisms of Numenta's temporal memory algorithm.

Jupic

This repository contains a Julia implementation of Numenta's temporal memory algorithm, as described in pseudo-code here. Minor departures from the pseudo-code, and any ambiguities in it, are noted in the comments.

To use this implementation, clone this repo or download jupic.jl. Include the file at the top of your script via

include("/path/to/jupic.jl")

In your script, you can then initialize a temporal memory model as follows:

num_cols = 512
cells_per_col = 32

tm = TempMem(
    num_cols,
    cells_per_col;

    # Optional keyword arguments
    activation_threshold=15,
    initial_permanence=0.3,
    connected_permanence=0.5,
    learning_threshold=12,
    learning_enabled=true,
    permanence_increment=0.05,
    permanence_decrement=0.001,
    predicted_decrement=0.05,
    synapse_sample_size=50,
    initial_segments_per_cell=0
)
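
All of the keyword arguments are optional, so tm = TempMem(num_cols, cells_per_col) should also work if the defaults suit your use case.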

The temporal memory model accepts the indices of active columns as input. To train on a sequence of characters, we first associate each character with a random subset of columns. The encode function provides an easy way to do this:

encoding_size = 16
sequence = repeat("xyz", 100)

# A dictionary associating each unique character with 16 random columns
encodings = encode(sequence, num_cols, encoding_size)
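
As a quick sanity check, you can confirm that every character received the expected number of columns. This is a minimal sketch that assumes encode returns a dictionary mapping each character to a vector of 1-based column indices:

# Assumes encodings maps each Char to a Vector of column indices in 1:num_cols
for (char, cols) in encodings
    @assert length(unique(cols)) == encoding_size
    @assert all(1 .<= cols .<= num_cols)
end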

To train the temporal memory model, simply expose it to the characters in order.

epochs = 10
for i in 1:epochs
    for char in sequence
        # Present the active columns for this character
        update!(tm, encodings[char])
    end
end

Use the function predicted_columns(tm) to check what the algorithm expects to see next.

If training is successful, the model should predict every column associated with 'x', the character that follows the final 'z' in the sequence:

length(intersect(predicted_columns(tm), encodings['x'])) == encoding_size
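
For a fuller check, you can step the trained model through another cycle of the sequence and compare its predictions at each step. The sketch below uses only the functions shown above (update!, predicted_columns, and the encodings dictionary); the printed summary is illustrative:

# Step through one more cycle of "xyz" and report, at each step, how many
# of the next character's columns the model predicts
test_sequence = "xyzxyz"
for i in 1:length(test_sequence)-1
    update!(tm, encodings[test_sequence[i]])
    next_char = test_sequence[i + 1]
    hits = length(intersect(predicted_columns(tm), encodings[next_char]))
    println("after '$(test_sequence[i])': $hits / $encoding_size columns of '$next_char' predicted")
end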

Notebooks

This repository also contains Jupyter notebooks that document our exploration of the temporal memory algorithm. They are as follows:

  1. LearningJuliaHTM.ipynb Exploring the Julia package HierarchicalTemporalMemory, which was hard to use because its Unicode variable names didn't render.

  2. GhosalHTM.ipynb Learning about Dipak Ghosal's Python implementation of the temporal memory algorithm.

  3. TMAsSimpleRandomProj.ipynb Investigating the idea that the temporal memory algorithm is essentially a random projection with optimized decision thresholds.

  4. TMThreshExplore.ipynb Understanding the distributions and how the temporal memory network might perform a random projection.

  5. TMThreshModel.ipynb Creation of a model that focuses directly on random projection and attempts to optimize the threshold rather than the synapse permanences.

  6. TMThresholdOpt.ipynb Optimizing the thresholding algorithm for maximum speed.

  7. TMThreshTrain.ipynb Training the threshold algorithm and failing.

  8. TMThreshTrainLonger.ipynb Training the threshold algorithm for longer and still failing. Deciding that this algorithm doesn't work.

  9. JuliaHTMRunTests.ipynb Attempting to understand HierarchicalTemporalMemory by taking apart and running its test suite.

  10. JuliaNumentaTemporalMemory.ipynb Implementing the Nupic pseudo-code ourselves.

  11. GhosalHTMExperiments.ipynb Learning how Dipak Ghosal's code performs synapse reconnections.

  12. HigherOrderSeq.ipynb Studying how the temporal memory algorithm captures context.

  13. HigherOrderSeq2.ipynb Hypothesizing that the temporal memory model is like a massive markov chain with many possible states corresponding to each character.

  14. RepresentationCapacity.ipynb Trying to find the limit on the number of representations of a character that the algorithm can store. Training did not converge, and the number of synapses continued to grow throughout.
