Feature: "attention entropy" #387

Open
kahaaga opened this issue Jan 15, 2024 · 0 comments
kahaaga commented Jan 15, 2024

The "attention entropy" does essentially the following:

  • Given an input time series x, identify the local minima and maxima of x.
  • Count the number of steps between consecutive local extrema, in one of several ways (max-min, min-max, max-max, min-min; we'd model this with a parameter that can take one of these four values).
  • Construct a new time series y consisting of the step counts between extrema (so y is drastically shorter than x in most cases).
  • Use probabilities(::UniqueElements, y) to get probabilities.
  • Plug these probabilities into the Shannon entropy formula (a sketch follows this list).
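
For concreteness, here is a minimal, self-contained sketch of the procedure above in plain Julia, for the max-max variant. It is not tied to the ComplexityMeasures.jl API; `extrema_indices` and `attention_entropy_maxmax` are hypothetical names used only for illustration.

```julia
# Indices of local maxima and minima of `x` (interior points only).
function extrema_indices(x::AbstractVector)
    maxs, mins = Int[], Int[]
    for i in 2:(length(x) - 1)
        if x[i] > x[i-1] && x[i] > x[i+1]
            push!(maxs, i)
        elseif x[i] < x[i-1] && x[i] < x[i+1]
            push!(mins, i)
        end
    end
    return maxs, mins
end

# Shannon entropy of the distribution of step counts between consecutive
# local maxima (the max-max spacing variant).
function attention_entropy_maxmax(x::AbstractVector)
    maxs, _ = extrema_indices(x)
    y = diff(maxs)                 # the shortened "spacing" time series
    isempty(y) && return 0.0       # no spacings -> zero entropy
    counts = Dict{Int,Int}()
    for d in y
        counts[d] = get(counts, d, 0) + 1
    end
    ps = (c / length(y) for c in values(counts))
    return -sum(p * log(p) for p in ps)  # Shannon entropy (nats)
end

attention_entropy_maxmax(randn(1000))  # example usage
```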

This can be implemented as an OutcomeSpace. Maybe MotifSpacing is a good name? The method generalizes to any sort of pattern spacing; it is just a matter of encoding differently. An easy way to do so is to dispatch on MotifSpacing(::Pattern), where Pattern could be MinMaxSpacing, MaxMinSpacing, MaxMaxSpacing, MinMinSpacing, MeanMeanSpacing, MedianMedianSpacing, MedianQuantileSpacing, etc.
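
A rough sketch of what that dispatch design could look like (all of the `Pattern` subtypes and `MotifSpacing` itself are placeholder names from the suggestion above; only `OutcomeSpace` is an existing ComplexityMeasures.jl type):

```julia
using ComplexityMeasures # for the OutcomeSpace abstract type

# Placeholder pattern types; only a few of the suggested variants shown.
abstract type Pattern end
struct MinMaxSpacing <: Pattern end
struct MaxMinSpacing <: Pattern end
struct MaxMaxSpacing <: Pattern end
struct MinMinSpacing <: Pattern end

# The outcome space carries the pattern whose spacings are counted,
# so downstream methods can dispatch on the pattern type.
struct MotifSpacing{P <: Pattern} <: OutcomeSpace
    pattern::P
end
MotifSpacing() = MotifSpacing(MaxMaxSpacing()) # some sensible default
```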

Implementing decode/encode will not be straightforward. However, codify can be implemented: it simply returns the encoded time series y.
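
Something like the following could work, assuming the MotifSpacing design sketched above; `spacings` is a hypothetical helper that computes the step counts between the extrema selected by the pattern:

```julia
# For MaxMaxSpacing, `spacings` would be the diff of the local-maxima
# indices, as in the first sketch above.
function ComplexityMeasures.codify(o::MotifSpacing, x::AbstractVector)
    return spacings(o.pattern, x) # the encoded time series y
end
```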
