Store ParticleState weights as float array #780

Open

mattbrown11 opened this issue Mar 17, 2023 · 1 comment

@mattbrown11 (Contributor) commented:
Following from a Gitter conversation: I'm prototyping a particle filter tracker with a large number of particles, but I'm noticing that various components of the process are unexpectedly slow. I tracked it down to operations on an array of Probability objects (e.g., summing the particle Probability weights) being 1-2 orders of magnitude slower than the same operations on an equivalent array of floats. For now, I'm making versions of components (e.g., predictor/updater) that keep the weights as plain floats (implied direct probabilities, so no support for log=True). But I'm wondering what the best approach would be to maintain compatibility with the greater Stone Soup system while still being reasonably fast.

I am using main from a few weeks ago. In this line

new_weight = np.asfarray(np.log(predicted_state.weight)) + measurement_model.logpdf(

the operation np.log(predicted_state.weight) takes 45 ms when predicted_state.weight is an array of 100,000 Probability objects. But if you first cast that array to float64, the same np.log call takes 159 us, about 283 times faster. I see the same large difference elsewhere, e.g. when calculating the sum of that array.
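
Roughly, the comparison I'm timing looks like this (a standalone sketch; the array contents and repeat counts are arbitrary, and this is not the actual updater code):

import timeit

import numpy as np

from stonesoup.types.numeric import Probability

n = 100_000
float_weights = np.full(n, 1.0 / n)
# Object array of Probability instances, as currently stored on ParticleState.weight
prob_weights = np.array([Probability(w) for w in float_weights])
# Plain float64 view of the same weights
cast_weights = prob_weights.astype(np.float64)

# On the object array, np.log has to call each element's log() method in a
# Python-level loop; on the float64 array it is a single vectorised ufunc call.
t_prob = timeit.timeit(lambda: np.log(prob_weights), number=10) / 10
t_float = timeit.timeit(lambda: np.log(cast_weights), number=100) / 100
print(f"Probability array: {t_prob * 1e3:.1f} ms, float64 array: {t_float * 1e6:.1f} us")

np.sum shows the same pattern, since on the object array it also falls back to element-by-element Python arithmetic.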

One option: instead of storing the ParticleState weight as an array of Probability objects, you could define a ParticleWeight type. A rough sketch:

from functools import cached_property

import numpy as np

from stonesoup.base import Property
from stonesoup.types.base import Type


class ParticleWeight(Type):
    """
    Weight(s) associated with particle(s).
    """

    weight: np.ndarray = Property(doc="Particle weights as direct probabilities")

    @cached_property  # stand-in for whatever caching machinery Stone Soup prefers
    def log(self):
        """Natural log of the particle weights."""
        return np.log(self.weight)

    def sum(self):
        """Sum of the particle weights."""
        return np.sum(self.weight)

    def logsum(self):
        """Sum of the log weights."""
        return np.sum(self.log)

    @classmethod
    def from_log(cls, log_weight):
        """Build from log weights, caching them so log is never recomputed."""
        weight = np.exp(log_weight)
        self = cls(weight)
        self.__dict__['log'] = log_weight  # pre-populate the cached property
        return self

I'm still trying to understand the machinery of Base and clearable_cached_property in terms of what I want to do here. But you'd want the ability to create a ParticleWeight either from a float array of probabilities or from an array of log probabilities, without any redundant calculations. I didn't show it above, but you'd also want to be able to calculate the probabilities on demand from the stored log probabilities, or vice versa, and have those conversions cached. It's unclear to me how to use the existing machinery to define such a circular dependence, caching-wise.
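
To illustrate the two-way caching I have in mind, here's a plain-Python toy using functools.cached_property rather than the Stone Soup machinery (the class and argument names are just placeholders):

from functools import cached_property

import numpy as np


class TwoWayWeight:
    """Toy container: stores whichever representation it was given and
    lazily derives (and caches) the other one on first access."""

    def __init__(self, weight=None, log_weight=None):
        if (weight is None) == (log_weight is None):
            raise ValueError("Provide exactly one of weight or log_weight")
        # Pre-populate the cache for whichever form was supplied. Because
        # cached_property is a non-data descriptor, an entry in the instance
        # __dict__ shadows it, so the property body only runs for the form
        # that was *not* given, and only if it is ever requested.
        if weight is not None:
            self.__dict__['weight'] = np.asarray(weight, dtype=np.float64)
        else:
            self.__dict__['log_weight'] = np.asarray(log_weight, dtype=np.float64)

    @cached_property
    def weight(self):
        return np.exp(self.log_weight)

    @cached_property
    def log_weight(self):
        return np.log(self.weight)

Accessing .weight on an instance built from log weights computes the exp once and caches it, and an instance that only ever sees its log form never pays for the conversion at all. The open question is how to express the same thing with Base Properties and clearable_cached_property.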

Alternatively, you could add a log_weight field directly to the ParticleState object, but I'm not sure that is cleaner, and it might make the caching structure uglier.
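
If that route were taken, I imagine it looking roughly like this (a hypothetical type, written standalone rather than as an actual ParticleState change, and assuming clearable_cached_property is importable from stonesoup.base and takes the names of the attributes that should invalidate the cache):

import numpy as np

from stonesoup.base import Property, clearable_cached_property
from stonesoup.types.base import Type


class LogWeightParticleState(Type):
    """Hypothetical state type that stores log weights as a plain float array."""

    state_vector: np.ndarray = Property(doc="Particle state vectors")
    log_weight: np.ndarray = Property(doc="Natural log of the particle weights")

    @clearable_cached_property('log_weight')
    def weight(self):
        # Derived lazily; the cached value is cleared if log_weight is reassigned.
        return np.exp(self.log_weight)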

@mattbrown11 (Contributor, Author) commented:

I'm taking a look at the recent branch https://github.com/dstl/Stone-Soup/tree/log_weight_particle_state as a potential solution.
