
Softening prescription for numerical stability might be wrong #47

Open
le-zouave opened this issue Jun 27, 2023 · 0 comments
Labels: bug (Something isn't working), invalid (This doesn't seem right)

Comments

@le-zouave
Collaborator

To avoid dividing by very small numbers, a lot of methods in the lens classes do something like this:

th = (x ** 2 + y ** 2).sqrt() + self.s

where self.s is a softening parameter with a typical default of 0.001. This differs from lenstronomy, which does this instead:

th = np.maximum(th, self.s)

Our current approach shifts all radial coordinates by self.s, not just the coordinates close to the lens center that would actually cause numerical instability. This biases our implementation relative to lenstronomy by a global offset applied to our radial coordinates.

I suggest that we use torch.maximum(th, self.s) as a safeguard against very small radii.
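To illustrate the difference between the two prescriptions, here is a minimal numpy sketch (numpy rather than torch for brevity; the hypothetical radii and the value of s are illustrative, not taken from the codebase):

```python
import numpy as np

# Two illustrative radii: one dangerously close to the lens center, one far away.
r = np.array([1e-6, 1.0])
s = 0.001  # softening parameter (typical default from the issue)

# Current additive prescription: shifts *every* radius by s,
# biasing even well-behaved coordinates far from the center.
th_add = r + s          # the far radius becomes 1.001

# Proposed clamp: only radii below s are modified;
# coordinates far from the center pass through unchanged.
th_max = np.maximum(r, s)  # the far radius stays exactly 1.0
```

The clamp leaves every radius above the threshold untouched, so the safeguard only acts where a small denominator would otherwise blow up.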

le-zouave added the bug (Something isn't working) and invalid (This doesn't seem right) labels on Jun 27, 2023