
Add interp_scale to parameter scaling #4218

Open
wants to merge 8 commits into base: main
Conversation

QRemy
Contributor

@QRemy QRemy commented Nov 25, 2022

Use the interpolation scale given by Parameter.interp in the transform from the parameter value to the factor seen by the optimizer, as proposed in #4027.
Not sure how to combine this with the scale method, for now I apply interp_scale first and then the scale method.
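The proposed mapping can be sketched as follows. This is an illustrative stand-alone sketch, not the actual gammapy implementation; the function names and the `INTERP` table are made up for the example:

```python
import numpy as np

# Illustrative sketch of the proposed transform chain (not gammapy code):
# the value is first mapped through the interpolation scale, then divided
# by the usual `scale`, giving the factor seen by the optimizer.

INTERP = {
    "lin": (lambda x: x, lambda x: x),
    "sqrt": (np.sqrt, np.square),
    "log": (np.log, np.exp),
}

def value_to_factor(value, interp="lin", scale=1.0):
    """Parameter value -> factor seen by the optimizer."""
    forward, _ = INTERP[interp]
    return forward(value) / scale

def factor_to_value(factor, interp="lin", scale=1.0):
    """Factor seen by the optimizer -> parameter value."""
    _, inverse = INTERP[interp]
    return inverse(factor * scale)

# Round trip for a log-interpolated, amplitude-like parameter:
factor = value_to_factor(1e-12, interp="log")
value = factor_to_value(factor, interp="log")
```

With `interp="log"` the optimizer then works on a quantity of order ten rather than 1e-12, which is the point of the change.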

@QRemy QRemy added the feature label Nov 25, 2022
@QRemy QRemy marked this pull request as ready for review December 8, 2022 15:01
@adonath
Member

adonath commented Dec 9, 2022

Thanks @QRemy! Should we deprecate the "old scaling" methods? @registerrier

@QRemy
Contributor Author

QRemy commented Dec 9, 2022

For now they are still applied after the scaling from the `.interp` argument; I am not sure that makes sense. I would prefer to keep them, even if they end up being used as alternatives rather than chained, so we can still try to reproduce previous results.
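The ordering being discussed can be sketched like this. It is illustrative only; `scale10` below is a plausible stand-in for the existing scale method (pick a power of ten so the factor is of order one), not the exact gammapy code:

```python
import numpy as np

# Sketch of the ordering discussed above (not gammapy code): the interp
# transform is applied first, then a legacy-style scale method.

def scale10(x):
    """Divide by a power of ten so the factor is of order one."""
    power = 10.0 ** np.floor(np.log10(abs(x)))
    return x / power, power  # (factor, scale)

value = 1e-12
u = np.log(value)           # interp="log" applied first
factor, scale = scale10(u)  # then the scale method
```

Chaining both keeps the optimizer's factor well conditioned while reusing the existing scale bookkeeping.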

@adonath
Member

adonath commented Jan 25, 2023

@QRemy I think we never really discussed this PR in our dev call, would you mind setting it on the agenda for this Friday? https://github.com/gammapy/gammapy-meetings/tree/master/dev-meetings/2023/2023-01-27 Thanks!

@QRemy
Contributor Author

QRemy commented Feb 8, 2023

For backward compatibility I added a scale_interp="lin" argument instead of using interp directly, so the transform is done correctly for previously written models.

@codecov

codecov bot commented Feb 8, 2023

Codecov Report

Attention: Patch coverage is 97.50000% with 1 line in your changes missing coverage. Please review.

Project coverage is 94.82%. Comparing base (02c45f9) to head (34ed762).
Report is 874 commits behind head on main.

Current head 34ed762 differs from pull request most recent head 34ced3a

Please upload reports for the commit 34ced3a to get more accurate results.

Files Patch % Lines
gammapy/modeling/parameter.py 97.50% 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #4218       +/-   ##
===========================================
+ Coverage   75.71%   94.82%   +19.10%     
===========================================
  Files         228      215       -13     
  Lines       33778    30361     -3417     
===========================================
+ Hits        25576    28790     +3214     
+ Misses       8202     1571     -6631     


gammapy/modeling/parameter.py (outdated, resolved)
@registerrier registerrier added this to the 1.1 milestone Apr 21, 2023
@adonath adonath modified the milestones: 1.1, 1.2 Apr 28, 2023
@adonath
Member

adonath commented Apr 28, 2023

Two things:

  • e.g. log scaling is an implicit prior and we are not sure how this relates to PIG 16 - Model Priors API #4381
  • we would need propagation of error through the transform via transform of random variables...
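The implicit-prior point can be made concrete with a small simulation (illustrative only, not gammapy code):

```python
import numpy as np

# Why log scaling acts as an implicit prior (sketch): if the sampler or
# optimizer treats the log-factor u = log(x) as uniform, the induced
# density on x carries the Jacobian |du/dx| = 1/x, i.e. a flat prior
# on u is a 1/x prior on x.

rng = np.random.default_rng(42)
u = rng.uniform(np.log(1.0), np.log(100.0), size=200_000)  # flat in log
x = np.exp(u)

# Each decade gets the same probability mass, although the linear widths
# of [1, 10) and [10, 100] differ by an order of magnitude:
frac_low = np.mean((x >= 1) & (x < 10))
frac_high = np.mean(x >= 10)
```

Both fractions come out close to 0.5, which is exactly the 1/x weighting a linear-scale fit would not impose.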

@bkhelifi
Member

@QRemy what is the status of this work?

gammapy/modeling/parameter.py (outdated, resolved)
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
QRemy and others added 2 commits December 20, 2023 18:05
Signed-off-by: Quentin Remy <quentin.remy@mpi-hd.mpg.de>
@registerrier registerrier modified the milestones: 1.2, 1.3 Feb 2, 2024
@Astro-Kirsty
Member

@QRemy, what is the status here?
Once rebased, will it be ready for review?

@QRemy
Contributor Author

QRemy commented May 7, 2024

The error propagation for the transform is still missing.

Member

@bkhelifi bkhelifi left a comment

Thanks a lot @QRemy. It seems more coherent now. I have just put some suggestions on the docstrings.

@@ -97,6 +97,8 @@ class Parameter:
Whether the parameter represents the flux norm of the model.
prior : `~gammapy.modeling.models.Prior`
Prior set on the parameter.
scale_interp : {"lin", "sqrt", "log"}
Member

Can you take advantage of this PR to add all the default values in the docstrings? Thanks!

gammapy/modeling/parameter.py (outdated, resolved)
gammapy/modeling/parameter.py (outdated, resolved)
self.factor, self.scale = 1, self.value
self._scale = value

def transform(self, value, update_scale=False):
Member

What exactly does this function do? Do not forget to document the default value of the argument.

self.update_scale(factor)
return factor / self.scale

def inverse_transform(self, factor):
Member

What exactly does this function do? A docstring would be great.

gammapy/modeling/parameter.py (outdated, resolved)
@@ -824,12 +859,16 @@ def __init__(
min=np.nan,
max=np.nan,
error=0,
scale_method="scale10",
Member

As this class is exposed, it would be great to add a docstring describing it, with its parameters (and default values).

Co-authored-by: Bruno Khélifi <khelifi@in2p3.fr>
@QRemy
Contributor Author

QRemy commented May 22, 2024

The error propagation for the transform is still missing.

I will try a solution based on
https://hdembinski.github.io/jacobi/reference.html#jacobi.propagate
There is a discussion about integrating it into scipy, so we may be able to drop this extra dependency later.
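As a rough illustration of what `jacobi.propagate` provides (it returns the transformed value and covariance, computing the Jacobian numerically), here is a minimal scalar stand-in using a central finite difference. This is a sketch for intuition, not the jacobi library itself:

```python
import numpy as np

# Minimal scalar stand-in for the kind of call jacobi.propagate offers:
# propagate a variance through a transform by linearization,
# var_y ≈ (dy/dx)^2 * var_x, with the derivative taken numerically.

def propagate(fn, x, var_x, eps=1e-6):
    h = eps * max(abs(x), 1.0)
    jac = (fn(x + h) - fn(x - h)) / (2.0 * h)  # central difference
    return fn(x), jac**2 * var_x

# Example: propagate an error through exp, the inverse of the log
# interp transform (with scale = 1).
y, var_y = propagate(np.exp, 1.0, 0.01)
```

For the real multi-parameter case, jacobi's numerically adaptive Jacobian handles full covariance matrices, which is why the library is attractive here.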
