
zernike.opd_expand_nonorthonormal and zernike.opd_expand_segments give different results for the same inputs #263

Open
kjbrooks opened this issue Nov 6, 2018 · 6 comments

Comments

@kjbrooks (Collaborator) commented Nov 6, 2018

In trying to reconstruct a JWST OPD, I found that the new zernike.opd_expand_segments gave me strange results for segments C3 and C4 (behind the diagonal struts) that did not match the reconstruction I had previously obtained with zernike.opd_expand_nonorthonormal. My understanding is that zernike.opd_expand_segments should not give different results, just be more efficient. If this understanding is incorrect, please let me know.

I have two functions that only differ in using opd_expand_nonorthonormal or opd_expand_segments:

```python
def fit_terms_to_surface(self, aperture, nterms, ap_centroid, ap_radius, iterations=15):
    '''
    Fit WSS Hexikes to a defined surface
    '''
    def get_basis(*args, **kwargs):
        return basis

    rho, theta = self.define_rho_theta(ap_centroid, ap_radius)
    basis = poppy.zernike.hexike_basis_wss(nterms=nterms, npix=self.npix, rho=rho,
                                           theta=theta, aperture=aperture > 0.)
    self.new_coeffs = poppy.zernike.opd_expand_nonorthonormal(self.opd,
                                                              aperture=aperture > 0.,
                                                              nterms=nterms,
                                                              basis=get_basis,
                                                              iterations=iterations)

    self.reconstructed = poppy.zernike.opd_from_zernikes(self.new_coeffs,
                                                         basis=get_basis,
                                                         aperture=aperture > 0.)
    self.reconstructed[~np.isfinite(self.reconstructed)] = 0.0
```

and

```python
def fit_terms_to_segments(self, aperture, nterms, ap_centroid, ap_radius, iterations=15):
    '''
    Fit WSS Hexikes to a segment
    '''
    def get_basis(*args, **kwargs):
        return basis

    rho, theta = self.define_rho_theta(ap_centroid, ap_radius)
    basis = poppy.zernike.hexike_basis_wss(nterms=nterms, npix=self.npix, rho=rho,
                                           theta=theta, aperture=aperture > 0.)
    self.new_coeffs = poppy.zernike.opd_expand_segments(self.opd,
                                                        aperture=aperture > 0.,
                                                        nterms=nterms,
                                                        basis=get_basis,
                                                        iterations=iterations)

    self.reconstructed = poppy.zernike.opd_from_zernikes(self.new_coeffs,
                                                         basis=get_basis,
                                                         aperture=aperture > 0.)
    self.reconstructed[~np.isfinite(self.reconstructed)] = 0.0
```

These functions give different results. The same display stretch is used in both figures below.
Using poppy.zernike.opd_expand_nonorthonormal in the function fit_terms_to_surface above:
[screenshot]
Using poppy.zernike.opd_expand_segments in the function fit_terms_to_segments above:
[screenshot]
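For reference, here is a minimal sketch of the head-to-head comparison I am describing. It assumes opd, aperture, nterms, and the hexike basis are already set up as in the functions above; the calls themselves are the same ones used there.

```python
# Sketch only: `opd`, `aperture`, `nterms`, and `basis` are assumed to be
# prepared exactly as in fit_terms_to_surface / fit_terms_to_segments above.
import numpy as np
import poppy

def get_basis(*args, **kwargs):
    return basis  # closure over the precomputed WSS hexike basis

coeffs_nonortho = poppy.zernike.opd_expand_nonorthonormal(
    opd, aperture=aperture > 0., nterms=nterms, basis=get_basis, iterations=15)
coeffs_segments = poppy.zernike.opd_expand_segments(
    opd, aperture=aperture > 0., nterms=nterms, basis=get_basis, iterations=15)

# If the two routines really are equivalent, the coefficients should agree.
print(np.allclose(coeffs_nonortho, coeffs_segments))
```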

@mperrin (Collaborator) commented Nov 8, 2018

@kjbrooks, could you send me or post here a bit more of the notebook setting up your problem case? In particular the above code samples appear to be using a get_basis function defined elsewhere. I’d like to see if I can find time today or tomorrow to start digging into what’s going on here...

@kjbrooks (Collaborator, Author) commented Nov 8, 2018

In the case you mention above, get_basis is defined inside the functions I copied. I have noticed that if that function is defined outside of these functions, there is some hysteresis. I am working on a simplified notebook now and will attach it when I am done.

@mperrin (Collaborator) commented Nov 8, 2018

Aha, I didn’t read the above code closely enough. But I think there may be a problem in having the basis variable defined after the get_basis function! Try flipping the order of those.

@kjbrooks (Collaborator, Author) commented Nov 8, 2018

I think I have identified at least part of the issue. In creating a simplified notebook I discovered that my results (while still different between opd_expand_nonorthonormal and opd_expand_segments) were different from those in the plots I posted above. Part of this was a notebook issue (read: human error): I thought I had re-run the cells, but either I forgot or something else happened. What I have found is that the residual strength in C3 and C4 in the second set of plots above is due to the number of iterations. I had been keeping the default iterations for opd_expand_segments, which is 2; when creating the plots above I changed it to 15 to match what I use for opd_expand_nonorthonormal, but that change did not get propagated, so it appeared to me that it had made no difference. The best solution now looks to be using iterations=15 for opd_expand_segments to get a better fit. However, I want to reiterate that the results from opd_expand_nonorthonormal and opd_expand_segments are still different. Here is my simplified notebook, where I have set the default iterations to the poppy default and shown a couple of examples of different iteration counts for the decomposition of the same OPD, using both methods.

I have also moved the get_basis function, though that hasn't seemed to make much of a difference.
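For what it's worth, that seems consistent with how Python closures work: get_basis looks up basis when it is called, not when it is defined, so the order of the two statements should not matter as long as basis is assigned before the expansion routine calls get_basis. A minimal, self-contained illustration:

```python
# Late-binding closure: the name `basis` is resolved when get_basis is *called*.
def get_basis(*args, **kwargs):
    return basis

basis = [1, 2, 3]      # assigned after the def, but before any call
print(get_basis())     # prints [1, 2, 3]
```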

@mperrin (Collaborator) commented Nov 8, 2018

The default iterations stuff bothers me; I don't know why it needs to be so high. Need to figure out a way to make this a more automatic process, where for instance you give it a desired tolerance and it automatically iterates until it achieves that tolerance. I'll take a look at the notebook, thanks.
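Roughly what I have in mind is something like the sketch below; fit_once and reconstruct are placeholder names for a single projection step and an opd_from_zernikes-style reconstruction, not existing poppy API.

```python
# Hypothetical tolerance-driven loop: keep refitting the residual and
# accumulating coefficients until the residual RMS over the aperture
# drops below a requested tolerance.
import numpy as np

def expand_to_tolerance(opd, aperture, fit_once, reconstruct, tol=1e-10, max_iter=50):
    mask = aperture > 0
    coeffs = None
    residual = opd.copy()
    for _ in range(max_iter):
        step = fit_once(residual)                # coefficients fit to the current residual
        coeffs = step if coeffs is None else coeffs + step
        residual = opd - reconstruct(coeffs)     # what the accumulated fit fails to capture
        residual[~np.isfinite(residual)] = 0.0
        if np.sqrt(np.mean(residual[mask] ** 2)) < tol:
            break
    return coeffs
```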

@kjbrooks (Collaborator, Author) commented Nov 8, 2018

And here is grab_opd_info.py since that is necessary to run the notebook.
