
Write a test for PModel environment to check for any abrupt outputs #191

Open
surbhigoel77 opened this issue Mar 7, 2024 · 1 comment
surbhigoel77 commented Mar 7, 2024

Fixed in #153

The PModel environment module takes in environmental forcing variables (tc, patm, co2, vpd) and produces photosynthesis variables (ca, kmm, gammastar, ns_star). The forcing variable values are subject to hard bounds, and inputs within those bounds should ideally never generate photosynthesis variables that fall outside their own expected ranges.

We need to check whether any valid forcing variable values produce out-of-bound outputs.
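A test along these lines could assert that every output array stays within its expected range. A minimal sketch of the assertion pattern — the helper name and the bound values here are placeholders, not pyrealm's real bounds:

```python
import numpy as np

def assert_within_bounds(values, lower, upper, name):
    """Raise AssertionError if any value falls outside [lower, upper]."""
    out_of_bounds = (values < lower) | (values > upper)
    if np.any(out_of_bounds):
        raise AssertionError(
            f"{name}: {out_of_bounds.sum()} values outside [{lower}, {upper}]"
        )

# Example: gammastar values produced from valid forcings should sit in a
# plausible range (these bounds are illustrative only).
gammastar = np.array([2.5, 4.1, 6.8])
assert_within_bounds(gammastar, 0.0, 100.0, "gammastar")
```

The same helper would be applied to each of ca, kmm, gammastar and ns_star with their own ranges.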

@surbhigoel77 surbhigoel77 self-assigned this Mar 7, 2024
@surbhigoel77 surbhigoel77 mentioned this issue Mar 22, 2024
@davidorme
The co2 to ca conversion is very straightforward, while kmm, gammastar and ns_star are complex functions of tc and patm only. vpd is passed straight through to the OptimalChiABC subclasses. So using np.meshgrid to generate inputs covering combinations of tc and patm should do it. I think it's very likely that the random within-bounds values of the forcing variables selected in the benchmarking data would fill that grid space pretty well, but this would be more explicit.

Does that sound right?
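The meshgrid approach could look something like the sketch below. The bound values are hypothetical stand-ins for the real hard bounds, and the commented-out pyrealm call marks where the grids would be fed into the environment:

```python
import numpy as np

# Hypothetical bounds for the two forcings that drive kmm, gammastar and
# ns_star (illustrative only; the real bounds live in the pyrealm code).
TC_BOUNDS = (-25.0, 80.0)          # air temperature, °C
PATM_BOUNDS = (30000.0, 110000.0)  # atmospheric pressure, Pa

# Build grids covering all pairwise combinations of tc and patm.
tc_vals = np.linspace(*TC_BOUNDS, num=50)
patm_vals = np.linspace(*PATM_BOUNDS, num=50)
tc_grid, patm_grid = np.meshgrid(tc_vals, patm_vals)

# In the real test these grids would be passed into the environment and
# each output checked against its own expected range, e.g.:
# env = PModelEnvironment(tc=tc_grid, patm=patm_grid, co2=..., vpd=...)
# assert np.all((env.kmm >= KMM_MIN) & (env.kmm <= KMM_MAX))
```

Because ca is a simple function of co2 and vpd is passed through untouched, those two could be held at a few fixed valid values while the tc/patm grid does the work.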
