Comments on JOSSpaper/paper.md #83

Open

zhenwu0728 opened this issue Nov 8, 2021 · 4 comments

zhenwu0728 commented Nov 8, 2021

Comments for openjournals/joss-reviews#3814

I've worked through the "ideal age" and "PO4-POP" examples, and they are well documented and informative.
I only have a few minor comments for the package and the paper.

  • The examples are great, but Binder is not working properly (I tried a few times to launch Binder and it crashed while running the model, possibly because of a memory limit). I suggest the authors use Pluto.jl and PlutoSliderServer.jl to host the examples instead (see the sketch after this list).
  • A better show() is needed for OCIM2.load(), F_and_∇ₓF, etc. Now it's hard to get any useful information from the output.
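
For instance, something along these lines could host the rendered examples (a minimal sketch only; it assumes the examples are converted to Pluto notebooks collected in a notebooks/ folder, and uses PlutoSliderServer's run_directory entry point):

    # Sketch: serve a folder of Pluto notebooks with PlutoSliderServer.
    # Assumes the examples have been converted to Pluto notebooks in "notebooks/".
    using PlutoSliderServer
    PlutoSliderServer.run_directory("notebooks")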

For paper.md:

  • L11-15: It might be good to provide one or two examples for both simple and complicated models.
  • L16: It would be good to point out that AIBECS.jl only resolves steady states.
  • L31: It is worth mentioning that minimizing model-observation mismatches may cause overfitting.
  • L73: Taburet_etal_2019 citation is broken.
  • L102-119: I think there are too many citations; one or two for each kind of model would be enough.
@briochemc

Thank you @zhenwu0728 for your comments. Sorry for the delayed response!

I have a question regarding the "better show" comment:

  • A better show() is needed for OCIM2.load(), F_and_∇ₓF, etc. Now it's hard to get any useful information from the output.

I am not sure what I should change. Here is my train of thought.

  1. Both OCIM2.load and F_and_∇ₓF are functions, so they have the generic show method, e.g.:

    julia> OCIM2.load
    load (generic function with 1 method)
  2. I guess you did not mean better docstrings, since these already exist (shown below), and I assume you would have said something more specific about them if you did?

    help?> OCIM2.load
      load
    
    
      Returns the grid, the transport matrix, and the He fluxes (in that order).
    
      │ Tip
      │
      │  To load the default OCIM2 matrix and grid, do
      │
      │  julia> grd, T = OCIM2.load()
      │
      │
      │  But you can also load the other matrices by specifying which version you want, e.g.,
      │
      │  julia> grd, T = OCIM2.load(version="KiHIGH_noHe")
      │
      │
      │  See DeVries and Holzer (2019) for more details

    and

    help?> F_and_∇ₓF
    "F_and_∇ₓF" can be typed by F_and_\nabla<tab>\_x<tab>F
    
    search: F_and_∇ₓF f_and_∇ₓf
    
      F, ∇ₓF = state_function_and_Jacobian(Ts, Gs, nb)
    
    
      Returns the state function F and its jacobian, ∇ₓF.
    
      F, ∇ₓF = state_function_and_Jacobian(T, Gs, nb)
    
    
      Returns the state function F and its jacobian, ∇ₓF (with all tracers  transported by single T).
  3. So maybe the issue is the output itself? For grd, T = OCIM2.load(), the output is a tuple containing grd, which has a custom show:

    julia> grd
    OceanGrid of size 91×180×24 (lat×lon×depth)

    and T, which shows as a standard sparse matrix:

    julia> T
    200160×200160 SparseMatrixCSC{Float64, Int64} with 3018260 stored entries:
    ⣿⣿⣿⣦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
    ⠻⣿⣿⣿⣿⣆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
    ⠀⠈⠻⢿⣿⣿⣷⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
    ⠀⠀⠀⠀⠙⢿⣿⣿⣷⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
    ⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿⣷⣄⠀⠀⠀⠀⠀⠀⠀⠀
    ⠀⠀⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿⣷⣄⠀⠀⠀⠀⠀⠀
    ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿⣷⣄⠀⠀⠀⠀
    ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿⣷⣄⠀⠀
    ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿⣧⡀
    ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠻⢿⣷

    Thus maybe the issue for OCIM2.load() is actually the extra outputs (which are not always assigned to a variable)? These include the He fluxes as unitful vectors (as explained in the docstring I pasted above). Below I assign all the outputs of OCIM2.load(), and I am guessing that the issue might be these two unitful vectors, which get printed inline inside the tuple of outputs (see the very long line of output below).

    julia> a, b, c, d = OCIM2.load()
    ┌ Warning: Over-writing registration of the datadep
    │   name = "AIBECS-OCIM2_CTL_He"
    └ @ DataDeps ~/.julia/packages/DataDeps/ooWXe/src/registration.jl:15
    ┌ Info: You are about to use the OCIM2_CTL_He model.
    │ If you use it for research, please cite:
    │
    │ - DeVries, T., & Holzer, M. (2019). Radiocarbon and helium isotope constraints on deep ocean ventilation and mantle‐³He sources. Journal of Geophysical Research: Oceans, 124, 3036–3057. https://doi.org/10.1029/2018JC014716
    │
    │ You can find the corresponding BibTeX entries in the CITATION.bib file
    │ at the root of the AIBECS.jl package repository.
    └ (Look for the "DeVries_Holzer_2019" key.)
    (OceanGrid of size 91×180×24 (lat×lon×depth)
    ,
    ⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
    ⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
    ⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
    ⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
    ⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷, Quantity{Float64, 𝐍 𝐋⁻³ 𝐓⁻¹, Unitful.FreeUnits{(m⁻³, mol, yr⁻¹), 𝐍 𝐋⁻³ 𝐓⁻¹, nothing}}[1.82368935574118e-11 mol m⁻³ yr⁻¹, 1.9469772194929453e-11 mol m⁻³ yr⁻¹, 3.4879359939844136e-13 mol m⁻³ yr⁻¹, 1.5360041759458074e-12 mol m⁻³ yr⁻¹, 1.47673234957578e-12 mol m⁻³ yr⁻¹, 5.300237184679614e-11 mol m⁻³ yr⁻¹, 8.595855761423191e-11 mol m⁻³ yr⁻¹, 1.4013789381514675e-10 mol m⁻³ yr⁻¹, 2.0592425177423245e-10 mol m⁻³ yr⁻¹, 2.3948293487252974e-10 mol m⁻³ yr⁻¹ … 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹], Quantity{Float64, 𝐍 𝐋⁻³ 𝐓⁻¹, Unitful.FreeUnits{(m⁻³, mol, yr⁻¹), 𝐍 𝐋⁻³ 𝐓⁻¹, nothing}}[1.2487363516886739e-5 mol m⁻³ yr⁻¹, 1.333155354686199e-5 mol m⁻³ yr⁻¹, 2.388297356860729e-7 mol m⁻³ yr⁻¹, 1.0517494357308457e-6 mol m⁻³ yr⁻¹, 1.0111641880370838e-6 mol m⁻³ yr⁻¹, 3.629235880686242e-5 mol m⁻³ yr⁻¹, 5.8858475701302695e-5 mol m⁻³ yr⁻¹, 9.595673830367888e-5 mol m⁻³ yr⁻¹, 0.0001410026867104604 mol m⁻³ yr⁻¹, 0.00016398135211074882 mol m⁻³ yr⁻¹ … 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹, 0.0 mol m⁻³ yr⁻¹])
    
    julia> c
    200160-element Vector{Quantity{Float64, 𝐍 𝐋⁻³ 𝐓⁻¹, Unitful.FreeUnits{(m⁻³, mol, yr⁻¹), 𝐍 𝐋⁻³ 𝐓⁻¹, nothing}}}:
       1.82368935574118e-11 mol m⁻³ yr⁻¹
     1.9469772194929453e-11 mol m⁻³ yr⁻¹
                           ⋮
                        0.0 mol m⁻³ yr⁻¹
                        0.0 mol m⁻³ yr⁻¹

    julia> d
    200160-element Vector{Quantity{Float64, 𝐍 𝐋⁻³ 𝐓⁻¹, Unitful.FreeUnits{(m⁻³, mol, yr⁻¹), 𝐍 𝐋⁻³ 𝐓⁻¹, nothing}}}:
     1.2487363516886739e-5 mol m⁻³ yr⁻¹
      1.333155354686199e-5 mol m⁻³ yr⁻¹
                          ⋮
                       0.0 mol m⁻³ yr⁻¹
                       0.0 mol m⁻³ yr⁻¹
  4. And finally, I am guessing the concerns for F_and_∇ₓF are the same. F and ∇ₓF are SciML functions and borrow the SciML ecosystem's show methods.

    julia> F # customized SciML show method
    (::AIBECS.var"#F#372"{SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}}) (generic function with 1 method)
    
    julia> ∇ₓF # customized SciML show method
    (::AIBECS.var"#∇ₓF#371"{SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}}) (generic function with 1 method)

    It is quite a bit obfuscated, but I do not think I should replace it with my own. The problem really is when these SciML functions are lumped into a tuple: then they show their "insides" and clutter the output:

    julia> F, ∇ₓF # cluttered show inside a tuple
    (AIBECS.var"#F#372"{SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}}(SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}(AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}((AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}(
    ⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
    ⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
    ⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
    ⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
    ⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷),), [1], AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}((G,), AIBECS.var"#tracers#69"{Int64, Int64}(200160, 1)), AIBECS.var"#tracer#70"{Int64, Int64}(200160, 1)), LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}(AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}((AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}(
    ⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
    ⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
    ⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
    ⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
    ⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷),), 1, [1]), AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}((G,), 200160, 1)), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, nothing)), AIBECS.var"#∇ₓF#371"{SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}}(SciMLBase.ODEFunction{false, AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}(AIBECS.var"#f#73"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}, AIBECS.var"#tracer#70"{Int64, Int64}}((AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}(
    ⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
    ⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
    ⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
    ⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
    ⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷),), [1], AIBECS.var"#G#71"{Tuple{typeof(G)}, AIBECS.var"#tracers#69"{Int64, Int64}}((G,), AIBECS.var"#tracers#69"{Int64, Int64}(200160, 1)), AIBECS.var"#tracer#70"{Int64, Int64}(200160, 1)), LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, AIBECS.var"#jac#78"{AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}}(AIBECS.var"#T#75"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}((AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}(
    ⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
    ⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
    ⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
    ⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
    ⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷),), 1, [1]), AIBECS.var"#∇ₓG#74"{Tuple{typeof(G)}, Int64, Int64}((G,), 200160, 1)), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, nothing)))

TLDR: Could you clarify or give more details about what you think should be changed in response to the "better show" comment?

@briochemc

I gave a bit of thought to the issue with F_and_∇ₓF and I think I have a decent solution: there is AIBECSFunction, which one can use instead and which returns a SciMLBase.ODEFunction. I'm writing an update, with which one would do

julia> F = AIBECSFunction(T, G)
(::SciMLBase.ODEFunction{false, AIBECS.var"#f#438"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Vector{Int64}, AIBECS.var"#G#436"{Tuple{typeof(G)}, AIBECS.var"#tracers#434"{Int64, Int64}}, AIBECS.var"#tracer#435"{Int64, Int64}}, LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, AIBECS.var"#jac#443"{AIBECS.var"#T#440"{Tuple{AIBECS.var"#51#52"{SparseMatrixCSC{Float64, Int64}}}, Int64, Vector{Int64}}, AIBECS.var"#∇ₓG#439"{Tuple{typeof(G)}, Int64, Int64}}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing}) (generic function with 7 methods)

and then this F can be called directly to get the tendencies:

julia> F(x,p)
200160-element Vector{Float64}:
 1.0
 1.0
 ⋮
 1.0
 1.0

The Jacobian can be accessed using the SciML syntax:

julia> F(Val{:jac},x,p)
200160×200160 SparseMatrixCSC{Float64, Int64} with 3018260 stored entries:
⢿⣷⣄⠀⠀⠀⠀⠀⠀⠀
⠀⠙⢿⣷⣄⠀⠀⠀⠀⠀
⠀⠀⠀⠙⢿⣷⣄⠀⠀⠀
⠀⠀⠀⠀⠀⠙⢿⣷⣄⠀
⠀⠀⠀⠀⠀⠀⠀⠙⢿⣷

But I don't need to discuss the Jacobian or the solvers in the tutorials at all. So in that sense, having just F = AIBECSFunction(T, G) and letting the user call F(x,p) seems friendlier. (Of course, AIBECS can and will access the Jacobian under the hood to solve the problem, without bothering the user with the implementation details.)

The steady-state problem is then set up with only F (which is also simpler: from the user's point of view, there is no need to write ∇ₓF at all):

julia> prob = SteadyStateProblem(F, x, p)
SteadyStateProblem with uType Vector{Float64}. In-place: false
u0: 200160-element Vector{Float64}:
 0.0
 0.0
 ⋮
 0.0
 0.0
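
For completeness, the whole user-facing workflow would then boil down to a few lines (a sketch assuming T, G, x, and p are defined as in the ideal-age example above, and using CTKAlg(), the quasi-Newton algorithm that AIBECS uses for steady-state problems):

julia> F = AIBECSFunction(T, G)           # state function (Jacobian handled internally)

julia> prob = SteadyStateProblem(F, x, p) # steady-state problem

julia> s = solve(prob, CTKAlg()).u        # steady-state solution vector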

Would that be better in your opinion?

@briochemc

@zhenwu0728 FYI, there is now an ideal-age tutorial deployed from the linked PR which shows how this looks when using AIBECSFunction instead of F_and_∇ₓF. Does that seem like a good solution for the output of F_and_∇ₓF?

@zhenwu0728

Sorry @briochemc, I didn't make it clear enough.
I realize that the outputs (OceanGrid, SparseMatrix, etc.) are not types defined in AIBECS.jl, so you don't have control over how they are shown.

The AIBECSFunction is definitely better.

But grd, TOCIM2 = OCIM2.load() still doesn't show the OceanGrid information in the example from your last comment.

Thus maybe the issue for OCIM2.load() is actually the extra outputs (which are not always assigned to a variable)? These include the He fluxes as unitful vectors (as explained in the docstring I pasted above). Below I assign all the outputs of OCIM2.load(), and I am guessing that the issue might be these two unitful vectors, which get printed inline inside the tuple of outputs (see the very long line of output below).

Yes, if the extra outputs are not used, could they be removed from the return?
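
For example, something along these lines (a purely hypothetical sketch, not the current API; _load_raw stands in for whatever internal code already builds the grid, matrix, and He-flux vectors):

    # Hypothetical sketch only: return (grd, T) by default and make the He fluxes opt-in.
    function load(; version="CTL_He", He_fluxes=false)
        grd, T, s_A_He, s_B_He = _load_raw(version)  # _load_raw is a made-up internal helper
        return He_fluxes ? (grd, T, s_A_He, s_B_He) : (grd, T)
    end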

briochemc reopened this Jun 24, 2022