Documentation? #6

Open
anriseth opened this issue Sep 15, 2016 · 15 comments

Comments

@anriseth
Collaborator

We should create documentation, or at the very least explain the input and output structure for the linesearch algorithms.

@axsk

axsk commented Oct 17, 2017

Maybe for starters just a basic example given an f, x0 and a search direction g?

@anriseth
Collaborator Author

Maybe for starters just a basic example given an f, x0 and a search direction g?

That's a good idea. The examples can be copied from the test files. PRs are welcome ;)

@pkofod
Member

pkofod commented Oct 17, 2017

I should maybe add, @axsk, that while the functionality in here is certainly useful more generally, the package is a helper package for Optim and NLsolve, so some of the API might seem a bit alien if you're not familiar with the codebases in those packages.

@simonbatzner

Any updates on this? It would be great to have documentation on how to use the package outside of Optim and NLsolve.

@anriseth
Collaborator Author

anriseth commented May 13, 2018

We are in the middle of transitioning the code to an API that is friendlier to use outside of Optim. See #90

It should be quite straightforward to introduce some usage examples once we are finished with that.

EDIT: Both @pkofod and I are quite busy with other things this month, but I hope to have some time in June to sort out some missing JuliaNLSolvers things.

@pkofod
Member

pkofod commented May 14, 2018

I just want to chime in: it's actually the right approach to bump :) We're just super busy, and not many people have actually contributed to this package or have a clear idea where it's going, so you'll either have to help us out, or wait a bit longer. As Asbjørn said, we're both facing somewhat hard constraints and deadlines outside of OSS development, but nothing is abandoned.

@longemen3000

I want to contribute, if it's possible. What can I do?

@anriseth
Collaborator Author

Hi @longemen3000, thank you for the interest in helping out :)

NB: I have little knowledge of the current needs of the JuliaNLSolvers group; @pkofod can probably point you in the best direction more efficiently.

Are you particularly interested in contributing to the documentation, or more generally?

If documentation:

  • explanations of things you found difficult when you first started using the package
  • example usages of the package, which could potentially be written with Literate.jl (see the example from "Provide usage example with custom optimization algorithms" #109)
  • improved docstrings, for example explaining the parameters in some of the linesearches

If any type of contribution:

@longemen3000

I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.) and trying to eliminate repeated code.
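
A minimal sketch of the kind of change described, on a hypothetical helper function (not actual package code); for a concrete numeric type T, zero(T), one(T), and the T(...) constructor replace the explicit converts:

# Before: numeric literals built with convert
function midpoint_before(a::T, b::T) where T<:Real
    w = convert(T, 1) / convert(T, 2)
    return convert(T, 0) + w * (a + b)
end

# After: the same computation with zero(T), one(T) and T(2)
function midpoint_after(a::T, b::T) where T<:Real
    w = one(T) / T(2)
    return zero(T) + w * (a + b)
end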

@pkofod
Member

pkofod commented Sep 11, 2019

I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.) and trying to eliminate repeated code.

Looking forward to seeing what you come up with, but keep in mind that someone has to read the diffs/changes, so it's often much easier to get something merged if you stick to one specific change at a time. I'd rather have one PR for the converts, one for the quadratic roots, etc., than one giant PR with many great changes that are difficult to isolate and review.

@pkofod
Member

pkofod commented Sep 12, 2019

I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.) and trying to eliminate repeated code.

Hi @longemen3000, if you're working on this, please reach out to me on Slack, on Discourse, or by e-mail if you can find it. I want to make sure you're not doing anything that will be voided by near-future changes to all of JuliaNLSolvers.

@baggepinnen

I'm trying to figure out how to set a maximum step length when optimizing with Optim, but it's quite hard. The keyword arguments to the different line-search methods do not seem to be documented, and poking around in the code I sometimes find αmax and sometimes alphamax etc.
Example

@with_kw struct HagerZhang{T, Tm}
...
   alphamax::T = Inf
@with_kw struct InitialHagerZhang{T}
...
    αmax::T        = Inf

Should I set one of those?

@anriseth
Collaborator Author

Hi @baggepinnen, is this to use with Optim or with a different package?

The BackTracking line search will only decrease the step size from the initial guess you pass it. If your optimization problem is properly scaled, you can use backtracking with a static initial step length = 1 (or some other step length that you determine to ensure you don't exceed your maximum step length).
If you have to use HagerZhang, then set alphamax.

The initial step length functionality is a pre-processing step; you can see how it is used in the Optim source code. If you need to use the InitialHagerZhang procedure to decide the initial step length, then you should also set αmax.
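
A rough sketch of both routes with Optim, assuming BFGS accepts the alphaguess and linesearch keyword arguments and using the field names quoted above; treat it as a starting point rather than a tested recipe:

using Optim, LineSearches

f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2   # Rosenbrock test function
x0 = [-1.2, 1.0]

# Route 1: backtracking only ever shrinks the step, so a static initial
# step length of 1 also acts as an upper bound on the accepted step length
algo_bt = BFGS(alphaguess = LineSearches.InitialStatic(alpha = 1.0),
               linesearch = LineSearches.BackTracking())
res_bt = optimize(f, x0, algo_bt)

# Route 2: HagerZhang can grow the step, so cap it with alphamax
# (and αmax in InitialHagerZhang if that initial-guess procedure is used)
algo_hz = BFGS(alphaguess = LineSearches.InitialHagerZhang(αmax = 2.0),
               linesearch = LineSearches.HagerZhang(alphamax = 2.0))
res_hz = optimize(f, x0, algo_hz)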

Disclaimer: I haven't used this package / Julia for nearly two years.

@pkofod
Member

pkofod commented Jun 11, 2020

Should I set one of those?

There's no uniform interface across all options here I'm afraid, but yes I think alphamax is what you want. But to repeat Asbjørn's question, what are you doing more specifically?

Disclaimer: I haven't used this package / Julia for nearly two years.

But you're still faster than me wrt help desk tasks. I like it :)

@RossBoylan

My contribution to the documentation is to explain what I don't get.

It looks as if the central operation is res = (ls())(ϕ, dϕ, ϕdϕ, α0, ϕ0, dϕ0). I would expect there to be API documentation and, ideally, introductory material explaining what each of the right-hand-side arguments is, including the signatures and semantics of those that are functions, and what res is. In a pinch it might refer to the Optim documentation, which I suppose is where the argument naming conventions and semantic expectations come from.

Among the semantic questions is whether the search is seeking to minimize or maximize the objective function (or is it looking for a zero?).

Some discussion of automatic differentiation would also be helpful; again, maybe a reference to Optim in a pinch.
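
For what it's worth, a minimal sketch of the call quoted above on a simple quadratic, under the assumption that the functor takes the univariate restrictions of the objective along the ray x + α*s (so the search seeks sufficient decrease, i.e. minimization) and returns the accepted step length together with the objective value there:

using LineSearches, LinearAlgebra

# Simple quadratic objective: f(x) = ½‖x‖², so ∇f(x) = x
f(x)  = 0.5 * dot(x, x)
∇f(x) = copy(x)

x = [-1.0, 2.0]      # current iterate
s = -∇f(x)           # a descent direction (here: steepest descent)

# Univariate restrictions along the ray x + α*s
ϕ(α)   = f(x .+ α .* s)
dϕ(α)  = dot(∇f(x .+ α .* s), s)
ϕdϕ(α) = (ϕ(α), dϕ(α))

α0, ϕ0, dϕ0 = 1.0, ϕ(0.0), dϕ(0.0)

ls = BackTracking()                     # or HagerZhang(), StrongWolfe(), ...
α, ϕα = ls(ϕ, dϕ, ϕdϕ, α0, ϕ0, dϕ0)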
