ch 20 (optimization): a few small issues #165

Open
murphyk opened this issue Jul 7, 2023 · 0 comments
murphyk commented Jul 7, 2023

In sec 20.1, the comment about scipy.minimize where you say "we don't even need to compute the gradient" may be misleading. As you know, by default it uses numerical differentiation to compute the gradient if the grad function is not specified by the user, so this is likely to be slow. You may want to mention automatic differentiation libraries like jax and pytorch, which can solve this problem for you. (Also, scipy.minimize defaults to BFGS, not GD, and chooses the step size automagically :) Since this book is trying to demonstrate "best practice" for DS (e.g. the nice way you use dataframe.pipe for reproducible wrangling), maybe you should show how to use scipy.minimize on your example problem?
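For concreteness, here is a minimal sketch of the pattern I mean, on a made-up least-squares problem (the data and loss are hypothetical stand-ins, not the book's example): passing `jac=` gives scipy.minimize the exact gradient instead of forcing it to fall back on finite differences.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example problem: f(w) = ||Xw - y||^2 / (2n)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

def loss(w):
    r = X @ w - y
    return 0.5 * r @ r / len(y)

def grad(w):
    # Analytical gradient; without jac=, minimize approximates this
    # with finite differences, one loss call per dimension per step.
    return X.T @ (X @ w - y) / len(y)

# Default method for smooth unconstrained problems is BFGS (not GD),
# and the line search picks the step size automatically.
res = minimize(loss, x0=np.zeros(3), jac=grad, method="BFGS")
print(res.x)  # recovers w_true (the synthetic data is noise-free)
```

With jax one could instead write `jac=jax.grad(loss)` and skip deriving the gradient by hand.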

In sec 20.2 the first two paragraphs need rewriting to avoid repetition/redundancy.

[Screenshot of the sec 20.2 opening paragraphs, 2023-07-07]

In sec 20.3 maybe mention that convexity implies the second-order derivative is non-negative, so the function has a bowl shape.
This condition is easier to check in practice than the definition of convexity. It's probably also worth mentioning some examples of convex and non-convex loss functions encountered in the book.

Maybe mention SAGA and other variance reduced SGD methods since it is used in 21.4.1?
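In case it helps, a minimal SAGA sketch on a synthetic least-squares problem (data, step size, and epoch count are all assumptions for illustration): the key idea is a table holding the last gradient seen for each sample, so each stochastic update has lower variance than plain SGD.

```python
import numpy as np

def saga_least_squares(X, y, lr=0.02, epochs=200, seed=0):
    """SAGA for f(w) = mean_i 0.5*(x_i . w - y_i)^2 (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    table = np.zeros((n, d))   # last stored per-sample gradient
    avg = table.mean(axis=0)   # running average of the table
    for _ in range(epochs * n):
        i = rng.integers(n)
        g_new = (X[i] @ w - y[i]) * X[i]
        # Variance-reduced update: fresh gradient, minus stale one, plus average
        w -= lr * (g_new - table[i] + avg)
        avg += (g_new - table[i]) / n  # keep the average in sync
        table[i] = g_new
    return w

# Hypothetical noise-free data: SAGA should recover w_true
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true
w_hat = saga_least_squares(X, y)
print(w_hat)  # close to w_true
```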

@murphyk murphyk changed the title sec 20.2 scipy.minimize comment ch 20 (optimization): a few small issues Jul 7, 2023