Different results on Intel vs. AMD with ForwardDiff #636

Open
elbert5770 opened this issue Mar 19, 2023 · 0 comments

Hi, the code in the linked repository gives slightly different optimization results on an Intel machine vs. an AMD machine when using ForwardDiff. The operating system (Ubuntu or Windows) does not matter, only the processor. The example uses Newton Trust Region, but the issue appears general to any autodiff-dependent algorithm.

Solutions are identical when using NelderMead.

https://github.com/elbert5770/Nested-forward-diff-opt

See the files: Results NewtonTR diff OS processors.txt and Results NelderMead diff OS processors.txt

The difference between processors also shows up on line 121 of the program, which calls ForwardDiff.gradient(), but not on line 120, which calls FiniteDiff.finite_difference_gradient().

I realize that this might be expected due to processor differences and not an error in the code.
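For context on why processor differences alone can produce this: floating-point addition is not associative, so if two CPUs accumulate a sum in different orders (e.g. different SIMD widths) or use fused multiply-add differently, the rounded results can diverge slightly. A minimal illustration in Python (values are purely illustrative, not taken from the program in this issue):

```python
# Floating-point addition is not associative: the same four values
# summed in two different orders give different rounded results.
vals = [1e20, 1.0, -1e20, 1.0]

# Sequential left-to-right accumulation: the first 1.0 is absorbed
# by 1e20 (it is far below 1e20's rounding granularity) and lost.
left_to_right = ((vals[0] + vals[1]) + vals[2]) + vals[3]

# Pairwise accumulation: the large terms cancel first, so both 1.0s survive.
pairwise = (vals[0] + vals[2]) + (vals[1] + vals[3])

print(left_to_right)  # 1.0
print(pairwise)       # 2.0
```

Real Intel/AMD discrepancies are far smaller than this contrived example, but the mechanism (order-dependent rounding) is the same, which is consistent with the tiny differences in the results below.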

Results (I know these differences are small, but they can add up):

Ubuntu/Intel 11th gen

 * Candidate solution
    Final objective value:     9.249637e-10

 * Found with
    Algorithm:     Newton's Method (Trust Region)

 * Convergence measures
    |x - x'|               = 0.00e+00 ≤ 0.0e+00
    |x - x'|/|x'|          = 0.00e+00 ≤ 0.0e+00
    |f(x) - f(x')|         = 0.00e+00 ≤ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 0.00e+00 ≤ 0.0e+00
    |g(x)|                 = 3.85e-06 ≰ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    28
    f(x) calls:    29
    ∇f(x) calls:   29
    ∇²f(x) calls:  18

("Final answer for kf:", sol_kf.u, kf0) = ("Final answer for kf:", [9.999258179589958], [10.0])
("Final answer for u_ss:", sol_ss.u, u0_ss) = ("Final answer for u_ss:", [9.996162482738214, 39.997032718359605, 19.999999999999886], [10.0, 40.0, 20.0])
("Error in ODE solution with optimized values", sum((Ypred .- Ymeas) .^ 2)) = ("Error in ODE solution with optimized values", 9.249636677031537e-10)

Ubuntu/AMD

 * Candidate solution
    Final objective value:     9.254577e-10

 * Found with
    Algorithm:     Newton's Method (Trust Region)

 * Convergence measures
    |x - x'|               = 0.00e+00 ≤ 0.0e+00
    |x - x'|/|x'|          = 0.00e+00 ≤ 0.0e+00
    |f(x) - f(x')|         = 0.00e+00 ≤ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 0.00e+00 ≤ 0.0e+00
    |g(x)|                 = 4.20e-06 ≰ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    28
    f(x) calls:    29
    ∇f(x) calls:   29
    ∇²f(x) calls:  18

("Final answer for kf:", sol_kf.u, kf0) = ("Final answer for kf:", [9.999264846163442], [10.0])
("Final answer for u_ss:", sol_ss.u, u0_ss) = ("Final answer for u_ss:", [9.996180308310272, 39.99705938465377, 19.999999999999943], [10.0, 40.0, 20.0])
("Error in ODE solution with optimized values", sum((Ypred .- Ymeas) .^ 2)) = ("Error in ODE solution with optimized values", 9.254577306742998e-10)