
[BUG]: method error when used in the debugger #218

Open
wenpw opened this issue Jun 2, 2023 · 20 comments
Assignees
Labels
bug Something isn't working

Comments


wenpw commented Jun 2, 2023

@MilesCranmer Thank you very much for your code, which is very helpful.

I am trying to debug the code using test_fast_cycle.jl.
There is no problem when I run the code directly.

However, when I use debug mode in VS Code with the Julia extension to inspect some intermediate variables, the following bug occurs:

[EvaluateEquation.jl]

        elseif tree.r.degree == 0
            (cumulator_l, complete) = _eval_tree_array(tree.l, cX, operators, Val(turbo))
            @return_on_false complete cumulator_l
            @return_on_nonfinite_array cumulator_l
            # op(x, y), where y is a constant or variable but x is not.
            return deg2_r0_eval(tree, cumulator_l, cX, op, Val(turbo))

The bug is:

Exception has occurred: MethodError
MethodError: Cannot `convert` an object of type Nothing to an object of type Symbol
Closest candidates are:
convert(::Type{T}, !Matched::T) where T at Base.jl:61
Symbol(::Any...) at strings/basic.jl:229

Stacktrace:
[1] _eval_tree_array(tree::Node{Float32}, cX::Matrix{Float32}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum, #unused#::Val{false})

@ DynamicExpressions.EvaluateEquationModule C:\Users\Administrator\.julia\packages\DynamicExpressions\YQrb6\src\EvaluateEquation.jl:123

Even for the example code in DynamicExpressions, a similar error occurs in debug mode:


using DynamicExpressions

operators = OperatorEnum(; binary_operators=[+, -, *], unary_operators=[cos])

x1 = Node(; feature=1)
x2 = Node(; feature=2)

expression = x1 * cos(x2 - 3.2)

X = randn(Float64, 2, 100);
expression(X, operators) # 100-element Vector{Float64}

Thank you in advance.

Best regards

Version

0.18.0

Operating System

Windows

Interface

Other (specify below)

Relevant log output

Exception has occurred: MethodError
MethodError: Cannot `convert` an object of type Nothing to an object of type Symbol
Closest candidates are:
  convert(::Type{T}, !Matched::T) where T at Base.jl:61
  Symbol(::Any...) at strings/basic.jl:229

Stacktrace:
  [1] _eval_tree_array(tree::Node{Float32}, cX::Matrix{Float32}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum, #unused#::Val{false})
    @ DynamicExpressions.EvaluateEquationModule C:\Users\Administrator\.julia\packages\DynamicExpressions\YQrb6\src\EvaluateEquation.jl:123
  [2] eval_tree_array(tree::Node{Float32}, cX::Matrix{Float32}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum; turbo::Bool)
    @ DynamicExpressions.EvaluateEquationModule C:\Users\Administrator\.julia\packages\DynamicExpressions\YQrb6\src\EvaluateEquation.jl:65
  [3] (::DynamicExpressions.EvaluateEquationModule.var"#eval_tree_array##kw")(::NamedTuple{(:turbo,), Tuple{Bool}}, ::typeof(eval_tree_array), tree::Node{Float32}, cX::Matrix{Float32}, operators::DynamicExpressions.OperatorEnumModule.OperatorEnum)
    @ DynamicExpressions.EvaluateEquationModule C:\Users\Administrator\.julia\packages\DynamicExpressions\YQrb6\src\EvaluateEquation.jl:59
  [4] eval_tree_array(tree::Node{Float32}, X::Matrix{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}; kws::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ SymbolicRegression.InterfaceDynamicExpressionsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\InterfaceDynamicExpressions.jl:51
  [5] eval_tree_array(tree::Node{Float32}, X::Matrix{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.InterfaceDynamicExpressionsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\InterfaceDynamicExpressions.jl:50
  [6] _eval_loss(tree::Node{Float32}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.LossFunctionsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\LossFunctions.jl:66
  [7] eval_loss(tree::Node{Float32}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.LossFunctionsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\LossFunctions.jl:95
  [8] score_func(dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, member::Node{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, complexity::Int64)
    @ SymbolicRegression.LossFunctionsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\LossFunctions.jl:136
  [9] PopMember(dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, t::Node{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, complexity::Nothing; ref::Int64, parent::Int64, deterministic::Bool)
    @ SymbolicRegression.PopMemberModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\PopMember.jl:99
 [10] (::Core.var"#Type##kw")(::NamedTuple{(:parent, :deterministic), Tuple{Int64, Bool}}, ::Type{PopMember}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, t::Node{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, complexity::Nothing)
    @ SymbolicRegression.PopMemberModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\PopMember.jl:88
 [11] (::Core.var"#Type##kw")(::NamedTuple{(:parent, :deterministic), Tuple{Int64, Bool}}, ::Type{PopMember}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}, t::Node{Float32}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.PopMemberModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\PopMember.jl:88
 [12] (::SymbolicRegression.PopulationModule.var"#2#3"{Float32, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Int64, Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}})(i::Int64)
    @ SymbolicRegression.PopulationModule none:0
 [13] iterate(g::Base.Generator{UnitRange{Int64}, SymbolicRegression.PopulationModule.var"#2#3"{Float32, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Int64, Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}}, s::Tuple{})
    @ Base generator.jl:47
 [14] collect(itr::Base.Generator{UnitRange{Int64}, SymbolicRegression.PopulationModule.var"#2#3"{Float32, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Int64, Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}})
    @ Base array.jl:787
 [15] Population(dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}; npop::Int64, nlength::Int64, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, nfeatures::Int64)
    @ SymbolicRegression.PopulationModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\Population.jl:40
 [16] (::Core.var"#Type##kw")(::NamedTuple{(:npop, :options, :nfeatures), Tuple{Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Int64}}, ::Type{Population}, dataset::Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}})
    @ SymbolicRegression.PopulationModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\Population.jl:37
 [17] (::SymbolicRegression.SearchUtilsModule.var"#6#8"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}})(i::Int64)
    @ SymbolicRegression.SearchUtilsModule none:0
 [18] iterate(g::Base.Generator{UnitRange{Int64}, SymbolicRegression.SearchUtilsModule.var"#6#8"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}}, s::Tuple{})
    @ Base generator.jl:47
 [19] collect(itr::Base.Generator{UnitRange{Int64}, SymbolicRegression.SearchUtilsModule.var"#6#8"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}})
    @ Base array.jl:787
 [20] (::SymbolicRegression.SearchUtilsModule.var"#5#7"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}})(j::Int64)
    @ SymbolicRegression.SearchUtilsModule none:0
 [21] iterate(g::Base.Generator{UnitRange{Int64}, SymbolicRegression.SearchUtilsModule.var"#5#7"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}}, s::Tuple{})
    @ Base generator.jl:47
 [22] collect(itr::Base.Generator{UnitRange{Int64}, SymbolicRegression.SearchUtilsModule.var"#5#7"{Int64, Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}})
    @ Base array.jl:787
 [23] init_dummy_pops(nout::Int64, npops::Int64, datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.SearchUtilsModule C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SearchUtils.jl:50
 [24] _EquationSearch(parallelism::Symbol, datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}; niterations::Int64, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing)
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:493
 [25] (::SymbolicRegression.var"#_EquationSearch##kw")(::NamedTuple{(:niterations, :options, :numprocs, :procs, :addprocs_function, :runtests, :saved_state), Tuple{Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Nothing, Nothing, Nothing, Bool, Nothing}}, ::typeof(SymbolicRegression._EquationSearch), parallelism::Symbol, datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:411
 [26] EquationSearch(datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}}; niterations::Int64, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing)
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:398
 [27] (::SymbolicRegression.var"#EquationSearch##kw")(::NamedTuple{(:niterations, :options, :parallelism, :numprocs, :procs, :addprocs_function, :runtests, :saved_state), Tuple{Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Symbol, Nothing, Nothing, Nothing, Bool, Nothing}}, ::typeof(EquationSearch), datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}}})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:363
 [28] EquationSearch(X::Matrix{Float32}, y::Matrix{Float32}; niterations::Int64, weights::Nothing, varMap::Vector{String}, options::Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, multithreaded::Nothing, loss_type::Type{Nothing})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:331
 [29] (::SymbolicRegression.var"#EquationSearch##kw")(::NamedTuple{(:varMap, :niterations, :options), Tuple{Vector{String}, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}}, ::typeof(EquationSearch), X::Matrix{Float32}, y::Matrix{Float32})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:291
 [30] EquationSearch(X::Matrix{Float32}, y::Vector{Float32}; kw::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol, Symbol}, NamedTuple{(:varMap, :niterations, :options), Tuple{Vector{String}, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}}})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:356
 [31] (::SymbolicRegression.var"#EquationSearch##kw")(::NamedTuple{(:varMap, :niterations, :options), Tuple{Vector{String}, Int64, Options{Int64, Optim.Options{Float64, Nothing}, L2DistLoss, Nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}}}, ::typeof(EquationSearch), X::Matrix{Float32}, y::Vector{Float32})
    @ SymbolicRegression C:\Users\Administrator\.julia\packages\SymbolicRegression\5L9HL\src\SymbolicRegression.jl:353
 [32] top-level scope
    @ j:\dsdyp\ai\PySR-master\SymbolicRegression.jl-master\test\test_fast_cycle.jl:19

Extra Info

No response

@wenpw wenpw added the bug Something isn't working label Jun 2, 2023
@MilesCranmer (Owner) commented:

Do you know what debug mode is changing in the code? (Does it remove those error checks?) Or is there some way I could reproduce the bug?

@MilesCranmer (Owner) commented:

It seems like it might be a bug in the VSCode debugger. There are no symbol types in the evaluation code, so that error must be from the debugger trying to print some diagnostics. Maybe post this on the Julia vscode GitHub issues as well?


wenpw commented Jun 2, 2023

> Do you know what debug mode is changing in the code? (Does it remove those error checks?) Or is there some way I could reproduce the bug?

Thank you for the reply.

Using your DynamicExpressions.jl code, you can reproduce similar errors by debugging the following example (from https://github.com/SymbolicML/DynamicExpressions.jl) on Linux or Windows with VS Code and the Julia extension (press F5):

using DynamicExpressions

operators = OperatorEnum(; binary_operators=[+, -, *], unary_operators=[cos])

x1 = Node(; feature=1)
x2 = Node(; feature=2)

expression = x1 * cos(x2 - 3.2)

X = randn(Float64, 2, 100);
expression(X, operators) # 100-element Vector{Float64}


wenpw commented Jun 2, 2023

> It seems like it might be a bug in the VSCode debugger. There are no symbol types in the evaluation code, so that error must be from the debugger trying to print some diagnostics. Maybe post this on the Julia vscode GitHub issues as well?

May I ask which debugger or IDE you use when writing this code? I could make a comparison between VS Code and others to pin down the problem.

@MilesCranmer (Owner) commented:

I use vscode as well but I haven’t seen this issue before. What is your Julia version?


wenpw commented Jun 2, 2023

> I use vscode as well but I haven’t seen this issue before. What is your Julia version?

It is quite strange. I am using Julia version 1.9.0 (2023-05-07).

@MilesCranmer (Owner) commented:

Oh, wait, I was just able to reproduce it. I think I just hadn't run the code in the Julia debugger before. This is very strange. What variable is the Symbol, and what is the Nothing?
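For reference, the error text quoted above matches what Julia itself produces when something attempts to convert `nothing` to a `Symbol`; a minimal sketch (this reproduces the message, not the debugger bug itself):

```julia
# Converting `nothing` to `Symbol` raises the same MethodError text seen in
# the report: "Cannot `convert` an object of type Nothing to an object of
# type Symbol". This suggests some code path received `nothing` where a
# Symbol (e.g. an operator name) was expected.
err = try
    convert(Symbol, nothing)
catch e
    e
end
println(err isa MethodError)  # true
```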


wenpw commented Jun 3, 2023

> Oh, wait, I was just able to reproduce it. I think I just hadn't run the code in the Julia debugger before. This is very strange. What variable is the Symbol, and what is the Nothing?

It is great that the error could be reproduced. It's indeed very strange.

@MilesCranmer MilesCranmer changed the title [BUG]: test_fast_cycle.jl : method error [BUG]: method error when used in the debugger Jun 13, 2023
@zzccchen commented:

Dear MilesCranmer and wenpw, I encountered the same problem when using the VS Code debugger or Debugger.jl in the REPL to debug code. How do you usually debug code? My development environment is Ubuntu 22.04, Julia 1.9.4. Have you solved this problem?

@MilesCranmer (Owner) commented:

Unfortunately I don't know what the issue is. Not sure if @wenpw has solved it either?

I usually use VS Code, but I tend not to use the Julia debugger. However, I think it's important that others can use it, so I'm interested in solving this.

It seems unrelated to SymbolicRegression.jl though. If you post an issue on https://github.com/julia-vscode/julia-vscode and/or https://github.com/JuliaDebug/Debugger.jl they might know what it is from? I'm happy to help lobby there for support.


wenpw commented Nov 22, 2023

@MilesCranmer @zzccchen Sorry, it is still not solved. I have tested several versions of SymbolicRegression.jl, and they all produce the same error in the debugger.

@zzccchen commented:

Sorry to hear this, thank you for your replies @MilesCranmer @wenpw

@MilesCranmer (Owner) commented:

To help narrow it down, maybe see if the same issue shows up when using DynamicExpressions.jl alone?


wenpw commented Nov 23, 2023

@MilesCranmer Yes, the same issue shows up when using DynamicExpressions.jl alone.

You could try debugging the following test_tree_construction.jl in VS Code:

https://github.com/SymbolicML/DynamicExpressions.jl/blob/master/test/test_tree_construction.jl

The same error will appear.

@MilesCranmer (Owner) commented:

I wonder if it has to do with the Node type itself? The Node type has a field which can either be a number or a nothing... https://github.com/SymbolicML/DynamicExpressions.jl/blob/46388518281b0be12479afcb3a3b8bdabc361ccd/src/Equation.jl#L57
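To illustrate that hypothesis, here is a simplified sketch (the struct and field names below are hypothetical, not the actual Node definition from Equation.jl) of the kind of `Union{T,Nothing}` field the debugger would have to display:

```julia
# Hypothetical sketch of a node whose `val` field is either a number or
# `nothing`, similar in spirit to DynamicExpressions.jl's Node type.
mutable struct SketchNode{T}
    degree::Int            # 0 = leaf, 1 = unary op, 2 = binary op
    constant::Bool         # whether this leaf holds a constant value
    val::Union{T,Nothing}  # the constant's value, or `nothing` otherwise
    feature::Int           # feature index for variable leaves
end

# A variable leaf: `val` is `nothing`, which a debugger's diagnostic
# machinery must handle specially when rendering the variable pane.
leaf = SketchNode{Float64}(0, false, nothing, 1)
println(leaf.val === nothing)  # true
```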


wenpw commented Nov 23, 2023

I wonder why debug mode and running the code directly behave so differently.
You may also try to debug the following code; another error will come up:
https://github.com/SymbolicML/DynamicExpressions.jl/blob/master/test/test_custom_operators.jl

@MilesCranmer (Owner) commented:

Hm...

Can you try to keep reducing it, so that it is the minimal code that still gives an error?


wenpw commented Nov 23, 2023

I could try, but I am not sure whether I can find the minimal code that still gives an error.

@MilesCranmer (Owner) commented:

Hey @wenpw,

Have you tried https://github.com/JuliaDebug/Infiltrator.jl? It seems to be a much more robust debugger for Julia than the built-in one in VSCode. I've been trying it and it works quite well for debugging SymbolicRegression.jl!

Cheers,
Miles
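For anyone landing here, a minimal sketch of the Infiltrator.jl workflow (assuming the package is installed; `@infiltrate` is its documented entry point, and the function below is just an illustrative example):

```julia
using Infiltrator

# Place @infiltrate wherever you would otherwise set a breakpoint.
function inspect_sum(xs)
    total = zero(eltype(xs))
    for x in xs
        total += x
        @infiltrate  # drops into an `infil>` REPL; inspect `total` and `x`
    end
    return total
end

inspect_sum([1.0, 2.0, 3.0])
# At the infil> prompt: `@locals` lists local variables,
# `@continue` resumes execution.
```

Unlike the interpreting debugger, Infiltrator runs your code fully compiled, so it avoids the variable-display machinery that appears to trigger the MethodError above.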


wenpw commented Dec 25, 2023

@MilesCranmer Thank you very much. This would be of great help.
