
output JSON file #1

Open
sleepykid opened this issue Nov 21, 2019 · 8 comments
Labels
enhancement New feature or request

Comments

@sleepykid

sleepykid commented Nov 21, 2019

Quick question: I'm looking to get this to work with Python, but before I dig in I was wondering whether the output JSON for each backtest run is saved (e.g. the default is bin\ParamerizedAlgorithm.json). If so, is there a naming convention it uses to help distinguish the parameter values? My goal is to run a batch of backtests and then analyse the resulting .json files (in a Jupyter notebook) to help tune and refine my research. Thanks for any help!

@Doggie52
Owner

Hi @sleepykid, there's currently no functionality in place to output the runs anywhere. The LEAN engine by default saves its runs in the bin folder (in the Debug or Release subfolder), but you would have no way of tying these back to the parameters.

What I have done locally is the following (bear in mind any code is C#):

  1. Inside my main algorithm (the one inheriting QCAlgorithm), set the algorithm ID to a random UUID string using SetAlgorithmId()
  2. Write a result handler (inheriting BaseResultsHandler, IResultHandler) to write this new algorithm ID and all parameters to a common file, results.csv
  3. Run Lean-Batch-Launcher and once it's done, read the output from results.csv

Now that each algorithm is being renamed in a predictable fashion and I have results.csv which connects an ID to a set of parameters, I can observe and analyse my results.
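For reference, steps 1 and 2 above might look roughly like this (a sketch only, assuming LEAN's QCAlgorithm API; the parameter names "FastPeriod"/"SlowPeriod" and the CSV layout are hypothetical):

```csharp
public class MyParameterizedAlgorithm : QCAlgorithm
{
    public override void Initialize()
    {
        // Step 1: rename this run with a random UUID so its output file
        // can later be tied back to the parameters that produced it.
        var runId = Guid.NewGuid().ToString();
        SetAlgorithmId(runId);

        // Record "<runId>,<param1>,<param2>" in a shared results.csv.
        // (In practice this belongs in the custom result handler of step 2.)
        System.IO.File.AppendAllText(
            "results.csv",
            $"{runId},{GetParameter("FastPeriod")},{GetParameter("SlowPeriod")}{Environment.NewLine}");

        // ... usual SetStartDate/SetCash/AddEquity setup ...
    }
}
```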

Hope this helps. I'll leave this open as a feature request to add something similar to this project.

Doggie52 added the enhancement (New feature or request) label on Nov 22, 2019
@sleepykid
Author

Thanks! Once I write a result handler, where and how do I invoke it?

@Doggie52
Owner

In the Run() method of Instance/Program.cs, you could add the following:

Config.Set( "backtesting.result-handler", "Namespace.Your.Result.Handler.Here" );

@sleepykid
Author

Great, thanks for your help!
In the result handler I just copied the BacktestingResultHandler and, in
public void StoreResult(Packet packet, bool async = false)
changed
var key = _job.BacktestId + ".json";
to
var key = Algorithm.AlgorithmId + ".json";

For some reason, _job.AlgorithmId in
public virtual void Initialize(AlgorithmNodePacket job, IMessagingHandler messagingHandler, IApi api, ITransactionHandler transactionHandler)
was giving me the default backtest ID (the "algorithm-type-name" from the config file, not the value set by SetAlgorithmId()).

Is using Algorithm.AlgorithmId the way you accessed it? (Just wondering if there's another way.)
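For anyone following along, the change described above would look something like this (a sketch of the copied BacktestingResultHandler; only the key line differs from LEAN's original):

```csharp
public void StoreResult(Packet packet, bool async = false)
{
    // Originally: var key = _job.BacktestId + ".json";
    // Use the algorithm's (renamed) ID instead, so the saved JSON file
    // carries the UUID set via SetAlgorithmId().
    var key = Algorithm.AlgorithmId + ".json";

    // ... rest of the copied method body unchanged ...
}
```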

@Doggie52
Owner

Doggie52 commented Dec 3, 2019

Great to see someone putting in the effort and helping themselves! 👍

The reason _job.AlgorithmId is giving you something other than what you expect is probably related to QuantConnect/Lean#3011.

My workaround for that (if I recall correctly) was to access it inside SendFinalResult(), where I was able to access it through _job.AlgorithmId. I'm not sure why this works and accessing it in StoreResult() doesn't, though...
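If it helps, that workaround might be sketched like this (hedged: the exact SendFinalResult() signature and override rules vary between LEAN versions):

```csharp
// In the custom result handler: by the time SendFinalResult() runs,
// _job.AlgorithmId appears to reflect the SetAlgorithmId() value.
public override void SendFinalResult()
{
    var key = _job.AlgorithmId + ".json";  // the UUID set in Initialize()
    // ... serialize the final results and save them under this key ...
    base.SendFinalResult();
}
```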

I'm sorry my recollection isn't 100%; it's been a while since I played with it.

@kanatm287

kanatm287 commented Feb 26, 2020

Could you please write step-by-step instructions?

I'm new to C#; I come from the JVM functional world.

Thanks

@kanatm287

Solved by adding:
File.WriteAllText("result.json", JsonConvert.SerializeObject(results, Formatting.Indented));
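For completeness, that line in context (a sketch; `results` stands for whatever object holds the backtest statistics in your handler):

```csharp
using System.IO;
using Newtonsoft.Json;

// Serialize the results object to indented JSON and write it to disk
// next to the working directory, so it can be picked up for analysis.
File.WriteAllText("result.json",
    JsonConvert.SerializeObject(results, Formatting.Indented));
```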

@Doggie52
Owner

Doggie52 commented Mar 7, 2020

Glad to hear this!
