DistributedRL training - Loss value is so high and not coming down

Problem description
The loss values are very high and are not decreasing over time.
Problem details
We are trying to create a racing environment and use reinforcement learning to train a model to race. We started from this example and wanted to test how long it takes to train a model and how fast the car can go.
I used the same parameters as in the example, except for the following one:
max_epoch_runtime_sec = 30
I also didn't change any of the code.
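For reference, the override amounts to a single entry in the training parameters. The sketch below is only an illustration of the change, not the example's actual launch code: only `max_epoch_runtime_sec` comes from the example, and the dictionary structure here is a placeholder for however the job actually passes its parameters.

```python
# Hedged sketch of the single override; not the example's actual launch code.
# Only 'max_epoch_runtime_sec' is taken from the example. The dictionary
# structure is a placeholder for however the job passes its parameters.
training_parameters = {
    # ...all other parameters left at the example's defaults...
    'max_epoch_runtime_sec': 30,  # cap each training epoch at 30 seconds
}
```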
I've attached the output file from one agent. Please help me troubleshoot what the issue is.
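For anyone looking at the log, a quick way to eyeball the trend I'm describing is to pull the loss values out of the agent's output file and plot them. The snippet below is a rough sketch: it assumes each relevant line in agent1.txt contains a token like `loss = <number>`, which may not match the exact format the example prints, so the regex may need adjusting.

```python
import re
import matplotlib.pyplot as plt

# Rough sketch for eyeballing the loss trend in the attached log.
# Assumption: lines contain something like "loss = 1234.5"; adjust the
# regex to whatever format the example's agent actually prints.
loss_pattern = re.compile(r'loss\s*[=:]\s*([0-9]+(?:\.[0-9]+)?(?:[eE][+-]?[0-9]+)?)')

losses = []
with open('agent1.txt') as f:
    for line in f:
        match = loss_pattern.search(line)
        if match:
            losses.append(float(match.group(1)))

plt.plot(losses)
plt.xlabel('logged training step')
plt.ylabel('loss')
plt.title('Loss over time (agent1.txt)')
plt.show()
```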
Experiment/Environment details
Used existing weights to start with.
Started training on Azure with 6 NV6 machines: 5 agents and 1 trainer.
While running the job I restarted the agents after about 12 hours.
Then I ran the training for another 20 hours. (Attachment: agent1.txt)