Hi, I am using nn-meter to train a dwconv-bn-relu predictor on a custom backend with the configuration in the image below. In the first round of training, the accuracy reaches about 90%, but in the next two rounds it drops. I'd like to ask: have you ever encountered this situation, and how can it be solved?
In addition, when the number of initial sampling points is set to 5000, I noticed that some of the collected samples are duplicates. Do I need to do any data preprocessing for this situation?
Hi, thanks for raising this issue! The objective of adaptive data sampling is to achieve a high level of accuracy in predicting latency. If satisfactory accuracy can be attained in the initial round, subsequent rounds become unnecessary.
Sorry, maybe I didn't express the question clearly. Let me summarize:
For dwconv-bn-relu, using the official parameters I can only reach 90% accuracy in the first round (while the paper reports 97%), which I think is far from enough. Then in each of the following rounds, accuracy drops further. In this case, is it necessary to re-tune the hyperparameters of the random forest for my own backend?
There are a large number of repeated points among the sampled data points. Is it necessary to deduplicate them?
Is it necessary to perform data preprocessing for the sampled data?
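To make questions 2 and 3 concrete, here is a minimal sketch of what I mean by deduplication and re-tuning. The kernel configurations, the stand-in latency function, and the hyperparameter values are all hypothetical placeholders, not nn-meter's actual sampling pipeline; the ±10% relative-error accuracy follows the metric style used in the nn-meter paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical sampled data: each row is a kernel configuration
# (e.g. HW, CIN, KERNEL_SIZE, STRIDE) plus a measured latency.
rng = np.random.default_rng(0)
X = rng.integers(1, 8, size=(2000, 4)).astype(float)
y = X.sum(axis=1) + rng.normal(0, 0.1, size=2000)  # stand-in latency

# 1) Deduplicate repeated configurations, averaging the measured
#    latency of each unique config so repeats don't overweight it.
uniq, inv = np.unique(X, axis=0, return_inverse=True)
y_mean = np.array([y[inv == i].mean() for i in range(len(uniq))])

X_tr, X_te, y_tr, y_te = train_test_split(
    uniq, y_mean, test_size=0.2, random_state=0)

def acc10(model, X, y):
    """Fraction of predictions within +/-10% relative error."""
    pred = model.predict(X)
    return np.mean(np.abs(pred - y) / y < 0.10)

# 2) Small hyperparameter sweep over the random forest
#    (values are illustrative, not nn-meter's defaults).
for n_est, depth in [(100, None), (300, 20), (500, 40)]:
    rf = RandomForestRegressor(
        n_estimators=n_est, max_depth=depth, random_state=0)
    rf.fit(X_tr, y_tr)
    print(n_est, depth, round(acc10(rf, X_te, y_te), 3))
```

Is something along these lines (deduplicating before each round, then re-running a sweep like this) the recommended preprocessing, or does nn-meter's adaptive sampling already handle it?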
One more thing: is your office located in Zhongguancun, Beijing? May I invite you to a meal to discuss nn-meter?