Binary classification problem using NeuralNetworkClassifier and cross entropy loss #798
Hi @robinshi123, I think some wires have been crossed here and I am going to do my best to untangle them.

Target labels

So, as you rightly said, in tutorial 02 there is a model that outputs values in the range [-1, 1], and the target labels there are matched to that range.

Cross Entropy Loss

Your confusion about cross entropy loss is understandable, as there is some magic going on under the hood that you need to know about. Cross entropy can technically be used with either label encoding, and you can just set it as the loss for training. Keep in mind, though, that cross entropy expects the network to output a probability vector and the targets to be one-hot encoded, which is what this setup needs*.

*needs is a bit strong because technically you can have 1 qubit for a 2-class problem, which naturally isn't one-hot encoded, but I thought we'd leave that out.

TLDR

If you want to guarantee cross_entropy works properly, make sure that:

- your network outputs a probability vector over the classes (entries in [0, 1] that sum to 1), and
- your target labels are one-hot encoded to match.
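As a rough numerical sketch of what those requirements mean (plain NumPy, not the qiskit-machine-learning internals; the arrays are made up for illustration):

```python
import numpy as np

# Hypothetical model outputs for 3 samples of a 2-class problem.
# Each row is a probability vector: entries in [0, 1] that sum to 1.
probs = np.array([
    [0.9, 0.1],
    [0.2, 0.8],
    [0.6, 0.4],
])

# One-hot encoded targets: class 0, class 1, class 0.
targets = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 0.0],
])

def cross_entropy(p, t, eps=1e-12):
    """Mean cross entropy between one-hot targets t and predicted probabilities p.

    The eps guards against log(0) for hard 0/1 predictions.
    """
    return float(-np.mean(np.sum(t * np.log(p + eps), axis=1)))

loss = cross_entropy(probs, targets)
print(round(loss, 4))  # picks out -log(0.9), -log(0.8), -log(0.6) and averages them
```

If either condition is violated (outputs in [-1, 1], or plain 0/1 labels instead of one-hot rows), this formula no longer computes a meaningful cross entropy, which is exactly the failure mode described above.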
|
What should we add?
Hello,
We're currently attempting a classification task using NeuralNetworkClassifier with the 'cross_entropy' loss function, as outlined in the tutorial found here: https://qiskit-community.github.io/qiskit-machine-learning/tutorials/02_neural_network_classifier_and_regressor.html.
Our target label has two classes. We noticed that the model outputs values in the range [-1, 1] instead of the typical [0, 1]. Consequently, we have adjusted the target class (ground truth) to also range between [-1, 1] to match the model's predictions.
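For reference, remapping labels like the ones described here from {-1, 1} back to {0, 1} and then one-hot encoding them is a one-liner each in NumPy (the label array below is a made-up example):

```python
import numpy as np

# Hypothetical targets stored in the {-1, 1} convention described above.
y_pm1 = np.array([-1, 1, 1, -1])

# Map {-1, 1} -> {0, 1}.
y01 = ((y_pm1 + 1) // 2).astype(int)

# One-hot encode for use with a cross-entropy loss:
# row i of the 2x2 identity is the one-hot vector for class i.
y_onehot = np.eye(2)[y01]
```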
However, we are unsure about one aspect: when using 'cross_entropy' as the loss function, is there an automatic adjustment within the model, given that binary cross entropy typically expects the outputs to be probabilities that sum to 1, with classes labeled as 0 and 1? The tutorial only uses the default loss, which is "squared_error", so it is unclear to us whether any special treatment is needed when using NeuralNetworkClassifier with the 'cross_entropy' loss function.
Thanks in advance!