
How can I feed batch of data to the model when predicting? #490

Open
zhimakaimenxa opened this issue Nov 30, 2020 · 1 comment
Comments

@zhimakaimenxa

My code looks as follows:

            var runner = session.GetRunner();
            //var tensor = Utils.ImageToTensorGrayScale(file);
            var tensor = dataTransfer.OutputTensor(symbol, beginIndex, endIndex);
            runner.AddInput(graph["input_1"][0], tensor);   // feed the input placeholder
            runner.Fetch(graph["dense_2/Softmax"][0]);      // fetch the softmax output node

            var output = runner.Run();
            var vecResults = output[0].GetValue();          // results for the fetched tensor

The batch-size dimension of the tensor seems to only work when it is 1: when I feed a tensor containing multiple samples, the results are all the same. How can I predict a batch of samples in one call, without using a loop?

Best Regards

zhimakaimenxa

@cesarsouza
Contributor

While I kind of love this project and where it came from, I would strongly suggest doing this kind of processing in Python, with either TensorFlow or PyTorch, and simply consuming the outputs in a C# application if you need to.
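To illustrate what batch prediction looks like on the Python side: the key idea is that the first tensor dimension is the batch dimension, so you stack all samples into one array and make a single call. The model below is a hypothetical stand-in (one dense layer plus softmax in NumPy, not the poster's actual `dense_2/Softmax` network), just to show the shape convention:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical stand-in for the exported model: 4 input features, 3 classes.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
b = np.zeros(3)

def predict(batch):
    """batch: (batch_size, 4) -> (batch_size, 3) class probabilities."""
    return softmax(batch @ W + b)

# Eight samples stacked along the first (batch) dimension, one call, no loop.
batch = rng.normal(size=(8, 4))
probs = predict(batch)
print(probs.shape)  # (8, 3): one probability row per sample
```

A real TensorFlow or PyTorch model works the same way: `model.predict(batch)` (Keras) or `model(batch)` (PyTorch) accepts an `(N, …)` array and returns one output row per sample, which is what the `input_1` placeholder would also need on the C# side.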
