
can't run minimizer without feeding inputs even though I have output and label vectors. #348

Open
Corallus-Caninus opened this issue Mar 2, 2022 · 2 comments


Corallus-Caninus commented Mar 2, 2022

I am trying to backprop/minimize my network without feeding the input vector, since I already have output and label vectors.

This is the relevant code I'm trying to implement:

...
        // backprop
        let mut run_args = SessionRunArgs::new();
        run_args.add_target(&self.minimize);

        let error_squared_fetch = run_args.request_fetch(&self.Error, 0);
        // set output feed manually
        //TODO: runtime says we need to feed input
        run_args.add_feed(&self.Output_op, 0, &output);
        run_args.add_feed(&self.Label, 0, &labels);
        self.session.run(&mut run_args)?;

        let res: Tensor<f32> = run_args.fetch(error_squared_fetch)?;
...

where Output_op and Label are my output and label operations, respectively, and output and labels are my output and label tensors.
self.minimize is either the GradientDescent optimizer or the Adadelta optimizer. The Error operation is defined exclusively as a function of output and label. The network is very similar to the xor example in this repository and comes from my NormNet repo (it's very messy and early-stage, so beware).

Based on my understanding of backprop, this should be possible. Is this feature missing, or did I make a mistake? Please let me know how I can clarify this further.

stdout log from the runtime:

thread 'tests::test_evaluate' panicked at 'called Result::unwrap() on an Err value: {inner:0x2147c7c9480, InvalidArgument: You must feed a value for placeholder tensor 'input' with dtype float and shape [1,2]
[[{{node input}}]]}', src\lib.rs:1197:94
note: run with RUST_BACKTRACE=1 environment variable to display a backtrace
test tests::test_evaluate ... FAILED
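For context on why the runtime still demands `input`: when you feed a tensor, TensorFlow prunes everything upstream of it for any fetch that flows through it, but the minimize target's gradient subgraph also reaches the input placeholder through the intermediate activations, so feeding the output tensor does not remove that dependency. Below is a toy sketch of that pruning behaviour — not the real runtime; all node names, dependency edges, and the `run` helper are illustrative only:

```python
# Toy model of how Session.run resolves fetches: each node is computed from
# its dependencies unless a value for it was fed. Node names mirror the
# issue's graph but are purely illustrative.

# graph: node -> (dependencies, compute function); fn None = placeholder
GRAPH = {
    "input":    ([], None),
    "label":    ([], None),
    "hidden":   (["input"], lambda d: d["input"] * 2.0),
    "output":   (["hidden"], lambda d: d["hidden"] + 1.0),
    "error":    (["output", "label"], lambda d: (d["output"] - d["label"]) ** 2),
    # the minimize op's gradients need the activations, which depend on input
    "minimize": (["error", "hidden", "input"], lambda d: -d["error"]),
}

def run(fetch, feeds):
    """Resolve `fetch`, pruning anything upstream of a fed tensor."""
    if fetch in feeds:
        return feeds[fetch]          # a fed value short-circuits computation
    deps, fn = GRAPH[fetch]
    if fn is None:
        raise ValueError(f"You must feed a value for placeholder '{fetch}'")
    return fn({d: run(d, feeds) for d in deps})

# Fetching `error` with output and label fed works: `input` is pruned away.
print(run("error", {"output": 3.0, "label": 1.0}))   # 4.0

# Targeting `minimize` still fails: its gradient path reaches `input`
# without passing through the fed `output` tensor.
try:
    run("minimize", {"output": 3.0, "label": 1.0})
except ValueError as e:
    print(e)
```

So fetching Error alone with output and label fed should be fine, but the minimize target cannot be run without input, because the weight gradients need activations computed from it.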

@Corallus-Caninus

I am unable to "backfeed" the Output_op. The feed seems to be overridden by the input placeholder, since the network still forward-propagates after I feed the Output_op operation. I believe I have to rework my graph, but this seems like it should be possible. Please advise on the order of operations in the graph (pun unintended). From what I have gathered, feeds are set in the graph but overridden if mutated during session.run() calls.

@Corallus-Caninus

This may just be me not knowing what to do with the TensorFlow graph, and not something specific to tensorflow-rs.
