
how to concat multi layers? #138

Open
BackT0TheFuture opened this issue Mar 10, 2019 · 3 comments

Comments

@BackT0TheFuture

Hi,
Great work on this library!
I came across one problem. My Python (MXNet) code looks like this:

flatten = mx.symbol.Flatten(data=relu3)
fc1 = mx.symbol.FullyConnected(data=flatten, num_hidden=512)
# four parallel output heads, all fed from fc1
fc21 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc22 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc23 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
fc24 = mx.symbol.FullyConnected(data=fc1, num_hidden=11)
# stack the four heads along dim 0
fc2 = mx.symbol.Concat(*[fc21, fc22, fc23, fc24], dim=0)
# rearrange the labels to match the stacked outputs
label = mx.symbol.transpose(data=label)
label = mx.symbol.Reshape(data=label, target_shape=(0, ))
mx.symbol.SoftmaxOutput(data=fc2, label=label, name="softmax")

Is it possible to implement this using ConvNetSharp?
thanks!
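(For context on the snippet above: each of the four fc heads produces an (N, 11) output, Concat with dim=0 stacks them into a (4*N, 11) block, and the transpose + reshape rearranges the labels to line up with that stacking. A minimal NumPy sketch of just those shape transforms, with an assumed batch size of 2:)

```python
import numpy as np

N = 2  # assumed batch size, for illustration only
heads = [np.zeros((N, 11)) for _ in range(4)]  # stand-ins for fc21..fc24 outputs

# Concat with dim=0 stacks the heads along the batch axis: (N, 11) x 4 -> (4*N, 11)
fc2 = np.concatenate(heads, axis=0)
print(fc2.shape)  # (8, 11)

# one label per head per sample: (N, 4); transpose + flatten -> (4*N,)
label = np.zeros((N, 4))
label = label.T.reshape(-1)
print(label.shape)  # (8,)
```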

@cbovar
Owner

cbovar commented Mar 12, 2019

Hi,

It should be possible using the 'Flow' part of ConvNetSharp (by building a computation graph).
I will try to implement your example in ConvNetSharp soon and will post it here.

@cbovar
Owner

cbovar commented Mar 12, 2019

It would look something like:

var cns = new ConvNetSharp<double>();

var input = cns.PlaceHolder("input");
var flatten = cns.Flatten(input);
var fc1 = cns.Dense(flatten, 512);
var fc21 = cns.Dense(fc1, 11);
var fc22 = cns.Dense(fc1, 11);
var fc23 = cns.Dense(fc1, 11);
var fc24 = cns.Dense(fc1, 11);
var fc2 = cns.Concat(cns.Concat(fc21, fc22), cns.Concat(fc23, fc24));
var model = cns.Softmax(fc2);

var label = cns.PlaceHolder("label");
var cost = cns.CrossEntropyLoss(model, label);
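(Note that the nested two-way Concat in the sketch above is equivalent to a single four-way concatenation, as long as every Concat joins along the same axis. A quick NumPy check of that equivalence, using arbitrary (2, 11) arrays as stand-in head outputs:)

```python
import numpy as np

# arbitrary stand-ins for the four head outputs
a, b, c, d = (np.random.rand(2, 11) for _ in range(4))

# Concat(Concat(a, b), Concat(c, d)) ...
nested = np.concatenate([np.concatenate([a, b]), np.concatenate([c, d])])
# ... equals Concat(a, b, c, d)
flat = np.concatenate([a, b, c, d])

print(np.array_equal(nested, flat))  # True
```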

@BackT0TheFuture
Author

Cool, thank you so much.
I'll give it a try and report back.
