No dropout in last hidden layer? #9

Open
j6e opened this issue Nov 23, 2017 · 1 comment
Comments

@j6e

j6e commented Nov 23, 2017

I've been working with your code lately and I've noticed that in keras_mlp.py, in both models, dropout is never applied after the last hidden layer:

model = Sequential()
model.add( Dense( params['layer_1_size'], init = params['init'],
    activation = params['layer_1_activation'], input_dim = input_dim ))

for i in range( int( params['n_layers'] ) - 1 ):

    extras = 'layer_{}_extras'.format( i + 1 )

    if params[extras]['name'] == 'dropout':
        model.add( Dropout( params[extras]['rate'] ))
    elif params[extras]['name'] == 'batchnorm':
        model.add( BatchNorm())

    model.add( Dense( params['layer_{}_size'.format( i + 2 )], init = params['init'],
        activation = params['layer_{}_activation'.format( i + 2 )]))

model.add( Dense( 1, init = params['init'], activation = 'linear' ))

As the code shows, the last hidden layer can never have dropout, because each layer's extras are added before the layer itself. With n_layers = 2, for example, the stack becomes Dense → Dropout → Dense → Dense(output), so layer_2_extras is never applied. Is this intentional, or is it undesired behaviour?

@zygmuntz
Owner

It is probably a bug, yeah. Keras docs show dropout applied before the last layer:
https://keras.io/getting-started/sequential-model-guide/
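
One way to fix it would be to apply the last hidden layer's extras just before the output layer. Here is a minimal, untested sketch, assuming params defines a layer_{}_extras entry for every hidden layer (as the loop already expects) and that BatchNorm is the repo's alias for Keras' BatchNormalization:

# Keras 1.x-style imports; BatchNorm is assumed to alias BatchNormalization
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.layers.normalization import BatchNormalization as BatchNorm

model = Sequential()
model.add( Dense( params['layer_1_size'], init = params['init'],
    activation = params['layer_1_activation'], input_dim = input_dim ))

for i in range( int( params['n_layers'] ) - 1 ):

    extras = 'layer_{}_extras'.format( i + 1 )

    if params[extras]['name'] == 'dropout':
        model.add( Dropout( params[extras]['rate'] ))
    elif params[extras]['name'] == 'batchnorm':
        model.add( BatchNorm())

    model.add( Dense( params['layer_{}_size'.format( i + 2 )], init = params['init'],
        activation = params['layer_{}_activation'.format( i + 2 )]))

# new: apply the last hidden layer's extras before the output layer,
# so it can get dropout (or batchnorm) like the other hidden layers
extras = 'layer_{}_extras'.format( int( params['n_layers'] ))
if params[extras]['name'] == 'dropout':
    model.add( Dropout( params[extras]['rate'] ))
elif params[extras]['name'] == 'batchnorm':
    model.add( BatchNorm())

model.add( Dense( 1, init = params['init'], activation = 'linear' ))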
