
Dual Path Networks in Keras

Dual Path Networks are highly efficient networks which combine the strengths of both ResNeXt (Aggregated Residual Transformations for Deep Neural Networks) and DenseNet (Densely Connected Convolutional Networks).

Note: Weights have not been ported over yet.

Dual Path Connections
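Informally, each block's output is split along the channel axis: one part is added to a residual path (as in ResNeXt), and the rest is concatenated onto a dense path (as in DenseNet). A minimal sketch of that merge step, assuming a channels-last backend (the function and variable names here are illustrative, not the repository's API):

from keras.layers import Lambda, add, concatenate

def dual_path_merge(block_output, residual_path, dense_path, filters):
    # The first `filters` channels feed the residual (addition) path,
    # the remaining channels feed the dense (concatenation) path.
    res_part = Lambda(lambda z: z[:, :, :, :filters])(block_output)
    dense_part = Lambda(lambda z: z[:, :, :, filters:])(block_output)
    residual_path = add([residual_path, res_part])
    dense_path = concatenate([dense_path, dense_part], axis=-1)
    return residual_path, dense_path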

Usage

Several of the standard Dual Path Network models have been included. They can be initialized as:

from dual_path_network import DPN92, DPN98, DPN107, DPN137

model = DPN92(input_shape=(224, 224, 3)) # same for the others
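For example, a minimal sketch of compiling the model and running a forward pass on a dummy batch (the optimizer and random input are illustrative):

import numpy as np
from dual_path_network import DPN92

model = DPN92(input_shape=(224, 224, 3))
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Forward pass on a random batch; the output shape should be (1, 1000)
dummy = np.random.random((1, 224, 224, 3)).astype('float32')
preds = model.predict(dummy)
print(preds.shape)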

To create a custom DualPathNetwork, use the DualPathNetwork builder method:

from dual_path_network import DualPathNetwork

model = DualPathNetwork(input_shape=(224, 224, 3),
                        initial_conv_filters=64,
                        depth=[3, 4, 20, 3],
                        filter_increment=[16, 32, 24, 128],
                        cardinality=32,
                        width=3,
                        weight_decay=0,
                        include_top=True,
                        weights=None,
                        input_tensor=None,
                        pooling=None,
                        classes=1000)
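As an illustration, a hypothetical smaller DPN for 32x32 inputs such as CIFAR-10 (the stage depths and filter increments below are made up for the example, not taken from the DPN paper):

from dual_path_network import DualPathNetwork

model = DualPathNetwork(input_shape=(32, 32, 3),
                        initial_conv_filters=64,
                        depth=[2, 2, 2, 2],                 # blocks per stage; same length as filter_increment
                        filter_increment=[16, 32, 32, 64],  # dense path growth per stage
                        cardinality=8,
                        width=2,
                        weight_decay=5e-4,                  # enable small regularization (see Support)
                        classes=10)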

Performance

Support

  • Keras does not have built-in support for grouped convolutions, so I had to use Lambda layers to match the ResNeXt paper implementation (see the first sketch after this list). When grouped convolution support is added, I hope to use it here as well.
  • Mean-Max Global Pooling is supported, using a Lambda layer to scale the sum of the average and max pools (see the second sketch after this list).
  • depth and filter_increment must both be lists for now, and must be the same length. I may add support for plain integers later, but list support is far more useful, so I may not implement it.
  • Weight decay support is included, but disabled by default. The DPN paper does not mention it, but the ResNet, WRN, and ResNeXt papers may all use small weight regularization. Use a small value such as 1e-4 or 5e-4 if you wish to enable it.
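For reference, a minimal sketch of emulating grouped convolution with Lambda layers, as mentioned above: the channels are sliced into cardinality groups, each group is convolved independently, and the results are concatenated. A channels-last backend is assumed and the function name is illustrative:

from keras.layers import Lambda, Conv2D, concatenate

def grouped_convolution(x, grouped_channels, cardinality, strides=1):
    # Slice out group i's channels, convolve them independently,
    # then concatenate all group outputs along the channel axis.
    group_list = []
    for i in range(cardinality):
        group = Lambda(lambda z, i=i: z[:, :, :, i * grouped_channels:(i + 1) * grouped_channels])(x)
        group = Conv2D(grouped_channels, (3, 3), padding='same',
                       strides=(strides, strides), use_bias=False)(group)
        group_list.append(group)
    return concatenate(group_list, axis=-1)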
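Similarly, a sketch of the Mean-Max Global Pooling idea: the global average and global max poolings are summed, and a Lambda layer halves the sum so the result is their mean (again, the function name is illustrative):

from keras.layers import GlobalAveragePooling2D, GlobalMaxPooling2D, add, Lambda

def mean_max_global_pool(x):
    # Compute 0.5 * (global average pool + global max pool)
    avg = GlobalAveragePooling2D()(x)
    mx = GlobalMaxPooling2D()(x)
    return Lambda(lambda z: 0.5 * z)(add([avg, mx]))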