Reimplementing SOTA convolution variants with TensorFlow 2.0.

JinLi711/Convolution_Variants


Convolution Variants

This repository replicates various convolution layers from SOTA papers.

This repository currently includes:

Attention Augmented Convolution Layer

AA Convolution Diagram

Notes

  • This implementation does not yet include relative positional encodings.
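The core idea of attention-augmented convolution is to concatenate, along the channel axis, the output of a standard convolution with the output of multi-head self-attention over all spatial positions. The NumPy sketch below is purely illustrative (not the repository's AAConv code): random projections stand in for the learned 1x1 convolutions, HWC layout is used for simplicity, and, matching the note above, it uses plain attention without relative positional encodings. The parameter names mirror the AAConv constructor shown in the Usage section.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def aa_conv_sketch(x, channels_out, depth_k, depth_v, num_heads):
    """x: (H, W, C). Returns (H, W, channels_out)."""
    H, W, C = x.shape
    dkh, dvh = depth_k // num_heads, depth_v // num_heads
    flat = x.reshape(H * W, C)                  # flatten spatial positions
    # hypothetical random projections stand in for learned 1x1 convolutions
    Wq = rng.normal(size=(C, depth_k))
    Wk = rng.normal(size=(C, depth_k))
    Wv = rng.normal(size=(C, depth_v))
    heads = []
    for h in range(num_heads):
        q = flat @ Wq[:, h * dkh:(h + 1) * dkh] / np.sqrt(dkh)  # scaled queries
        k = flat @ Wk[:, h * dkh:(h + 1) * dkh]
        v = flat @ Wv[:, h * dvh:(h + 1) * dvh]
        heads.append(softmax(q @ k.T) @ v)      # attention output, (HW, dvh)
    attn = np.concatenate(heads, axis=-1).reshape(H, W, depth_v)
    # stand-in for the convolution branch (channels_out - depth_v filters)
    conv = rng.normal(size=(H, W, channels_out - depth_v))
    return np.concatenate([conv, attn], axis=-1)

out = aa_conv_sketch(rng.normal(size=(8, 8, 16)), 32, depth_k=8, depth_v=8, num_heads=4)
print(out.shape)  # (8, 8, 32)
```

Note how the attention branch contributes depth_v of the channels_out output channels, which is why depth_v must be smaller than channels_out.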

Mixed Depthwise Convolution Layer

Mix Conv Diagram

Notes

  • This implementation combines depthwise convolution with pointwise convolution. The original implementation only used depthwise convolutions.
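MixConv splits the input channels into groups and applies a depthwise convolution with a different kernel size to each group, then concatenates the results. The NumPy sketch below illustrates that grouping idea, with a box filter standing in for the learned depthwise kernels and an identity matrix standing in for the pointwise (1x1) mixing step that, per the note above, this repository adds; it is not the repository's code.

```python
import numpy as np

def depthwise_conv_same(x, k):
    """Naive per-channel 'same' convolution; x: (H, W, C), k: odd kernel size."""
    H, W, C = x.shape
    p = k // 2
    xp = np.pad(x, ((p, p), (p, p), (0, 0)))
    kern = np.ones((k, k)) / (k * k)         # box filter stands in for learned weights
    out = np.empty_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[i, j, c] = (xp[i:i + k, j:j + k, c] * kern).sum()
    return out

def mixconv_sketch(x, kernel_sizes=(3, 5, 7)):
    """Split channels into groups; each group gets a different kernel size."""
    groups = np.array_split(x, len(kernel_sizes), axis=-1)
    mixed = np.concatenate(
        [depthwise_conv_same(g, k) for g, k in zip(groups, kernel_sizes)], axis=-1)
    # this repository's variant follows with a pointwise (1x1) conv to mix channels;
    # an identity matrix stands in for the learned 1x1 weights here
    Wp = np.eye(mixed.shape[-1])
    return mixed @ Wp

x = np.random.default_rng(0).normal(size=(8, 8, 6))
out = mixconv_sketch(x)
print(out.shape)  # (8, 8, 6)
```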

Drop Block

Drop Block
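DropBlock is a structured form of dropout for convolutional features: instead of zeroing independent activations, it zeroes contiguous square regions, forcing the network to rely on more distributed evidence. The following NumPy sketch of the mechanism (not the repository's implementation) uses the standard seed probability gamma so that the expected dropped fraction is roughly drop_prob.

```python
import numpy as np

def dropblock_sketch(x, block_size=3, drop_prob=0.1, rng=None):
    """x: (H, W) feature map; zeroes contiguous block_size x block_size regions."""
    if rng is None:
        rng = np.random.default_rng(0)
    H, W = x.shape
    # seed probability gamma chosen so the expected dropped area is ~ drop_prob
    gamma = (drop_prob * H * W) / (
        block_size ** 2 * (H - block_size + 1) * (W - block_size + 1))
    mask = np.ones((H, W))
    half = block_size // 2
    # sample block centres in the valid region and zero out whole blocks
    for i in range(half, H - half):
        for j in range(half, W - half):
            if rng.random() < gamma:
                mask[i - half:i + half + 1, j - half:j + half + 1] = 0.0
    # rescale surviving activations to preserve the expected magnitude
    return x * mask * mask.size / max(mask.sum(), 1)

y = dropblock_sketch(np.ones((8, 8)))
print(y.shape)  # (8, 8)
```

At inference time the layer is a no-op; like dropout, the masking is applied only during training.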

Efficient Channel Attention Layer

ECA
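ECA produces a per-channel gate from globally pooled features, but replaces the squeeze-and-excitation bottleneck MLP with a single 1-D convolution across neighbouring channels, which keeps the parameter count tiny. A framework-agnostic NumPy sketch of that idea (an averaging kernel stands in for the learned 1-D weights; not the repository's code):

```python
import numpy as np

def eca_sketch(x, k=3):
    """x: (H, W, C). Channel attention via a 1-D conv over pooled channels."""
    s = x.mean(axis=(0, 1))                  # global average pool -> (C,)
    w = np.ones(k) / k                       # stand-in for the learned 1-D kernel
    # cross-channel interaction: 'same' 1-D convolution over the channel axis
    s = np.convolve(np.pad(s, k // 2, mode='edge'), w, mode='valid')
    gate = 1.0 / (1.0 + np.exp(-s))          # sigmoid gate, one value per channel
    return x * gate                          # rescale each channel

x = np.random.default_rng(0).normal(size=(8, 8, 16))
out = eca_sketch(x)
print(out.shape)  # (8, 8, 16)
```

The kernel size k controls how many neighbouring channels interact when forming each gate.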

Convolutional Block Attention Module Layer

CBAM
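CBAM applies two attention steps in sequence: a channel gate computed from average- and max-pooled descriptors passed through a shared MLP, then a spatial gate computed from channel-wise pooled maps. The NumPy sketch below is illustrative only: random matrices stand in for the learned MLP weights, and averaging the two pooled maps plus a 7x7 box filter stands in for CBAM's learned concat-then-conv spatial step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cbam_sketch(x, reduction=4, rng=None):
    """x: (H, W, C). Channel attention followed by spatial attention."""
    if rng is None:
        rng = np.random.default_rng(0)
    H, W, C = x.shape
    # shared MLP weights (hypothetical random stand-ins for learned ones)
    W1 = rng.normal(size=(C, C // reduction))
    W2 = rng.normal(size=(C // reduction, C))
    avg, mx = x.mean(axis=(0, 1)), x.max(axis=(0, 1))
    ch_gate = sigmoid(np.maximum(avg @ W1, 0) @ W2 + np.maximum(mx @ W1, 0) @ W2)
    x = x * ch_gate                                    # channel attention
    # spatial attention: pool across channels, then a 7x7 filter
    pooled = (x.mean(axis=-1) + x.max(axis=-1)) / 2    # stand-in for concat + conv
    padded = np.pad(pooled, 3, mode='edge')
    sp = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            sp[i, j] = padded[i:i + 7, j:j + 7].mean() # box filter stand-in
    return x * sigmoid(sp)[..., None]                  # spatial attention

x = np.random.default_rng(0).normal(size=(8, 8, 16))
out = cbam_sketch(x)
print(out.shape)  # (8, 8, 16)
```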

Usage

Here is an example of how to use one of the layers:

import tensorflow as tf
from convVariants import AAConv

aaConv = AAConv(
    channels_out=32,
    kernel_size=3,
    depth_k=8, 
    depth_v=8, 
    num_heads=4)

The layer can be used like any other tf.keras.layers.Layer subclass.

model = tf.keras.models.Sequential([
    aaConv,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
    ])

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'])

model.fit(x_train, y_train, epochs=5)

Tests

Test cases are located in tests.py.

To run tests:

cd Convolution_Variants
python tests.py

Requirements

  • TensorFlow 2.0.0 with GPU support

Caveats

  • These layers have only been tested with the NCHW (channels-first) input format.

Acknowledgements

The implementations are based on the original papers introducing each layer.
