jonasvdd/TDNN

Fast TDNN layer implementation

This is an alternative implementation of the TDNN layer proposed by Waibel et al. [1]. The main difference from other implementations is that it exploits the PyTorch Conv1d dilation argument, making it many times faster than other popular implementations such as SiddGururani's PyTorch-TDNN.
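To illustrate the idea, here is a minimal sketch (an assumption for illustration, not this repository's actual code) of how an evenly spaced, symmetric context maps onto Conv1d's dilation argument: the context [-2, 0, 2] becomes a kernel of size 3 with dilation 2. The class name `DilatedTDNN` is hypothetical.

```python
import torch
import torch.nn as nn


class DilatedTDNN(nn.Module):
    """Hypothetical sketch: a TDNN layer built on Conv1d dilation."""

    def __init__(self, input_channels, output_channels, context):
        super().__init__()
        # For an evenly spaced, symmetric context such as [-2, 0, 2],
        # the spacing between taps is the dilation factor.
        dilation = context[1] - context[0]
        self.conv = nn.Conv1d(input_channels, output_channels,
                              kernel_size=len(context), dilation=dilation)

    def forward(self, x):
        # x has shape [batch, input_channels, sequence_length]
        return self.conv(x)


layer = DilatedTDNN(input_channels=24, output_channels=512, context=[-2, 0, 2])
out = layer(torch.randn(8, 24, 100))
# output length = 100 - dilation * (kernel_size - 1) = 100 - 2 * 2 = 96
```

Because the context is handled by a single strided convolution kernel rather than by gathering and concatenating frames in Python, the whole layer runs as one cuDNN/oneDNN-backed Conv1d call.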

Usage

# Import the TDNN class from this repository, then create a TDNN layer
layer_context = [-2, 0, 2]
input_n_feat = previous_layer_n_feat  # number of features produced by the previous layer
tdnn_layer = TDNN(context=layer_context, input_channels=input_n_feat, output_channels=512, full_context=False)

# Run a forward pass; batch.shape = [BATCH_SIZE, INPUT_CHANNELS, SEQUENCE_LENGTH]
out = tdnn_layer(batch)

References

[1] A. Waibel, T. Hanazawa, G. Hinton, K. Shikano, and K. J. Lang, “Phoneme Recognition Using Time-Delay Neural Networks,” IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 37, no. 3, pp. 328–339, 1989.

About

PyTorch implementation of a Time Delay Neural Network (TDNN)
