
Endpoint latency of RNN transducer measurement #5433

Answered by b-flo
rxpwang asked this question in Q&A

Hi,

Sorry for the delay!

> However, I've noticed that the RNN transducer inference process is not currently implemented in a streaming fashion within the espnet2/bin/asr_inference.py script.

Not sure which version you're using, but for streaming you should look at asr_inference_streaming.py or asr_transducer_inference.py. There are two Transducer versions in ESPnet; see the tutorial doc.
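
For illustration, here is a minimal sketch of the chunk-wise decoding pattern, using the `Speech2TextStreaming` class from `espnet2/bin/asr_inference_streaming.py` and its `__call__(speech, is_final=...)` interface. For the Transducer specifically, `asr_transducer_inference.py` is the relevant entry point and its class exposes a different API (e.g. a dedicated streaming decode method), so treat the constructor arguments, paths, and return format below as assumptions to check against the version you run.

```python
import soundfile as sf

from espnet2.bin.asr_inference_streaming import Speech2TextStreaming

# Placeholder paths: point these at your own streaming ASR experiment.
speech2text = Speech2TextStreaming(
    asr_train_config="exp/asr_train/config.yaml",
    asr_model_file="exp/asr_train/valid.acc.ave.pth",
    beam_size=5,
)

speech, rate = sf.read("sample.wav")  # assumed 16 kHz mono audio
chunk = int(rate * 0.5)               # feed ~0.5 s of audio per step

# Simulated streaming: hand over successive chunks, flag the last one.
results = None
for start in range(0, len(speech), chunk):
    block = speech[start:start + chunk]
    is_final = start + chunk >= len(speech)
    results = speech2text(speech=block, is_final=is_final)

if results:
    text = results[0][0]  # best hypothesis text (format may differ by version)
    print(text)
```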

> If I intend to measure the endpoint latency of RNN transducer operations during streaming inference (where "endpoint latency" is defined as the time between the completion of speech and the end of the entire inference process), how can I accurately measure or estimate this value?

Hum, I'm not entirely sure we can use th…
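
As a rough illustration (building on the sketch above, not an official ESPnet utility), one simple way to estimate that gap in a simulated streaming run is to timestamp the moment the last audio chunk becomes available and the moment the final call returns its hypotheses; their difference is a wall-clock proxy for endpoint latency, assuming earlier chunks are processed faster than real time. The `speech2text`, `speech`, and `chunk` names are reused from the previous sketch.

```python
import time

# "Completion of speech" is taken as the moment the last chunk is available;
# "end of inference" is when the final call returns its hypotheses.
t_speech_done = None
for start in range(0, len(speech), chunk):
    block = speech[start:start + chunk]
    is_final = start + chunk >= len(speech)
    if is_final:
        t_speech_done = time.perf_counter()  # all speech has been delivered
    results = speech2text(speech=block, is_final=is_final)

t_decode_done = time.perf_counter()          # entire inference has ended
print(f"endpoint latency: {(t_decode_done - t_speech_done) * 1000:.1f} ms")
```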

Category: Q&A · Labels: RNNT (RNN) transducer related issue · 3 participants