add submodule

lyuxiang.lx
2024-07-04 21:40:58 +08:00
parent 076829ab84
commit 3910efd6d3
6 changed files with 100 additions and 150 deletions


@@ -152,7 +152,7 @@ class MultiHeadedAttention(nn.Module):
                 4.If the different position in decoder see different block
                 of the encoder, such as Mocha, the passed in mask could be
                 in (#batch, L, T) shape. But there is no such case in current
-                Wenet.
+                CosyVoice.
             cache (torch.Tensor): Cache tensor (1, head, cache_t, d_k * 2),
                 where `cache_t == chunk_size * num_decoding_left_chunks`
                 and `head * d_k == size`
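The renamed line sits in a docstring that also documents the attention cache layout: a tensor of shape (1, head, cache_t, d_k * 2) holding cached keys and values for chunked streaming decoding. Below is a minimal PyTorch sketch of how a cache with that layout is typically split apart and grown per chunk. It illustrates the documented shape only, not CosyVoice's actual forward pass; the variable names follow the docstring, and the split/append pattern is an assumption.

import torch

# Dimensions follow the docstring: head * d_k == size, and
# cache_t == chunk_size * num_decoding_left_chunks.
head, d_k = 4, 64
chunk_size, num_decoding_left_chunks = 16, 2
cache_t = chunk_size * num_decoding_left_chunks

# The cache packs keys and values along the last dimension, hence d_k * 2.
cache = torch.zeros(1, head, cache_t, d_k * 2)

# Recover the key and value halves.
key_cache, value_cache = torch.split(cache, d_k, dim=-1)
assert key_cache.shape == (1, head, cache_t, d_k)

# Keys/values projected from the current chunk are appended along the
# time axis (dim=2), and the pair is re-packed as the next cache.
k_new = torch.randn(1, head, chunk_size, d_k)
v_new = torch.randn(1, head, chunk_size, d_k)
k = torch.cat([key_cache, k_new], dim=2)
v = torch.cat([value_cache, v_new], dim=2)
new_cache = torch.cat((k, v), dim=-1)
assert new_cache.shape == (1, head, cache_t + chunk_size, d_k * 2)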