Is there a problem with the Viterbi decode? #72

Open
tymanman opened this issue Dec 15, 2021 · 0 comments

Comments

@tymanman

for feat in feats:
    # Broadcast the previous step's scores across rows: entry [i, j] is the
    # score of being in tag j at the previous step plus the transition j -> i.
    next_tag_var = (
        forward_var.view(1, -1).expand(self.tagset_size, self.tagset_size)
        + self.transitions
    )
    # Best previous tag for each current tag i (max over the previous-tag axis).
    _, bptrs_t = torch.max(next_tag_var, dim=1)
    viterbivars_t = next_tag_var[range(len(bptrs_t)), bptrs_t]
    # The emission score for the current step is added only after the max.
    forward_var = viterbivars_t + feat
    backscores.append(forward_var)
    backpointers.append(bptrs_t)

Why is the max taken here before adding the current step's emission score? Shouldn't the current step's emission score be added together with the transition scores first, and only then the max taken?
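
For reference, a minimal standalone sketch (not the project's code; the tag-set size and scores below are made up) that compares the two orderings. Since the emission score of the current tag does not depend on the previous tag, it shifts every candidate in the max by the same amount, so the backpointers and forward scores should come out identical either way.

import torch

torch.manual_seed(0)
tagset_size = 4
forward_var = torch.randn(tagset_size)                # scores of previous tags
transitions = torch.randn(tagset_size, tagset_size)   # transitions[i][j]: j -> i
feat = torch.randn(tagset_size)                       # emission scores, current step

# Variant A: max first, then add the emission score (as in the snippet above).
scores = forward_var.view(1, -1).expand(tagset_size, tagset_size) + transitions
max_a, bptrs_a = torch.max(scores, dim=1)
forward_a = max_a + feat

# Variant B: add the emission score before taking the max.
scores_b = scores + feat.view(-1, 1)                  # feat[i] is constant along dim=1
max_b, bptrs_b = torch.max(scores_b, dim=1)
forward_b = max_b

print(torch.equal(bptrs_a, bptrs_b))                  # True: argmax over j is unchanged
print(torch.allclose(forward_a, forward_b))           # True: forward scores match too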
