
Define a way to improve NER installation through adding new cases and retraining it #1307

danielkornev opened this issue Aug 27, 2020 · 2 comments


danielkornev commented Aug 27, 2020


What problem are we trying to solve?:

In production systems, it is important to have a simple and non-obtrusive way to extend NER capabilities to recognize new objects or to fix recognition of existing ones.

How can we solve it?:

Web interface for adding new cases
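For context, a minimal sketch of the model such an interface would wrap, assuming DeepPavlov's documented Python API and one of its shipped NER configs:

```python
# Load a pretrained DeepPavlov NER model and annotate one sentence.
# download=True fetches the pretrained weights on first use.
from deeppavlov import build_model, configs

ner = build_model(configs.ner.ner_ontonotes_bert, download=True)

# The model returns a batch of token lists and a batch of BIO tag lists.
tokens, tags = ner(["DeepPavlov was developed at MIPT in Moscow."])
for token, tag in zip(tokens[0], tags[0]):
    print(token, tag)
```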

Are there other issues that block this solution?:



acriptis commented Sep 2, 2020

I suppose a good NER service should cover the following use cases:

  1. Annotate requests with NER entities (as any NER tool must do) via a REST API.
  2. Allow Admins to view a log of annotation requests and the produced results.
  3. Allow Admins to add new training cases to the dataset (which then fine-tunes the NER model, either instantly or on demand via a button).

It seems to be a quite simple web app; a sketch of such an app follows below.
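A hypothetical sketch of that app using Flask. The endpoint names and on-disk paths (`annotation_log.jsonl`, `new_cases.jsonl`) are illustrative assumptions, not anything DeepPavlov ships:

```python
import json

from flask import Flask, jsonify, request

from deeppavlov import build_model, configs

app = Flask(__name__)
ner = build_model(configs.ner.ner_ontonotes_bert, download=True)

@app.route("/annotate", methods=["POST"])
def annotate():
    # Use case 1: annotate a request with NER entities via REST.
    text = request.json["text"]
    tokens, tags = ner([text])
    result = {"tokens": tokens[0], "tags": tags[0]}
    # Use case 2: log the request and its result for admin review.
    with open("annotation_log.jsonl", "a") as log:
        log.write(json.dumps({"text": text, **result}) + "\n")
    return jsonify(result)

@app.route("/add_case", methods=["POST"])
def add_case():
    # Use case 3: store an admin-corrected example for later fine-tuning.
    with open("new_cases.jsonl", "a") as dataset:
        dataset.write(json.dumps(request.json) + "\n")
    return jsonify({"status": "queued for retraining"})

if __name__ == "__main__":
    app.run(port=5000)
```

For use case 1 alone, DeepPavlov already ships a REST server (`python -m deeppavlov riseapi ner_ontonotes_bert`); the sketch above mainly adds the logging and case-collection pieces on top.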


Rajmehta123 commented Nov 13, 2020

I agree. The 3rd point seems the most interesting. But we can't expect the model to fine-tune in real time. Do you mean transfer learning, i.e. transferring the weights of the existing trained model and fine-tuning it on the new data?
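For reference, a hedged sketch of how that batched (not real-time) fine-tuning might look with DeepPavlov's `train_model`. Overriding `dataset_reader.data_path` so it points at a folder containing the original data plus the new cases is an assumption about the config layout, and warm-starting from the downloaded weights depends on the config's `load_path`:

```python
# Retrain the NER model on a dataset extended with the new cases.
# parse_config resolves a shipped config into a plain dict we can edit.
from deeppavlov import configs, train_model
from deeppavlov.core.commands.utils import parse_config

config = parse_config(configs.ner.ner_ontonotes_bert)
# Assumption: this folder holds CoNLL-style train/valid/test files
# combining the original data with the newly collected cases.
config["dataset_reader"]["data_path"] = "./data_with_new_cases/"

# If the pretrained weights are already on disk (load_path points at
# them), training resumes from those weights rather than from scratch,
# which is the transfer-learning behavior discussed above.
model = train_model(config)
```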
