This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

[website] Add a demo page for the Website #1180

Open · wants to merge 14 commits into base: v0.x

Conversation

chenw23 (Member) commented Mar 3, 2020

Description

According to issue #688, a demo page is wanted on the GluonNLP website, so I am working on creating a demo web page for it. This pull request will be continuously updated until the website is ready to be released.

Checklist

Essentials

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this change is a backward incompatible change, why must this change be made.
  • Interesting edge cases to note here

cc @dmlc/gluon-nlp-team

chenw23 requested a review from a team as a code owner on March 3, 2020 08:02
codecov bot commented Mar 3, 2020

Codecov Report

Merging #1180 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##           master    #1180   +/-   ##
=======================================
  Coverage   87.42%   87.42%           
=======================================
  Files          81       81           
  Lines        7346     7346           
=======================================
  Hits         6422     6422           
  Misses        924      924           

mli (Member) commented Mar 4, 2020

Job PR-1180/4 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/4/index.html

leezu (Contributor) commented Mar 5, 2020


code | 503
-- | --
type | "InternalServerException"
message | "Prediction failed"

docs/demo.rst: 3 outdated review comments (resolved)
mli (Member) commented Mar 8, 2020

Job PR-1180/5 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/5/index.html

Introduction
------------

Sentiment Analysis predicts whether an input is positive or negative. The model is based on BERT base and is trained on the binary classification setting of the Stanford Sentiment Treebank. It achieves about 87% and 93.4% accuracy on the test set.
Review comment on the excerpt above (Member):

Sorry, it should be "It achieves about 93.4% accuracy on the dev set."

mli (Member) commented Mar 8, 2020

Job PR-1180/9 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/9/index.html

leezu (Contributor) commented Mar 9, 2020


code | 503
-- | --
type | "InternalServerException"
message | "Prediction failed"

This may have been due to invalid input?

Currently the server just times out when submitting ["test"] as a query.

chenw23 (Member, Author) commented Mar 10, 2020


> code | 503
> -- | --
> type | "InternalServerException"
> message | "Prediction failed"
>
> This may have been due to invalid input?
>
> Currently the server just times out when submitting ["test"] as a query.

I think it is due to invalid input. You may try out the default example I am now providing in the placeholder - it should work. On the server side, I currently follow the same model you provided in #1140.
I think the input arguments follow the same requirements you defined for that model. Maybe in the future we can provide a more intuitive API or give more hints about which rules to follow when submitting queries.
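
For reference, a sketch of the request shape the endpoint appears to expect (a JSON array of sentences); the host, port, and path below are placeholders for illustration, not the real endpoint:

```python
# Sketch only: the endpoint URL is a placeholder, not the actual demo server.
import json
import requests

ENDPOINT = "http://demo-host:8888/sentiment_analysis/predict"  # hypothetical

# The server expects a JSON array of sentences, e.g. ["some sentence"].
payload = ["GluonNLP is a wonderful toolkit.", "This movie was a waste of time."]
resp = requests.post(
    ENDPOINT,
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
    timeout=10,
)
print(resp.status_code, resp.text)
```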

leezu (Contributor) commented Mar 10, 2020

> You may try out the default example I am now providing in the placeholder - it should work.

Still times out for me. How long do you wait for a response?

chenw23 (Member, Author) commented Mar 10, 2020

> You may try out the default example I am now providing in the placeholder - it should work.
>
> Still times out for me. How long do you wait for a response?

Are you on the Amazon corporate VPN? The VPN blocks port 8888, so it will not work. I get a response within one second.

leezu (Contributor) commented Mar 11, 2020

Yes, this was due to VPN.

mli (Member) commented Mar 12, 2020

Job PR-1180/10 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/10/index.html

chenw23 (Member, Author) commented Mar 12, 2020

Please help review whether the UI looks fine.
The JavaScript interaction for these controls is still under development and testing.

mli (Member) commented Mar 12, 2020

Job PR-1180/12 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/12/index.html

mli (Member) commented Mar 17, 2020

Job PR-1180/13 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/13/index.html

chenw23 (Member, Author) commented Mar 17, 2020

[Help wanted] I want to use an AJAX request to send the form data to the server and display the response on the same page without refreshing it, but I am blocked by the browser's cross-origin (CORS) policy. It seems that handling this requires changes on the server side, yet the server running MMS is not a typical PHP/Java EE server, so I am not familiar with how to modify its response headers.

I am not sure this is the best strategy for solving the problem...
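
One possible workaround, without modifying MMS itself, is to put a tiny proxy in front of the prediction endpoint that adds the CORS headers. A minimal Flask sketch (the MMS URL, port, and path are assumptions based on this thread, not a confirmed configuration):

```python
# Sketch: a CORS-adding proxy in front of the MMS prediction endpoint.
# The upstream URL below is a placeholder, not the actual server.
from flask import Flask, Response, request
import requests

app = Flask(__name__)
MMS_URL = "http://localhost:8888/sentiment_analysis/predict"  # hypothetical

@app.route("/predict", methods=["POST", "OPTIONS"])
def predict():
    if request.method == "OPTIONS":  # CORS preflight request from the browser
        resp = Response()
    else:
        upstream = requests.post(MMS_URL, data=request.get_data(),
                                 headers={"Content-Type": "application/json"})
        resp = Response(upstream.content, status=upstream.status_code,
                        content_type="application/json")
    resp.headers["Access-Control-Allow-Origin"] = "*"
    resp.headers["Access-Control-Allow-Headers"] = "Content-Type"
    return resp

if __name__ == "__main__":
    app.run(port=5000)
```

The demo page's AJAX call would then target the proxy instead of the MMS port directly.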


Please input the following into the text box:

["Positive sentiment", "Negative sentiment"]
Review comment on the excerpt above (Member):

We should just show one sentence here. Without quotes. Without brackets.

So that the result can be displayed directly from the same page
mli (Member) commented Mar 18, 2020

Job PR-1180/14 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/14/index.html

mli (Member) commented Mar 18, 2020

Job PR-1180/15 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/15/index.html

chenw23 (Member, Author) commented Apr 3, 2020

> I tried the demo and did not see any response. Did it work for you?

Working now!

@eric-haibin-lin @leezu

mli (Member) commented Apr 17, 2020

Job PR-1180/21 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/21/index.html

chenw23 (Member, Author) commented Apr 18, 2020

The cpu-unittest CI job has been failing constantly (replays 14 through 20 all failed).
Would you please have a look at it? Thanks!
@leezu

mli (Member) commented Apr 24, 2020

Job PR-1180/22 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1180/22/index.html

chenw23 (Member, Author) commented Apr 24, 2020

This pull request is ready for review and merge. Thanks!
@eric-haibin-lin

Demo
----

You can either input a sentence into the textbox or select one sample from the select control.
Review comment on the excerpt above (Contributor):

Currently we're giving an existence proof that it's possible to have a web application that does sentiment analysis. But to be really useful to end users, we need to teach them how to build it, not only show that it's possible.

To keep things simple, such instructions should be based on https://aws.amazon.com/serverless/ and https://aws.amazon.com/sagemaker/

sxjscience (Member):

In fact, can we just provide a Colab/SageMaker Notebook?

szha (Member) commented May 19, 2020

We should definitely provide Colab/SageMaker links for all tutorials. A demo is different in that it showcases the model for a certain task without any involvement of code. A tutorial for setting up serving/inference is indeed appealing to end users; we should definitely pursue that after this PR by summarizing the steps.

sxjscience (Member):

I think we can also provide demos with Colab + SageMaker, e.g., as in https://magenta.tensorflow.org/demos/colab/. That way we won't need to write any HTML code (or only a few lines of HTML).

leezu (Contributor) commented May 19, 2020

@sxjscience @szha I'm not sure that a demo without any instructions on how to reproduce it (and how to deploy it) is very helpful, unless the demo meets a very high bar. For a simple demo such as the current one, we can make it special by showcasing how easy it is to deploy on AWS.

chenw23 (Member, Author) commented May 20, 2020

I am planning to add more description of the configuration and then add a Colab notebook. Do you think that is OK?

szha (Member) commented May 22, 2020

@leezu users who just want to try our pre-trained models will benefit from this setup. These are people who don't concern themselves with understanding the code yet, so mixing the concerns would create noise for this need.

leezu (Contributor) commented May 22, 2020

@szha it doesn't need to be mixed. The proposal is to have an impressive demo and to provide information on how to reproduce it, showing the benefits of the technical stack. Not everyone has to read the latter part.

szha (Member) commented May 23, 2020

Sounds good, @leezu. Do you have suggestions on how to improve the demo? Considering the investment, maybe the colab as @sxjscience suggested would indeed be the most efficient option.

Please continue the discussion and conclude the path forward.

chenw23 (Member, Author) commented May 23, 2020

Actually, our gpu-doc CI build is failing. @leezu, could you have a look at what is preventing gpu-doc from being built? You may reference some recent builds such as #1228. Then I can make changes to the pages in this pull request and we can view the results.

chenw23 (Member, Author) commented Jun 18, 2020

Since Colab requires a Google account, should I use my personal Google account to create the Colab notebook? Should we create a repo/directory specifically for Colab, or just host a static file in someone's Google Drive?

sxjscience (Member) commented Jun 18, 2020

@StrayBird-ATSH I think we can do the same thing as d2l: automatically/manually uploading the notebooks to another repo called https://github.com/d2l-ai/d2l-en-colab. To access specific notebooks, we can add the https://colab.research.google.com/github prefix to the link, like this:
https://colab.research.google.com/github/d2l-ai/d2l-en-colab/blob/master/chapter_preliminaries/ndarray.ipynb. @szha @eric-haibin-lin What do you think?

chenw23 (Member, Author) commented Jun 19, 2020

That sounds great! Do you have any ideas about where to place this repo? Or just give a folder in the current repo?

chenw23 (Member, Author) commented Jun 21, 2020

> That sounds great! Do you have any ideas about where to place this repo? Or just give a folder in the current repo?

@eric-haibin-lin @szha

sxjscience (Member):

@szha Would you create a repo in DMLC?

szha (Member) commented Jun 22, 2020

@sxjscience @StrayBird-ATSH I created https://github.com/dmlc/gluon-nlp-notebooks

How do we automate the deployment to that repo?

sxjscience (Member):

@szha I'm currently not sure about the details. We need to ask the D2L people about their existing solution.
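
In the meantime, one possible shape for the automation (just a sketch; the notebooks/ build directory, the target branch, and the availability of push credentials in CI are all assumptions):

```python
# Sketch: push freshly built notebooks to the companion repo from CI.
# Paths, the branch name, and the credential setup are assumptions.
import os
import shutil
import subprocess
import tempfile

NOTEBOOK_DIR = "notebooks"  # hypothetical output directory of the docs build
TARGET_REPO = "https://github.com/dmlc/gluon-nlp-notebooks.git"

def run(cmd, cwd=None):
    subprocess.check_call(cmd, cwd=cwd)

workdir = tempfile.mkdtemp()
run(["git", "clone", "--depth", "1", TARGET_REPO, workdir])

# Copy every built notebook into the clone, preserving relative paths.
for root, _, files in os.walk(NOTEBOOK_DIR):
    for name in files:
        if name.endswith(".ipynb"):
            src = os.path.join(root, name)
            dst = os.path.join(workdir, os.path.relpath(src, NOTEBOOK_DIR))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)

run(["git", "add", "-A"], cwd=workdir)
run(["git", "commit", "--allow-empty", "-m", "Update notebooks"], cwd=workdir)
run(["git", "push", "origin", "master"], cwd=workdir)
```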

szha (Member) commented Jun 22, 2020

Hi @astonzhang, how does the notebook deployment to Colab work in d2l?

chenw23 (Member, Author) commented Jun 29, 2020

> @sxjscience @StrayBird-ATSH I created https://github.com/dmlc/gluon-nlp-notebooks
>
> How do we automate the deployment to that repo?

@szha Currently this repo is not initialized. Would you mind initializing it first (I think just adding a README will be enough) so that we can start contributing to it?

Thanks!

szha (Member) commented Jun 29, 2020

@StrayBird-ATSH done

chenw23 (Member, Author) commented Jun 29, 2020

Wonderful

chenw23 (Member, Author) commented Jul 11, 2020

@leezu Since I am currently using your trained model on this website and we are planning to create a Colab notebook, could you provide Python scripts equivalent to your trained model, so that users can run the scripts in Colab and get the results?
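
In the meantime, a rough sketch of what such a script could look like with the GluonNLP 0.x API (the fine-tuned classifier weights are hypothetical; only the pretrained BERT-base backbone comes from the model zoo):

```python
# Sketch of a Colab-style sentiment inference script. The fine-tuned head
# parameters ('sst_head.params') are a placeholder and would need to be provided.
import mxnet as mx
import gluonnlp as nlp

ctx = mx.cpu()

# Pretrained BERT-base backbone from the GluonNLP model zoo.
bert, vocab = nlp.model.get_model(
    'bert_12_768_12',
    dataset_name='book_corpus_wiki_en_uncased',
    pretrained=True, ctx=ctx,
    use_pooler=True, use_decoder=False, use_classifier=False)

# Minimal two-class head on top of the pooled [CLS] representation.
head = mx.gluon.nn.Dense(2)
head.initialize(ctx=ctx)
# head.load_parameters('sst_head.params', ctx=ctx)  # hypothetical fine-tuned weights

tokenizer = nlp.data.BERTTokenizer(vocab, lower=True)
transform = nlp.data.BERTSentenceTransform(tokenizer, max_seq_length=128, pair=False)

def predict(sentence):
    token_ids, valid_length, segment_ids = transform((sentence,))
    words = mx.nd.array([token_ids], ctx=ctx)
    segments = mx.nd.array([segment_ids], ctx=ctx)
    length = mx.nd.array([valid_length], ctx=ctx).astype('float32')
    _, pooled = bert(words, segments, length)
    logits = head(pooled)
    return 'positive' if int(logits.argmax(axis=1).asscalar()) == 1 else 'negative'

print(predict('GluonNLP makes it easy to build a sentiment analysis demo.'))
```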

szha changed the base branch from master to v0.x on August 13, 2020 02:16