
Comment-Toxicity-Analysis

Index

  • Dataset
  • Initial Stage
    • About Files
    • Fault
  • Final Engine
  • How to Configure Final Engine
  • Improvement Scope
  • Helpful References

Dataset

Initial Stage

  • This initial model is a baseline for comparison, built without preprocessing or ensembling techniques.
  • It analyses a comment and reports its toxicity.
    • If the comment is toxic, it returns Toxic; otherwise it returns Non-Toxic.


About Files

  • ToxityAnalysis.ipynb -> builds the model
  • finalized_model.pkl -> saved ML model
  • RunModel.py -> reloads the model and analyses a comment
  • app.js -> runs the Node.js app
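The reload-and-predict step that RunModel.py performs with finalized_model.pkl can be sketched as below. The stand-in pipeline, toy training data, and labels are assumptions for illustration, not the repository's actual model:

```python
import pickle
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in for the model built in ToxityAnalysis.ipynb (toy data, illustrative only)
pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipeline.fit(
    ["you are an idiot", "have a great day", "stupid fool", "thanks for the help"],
    ["Toxic", "Non-Toxic", "Toxic", "Non-Toxic"],
)

# Save and reload, mirroring what RunModel.py does with finalized_model.pkl
with open("finalized_model.pkl", "wb") as f:
    pickle.dump(pipeline, f)
with open("finalized_model.pkl", "rb") as f:
    model = pickle.load(f)

print(model.predict(["you are an idiot"])[0])
```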

Fault

  • It is slow because the Python script is reloaded every time a comment is submitted.
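A minimal sketch of why this is costly: every submission pays at least a fresh interpreter start-up (the real cost also includes re-importing libraries and unpickling the model):

```python
import subprocess
import sys
import time

# Time a bare interpreter launch -- the fixed overhead the initial
# stage pays on every single comment submission.
start = time.perf_counter()
subprocess.run([sys.executable, "-c", "pass"], check=True)
elapsed = time.perf_counter() - start
print(f"interpreter start-up alone: {elapsed:.3f}s per request")
```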

Final Engine

How to Configure Final Engine

  • First, download the code as a zip from here or clone it through git.

  • Then go to the engine folder.

  • Open the Engine Notebook on Kaggle (linked above) and download all the .pkl files into the Models folder.

  • Then go to the terminal and run the app.py file.

  • Then open your web browser and go to http://localhost:5000/.

    git clone https://github.com/ckshitij/Comment-Toxicity-Analysis.git
    cd Comment-Toxicity-Analysis
    cd engine
    python3 app.py
    
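A hypothetical minimal sketch of an app.py like the one in the engine folder: unlike the initial stage, the model lives in memory and is reused for every request. The route name, form field, and inline stand-in model (trained on toy data instead of unpickling from Models/) are all assumptions:

```python
from flask import Flask, jsonify, request
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in model trained inline; the real engine would instead
# unpickle the .pkl files from the Models folder here, once, at start-up.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(["you idiot", "nice work", "shut up fool", "well done"],
          ["Toxic", "Non-Toxic", "Toxic", "Non-Toxic"])

app = Flask(__name__)

@app.route("/predict", methods=["POST"])  # route name is an assumption
def predict():
    comment = request.form.get("comment", "")
    return jsonify({"label": model.predict([comment])[0]})

if __name__ == "__main__":
    app.run(port=5000)
```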

Improvement Scope

  • You can improve this code by increasing the n-gram range to (1, 3) or (1, 4) at the word level and (1, 10) at the character level.
  • You can use a deep-learning RNN (LSTM).
  • You can also use a word-level CNN model.
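The n-gram suggestion can be sketched with scikit-learn's FeatureUnion combined with a soft-voting ensemble, which matches the averaging-of-classifiers idea described in this project. The classifier choices, the smaller char range (kept at (1, 5) for speed), and the toy data are assumptions:

```python
from sklearn.ensemble import VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import FeatureUnion, make_pipeline

# Word-level (1,3) plus char-level n-grams; ranges are illustrative.
features = FeatureUnion([
    ("word", TfidfVectorizer(analyzer="word", ngram_range=(1, 3))),
    ("char", TfidfVectorizer(analyzer="char", ngram_range=(1, 5))),
])

# Soft voting averages the classifiers' predicted probabilities.
ensemble = VotingClassifier(
    [("lr", LogisticRegression()), ("nb", MultinomialNB())],
    voting="soft",
)

clf = make_pipeline(features, ensemble)
clf.fit(["you idiot", "nice work", "stupid fool", "thank you"],
        ["Toxic", "Non-Toxic", "Toxic", "Non-Toxic"])
print(clf.predict(["what a fool"])[0])
```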

Helpful References

About

Online comment toxicity analysis that averages multiple classifiers and uses both character-level and word-level n-grams.
