
ToxiCheck

A Tool to Detect Cyberbullying and Check the Toxicity of Comments on Various Websites

About

ToxiCheck is a Google Chrome extension that primarily targets software developer websites such as GitHub (issues, comments, and pull request messages) and Gmail threads, detects cyberbullying on them, and provides toxicity reports on comments. ToxiCheck also helps the user avoid toxic language by suggesting gentler alternatives as they type.

Demo Video

Technical Details

Models used -

  1. Toxic-BERT [1]
  2. BART-base-detox [1] [2]
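Toxic-BERT scores could be obtained through the hosted Hugging Face Inference API. The sketch below is illustrative only: the endpoint URL, response shape, and 0.5 threshold are assumptions based on the public Inference API, not code taken from this repository's app.js.

```javascript
// Hedged sketch: query the Hugging Face Inference API for unitary/toxic-bert.
// Endpoint and response shape are assumptions from the public API docs.
const HF_MODEL_URL =
  "https://api-inference.huggingface.co/models/unitary/toxic-bert";

// Decide whether a comment should be flagged, given the label/score pairs
// the API returns, e.g. [{ label: "toxic", score: 0.97 }, ...].
function isToxic(predictions, threshold = 0.5) {
  return predictions.some((p) => p.label === "toxic" && p.score >= threshold);
}

// Network call (requires a Hugging Face token; names here are illustrative).
async function scoreComment(text, token) {
  const res = await fetch(HF_MODEL_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ inputs: text }),
  });
  const data = await res.json(); // [[{ label, score }, ...]]
  return isToxic(data[0]);
}
```

Keeping the threshold check in a small pure function (`isToxic`) separates the flagging decision from the network call, which makes it easy to tune or test in isolation.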

UI -

  1. Vanilla JS
  2. Chart.js
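A toxicity chart in Chart.js boils down to a config object built from per-label scores. The helper below is a sketch, not the extension's actual code; the function name and data shape are assumptions, and Chart.js itself would render the config onto a popup `<canvas>`.

```javascript
// Hedged sketch: build a Chart.js bar-chart config from per-label toxicity
// scores, e.g. [{ label: "toxic", score: 0.97 }, ...]. Illustrative only.
function buildToxicityChartConfig(scores) {
  return {
    type: "bar",
    data: {
      labels: scores.map((s) => s.label),
      datasets: [
        {
          label: "Toxicity score",
          data: scores.map((s) => s.score),
        },
      ],
    },
    // Scores are probabilities, so pin the y-axis to [0, 1].
    options: {
      scales: { y: { min: 0, max: 1 } },
    },
  };
}

// Popup usage (requires Chart.js and a <canvas id="chart"> in popup.html):
// new Chart(document.getElementById("chart"), buildToxicityChartConfig(scores));
```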

Flow Diagrams

Working of the Bullying Classifier (Text Blurring Mechanism)

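The blurring step could be sketched as follows. The selector, helper name, and 0.5 threshold are illustrative assumptions, not read from the extension's source:

```javascript
// Hedged sketch of the text-blurring mechanism: comments whose toxicity
// score crosses a threshold get a CSS blur. Threshold is an assumption.
function blurStyleFor(score, threshold = 0.5) {
  // Returns the inline CSS filter to apply, or "" to leave the text visible.
  return score >= threshold ? "blur(5px)" : "";
}

// Content-script usage (DOM side, illustrative; scoreFromApi is a
// hypothetical helper wrapping the classifier call):
// document.querySelectorAll(".comment-body").forEach(async (el) => {
//   const score = await scoreFromApi(el.innerText);
//   el.style.filter = blurStyleFor(score);
// });
```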

Working of the Autosuggestor (Suggestion Mechanism)

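BART-base-detox is a text-to-text model, so a hosted inference call would return generated text rather than label scores. The sketch below is a guess at that flow; the model path and response shape are assumptions, not this repository's code.

```javascript
// Hedged sketch of the suggestion mechanism. The model id and the
// [{ generated_text: "..." }] response shape are assumptions.
const DETOX_URL =
  "https://api-inference.huggingface.co/models/s-nlp/bart-base-detox";

// Pull the detoxified suggestion out of the API response, falling back to
// the user's original text if the response is empty or malformed.
function extractSuggestion(response, originalText) {
  if (Array.isArray(response) && response[0] && response[0].generated_text) {
    return response[0].generated_text;
  }
  return originalText;
}

// async function suggest(text, token) {
//   const res = await fetch(DETOX_URL, { /* as in the classifier sketch */ });
//   return extractSuggestion(await res.json(), text);
// }
```

Falling back to the original text keeps the typing experience intact when the API is slow or returns an error.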

Major Features

Toxicity Chart


Autosuggestor feature


Features for GitHub

Toxicity charts for GitHub

The Autosuggestor for GitHub

Instructions

  1. Clone the repository.
  2. Enter the Toxicheck directory and install the dependencies:
$ npm install
  3. Replace the bearer token inside the app.js file with your Hugging Face (write-enabled) token.
  4. In the Chrome browser, open
chrome://extensions/
  5. Click the Load Unpacked option, browse to the Toxicheck folder, and select it.
  6. Enable/reload the extension.
  7. Navigate to the websites you wish to analyze.
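Load Unpacked expects the selected folder to contain a manifest.json. A minimal Manifest V3 sketch of what such a file could look like for this extension is shown below; every field here is illustrative, and the repository's actual manifest may differ.

```json
{
  "manifest_version": 3,
  "name": "ToxiCheck",
  "version": "1.0",
  "action": { "default_popup": "popup.html" },
  "content_scripts": [
    {
      "matches": ["https://github.com/*"],
      "js": ["app.js"]
    }
  ],
  "permissions": ["storage"]
}
```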
