This repository contains a simple Flask API for detecting NSFW images, using the NSFW detection model referenced below.
Follow our guide for setting up Vrooli repos; no extra steps are required. The minimum resource requirements are:
- Memory: 2GB
- CPUs: 1
Send a POST request to `http://localhost:<PORT_NSFW>` if testing locally, or `https://<your-domain>` if testing on a Virtual Private Server (VPS). The request body must have the following structure:
```jsonc
{
  "key": "<your-api-key>", // Only required if the API_KEY environment variable is set
  "images": [{
    "buffer": "<base64-encoded-image>",
    "hash": "<unique-hash-for-image>" // If provided, used to skip recomputing results when the same image is sent again
  }]
}
```
As a curl request, it looks like this:
```bash
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"key": "<your-api-key>", "images": [{"buffer": "<base64-encoded-image>", "hash": "<unique-hash-for-image>"}]}' \
  http://localhost:<PORT_NSFW>
```
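For reference, here is a minimal Python client sketch that builds the same request. The URL, key, and filename are placeholders, and using a sha256 digest as the `hash` is just one convenient way to produce a unique identifier, not something the API requires.

```python
# Minimal client sketch, assuming the `requests` library is installed.
# URL, key, and filename are placeholders -- substitute your own values.
import base64
import hashlib

import requests

API_URL = "http://localhost:5000"  # replace 5000 with your <PORT_NSFW>
API_KEY = "<your-api-key>"         # omit "key" below if API_KEY is unset on the server

with open("photo.jpg", "rb") as f:
    raw = f.read()

payload = {
    "key": API_KEY,
    "images": [{
        # The API expects the raw image bytes as a base64 string.
        "buffer": base64.b64encode(raw).decode("ascii"),
        # Any stable unique string works as the hash; hashing the bytes
        # means resending the same image can reuse cached results.
        "hash": hashlib.sha256(raw).hexdigest(),
    }],
}

response = requests.post(API_URL, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```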
If you are authenticated, it will return an object of this shape:
```jsonc
{
  "predictions": {
    "<unique-hash-for-image>": {
      "drawings": 0.12345678,
      "hentai": 2.12121212,
      "neutral": 69.42000000,
      "porn": 1.88888888,
      "sexy": 8.76543210
    }
  }
}
```
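Since the example values suggest each category is scored as a percentage, you might consume the response like this (continuing the client sketch above; the percentage interpretation is an assumption, not something the API documents):

```python
# Pick the highest-scoring category for each image.
predictions = response.json()["predictions"]
for image_hash, scores in predictions.items():
    top_category = max(scores, key=scores.get)  # category with the largest score
    print(f"{image_hash}: {top_category} ({scores[top_category]:.2f}%)")
```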
The model is taken from the NSFW detection model's releases page. All you need to do is:
- Download the `mobilenet*.zip` from the latest release
- Unzip it
- Copy the `.h5` file for the model (not the weights!) into this repo's `models` folder
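As a quick sanity check that the copied file is the model (and loads at all), you can try opening it with Keras. This is a hedged sketch: it assumes TensorFlow is installed and that the model loads without custom objects, and the filename is a placeholder for whichever `.h5` file the release contains.

```python
# Sanity-check sketch: verify the copied .h5 model file loads.
# Assumes TensorFlow is installed; replace the placeholder filename.
from tensorflow.keras.models import load_model

model = load_model("models/<model-file>.h5")
model.summary()  # prints the layer structure if loading succeeded
```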
Contributions are always welcome! If you have suggestions for improvements, please create an issue or a pull request. 💖