# DistilRoberta for toxic comment detection
See my GitHub repo [toxic-comment-server](https://github.com/jpcorb20/toxic-comment-server).
The model was fine-tuned from [DistilRoberta](https://huggingface.co/distilroberta-base) on the [Kaggle Toxic Comments](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge) dataset with the `BCEWithLogitsLoss` objective for multi-label prediction. Therefore, apply a sigmoid activation to the logits to obtain per-label probabilities; do not use a softmax over the labels (as the HF widget does), since the labels are not mutually exclusive.
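As a minimal sketch of the multi-label decoding step, the snippet below applies a per-label sigmoid to example logits (the logit values and the 0.5 threshold are illustrative assumptions, not outputs of the actual model; with PyTorch you would use `torch.sigmoid` on the model's logits instead):

```python
import math

# The six labels from the Jigsaw Toxic Comments challenge.
LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

def sigmoid(x: float) -> float:
    """Map a single logit to an independent probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, threshold=0.5):
    """Multi-label decoding: each label is thresholded independently.

    Unlike softmax, the resulting probabilities need not sum to 1,
    so a comment can carry several labels (or none).
    """
    probs = [sigmoid(z) for z in logits]
    return [lab for lab, p in zip(LABELS, probs) if p > threshold]

# Illustrative logits, e.g. as returned for one comment.
example_logits = [2.0, -3.0, 1.5, -4.0, 0.8, -2.5]
print(predict_labels(example_logits))  # ['toxic', 'obscene', 'insult']
```

A softmax would force the six probabilities to compete and sum to 1, which is wrong here: a single comment can be both `toxic` and `insult` at the same time.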