
zayuki/computer_generated_fake_review_detection

This model is a fine-tuned version of distilbert-base-uncased on a dataset of computer-generated fake reviews and genuine Amazon reviews. It achieves the following results on the evaluation set:

  • Train Loss: 0.0116
  • Validation Loss: 0.0731
  • Train Accuracy: 0.9780
  • Train F1: 0.9781
  • Epoch: 2

Model description

This model is a fine-tuned version of DistilBERT trained on an Amazon review dataset comprising computer-generated fake reviews and genuine Amazon reviews. The fake reviews were generated by GPT-2, an AI text-generation model.

The model is trained to detect computer-generated fake reviews, i.e. reviews produced by an AI text-generation model rather than written by a person.
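A minimal inference sketch, assuming the model is loaded as a TensorFlow sequence-classification checkpoint from the Hub (the `classify` helper and the softmax conversion are illustrative, not part of the released card; which class index means "fake" depends on the checkpoint's label mapping):

```python
import numpy as np

def softmax(logits):
    # Convert raw model logits to a probability distribution.
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

def classify(text, model, tokenizer):
    # Tokenize and run a forward pass; the model returns logits over
    # the two classes (genuine vs. computer-generated).
    inputs = tokenizer(text, return_tensors="tf", truncation=True)
    logits = model(**inputs).logits.numpy()[0]
    probs = softmax(logits)
    return int(np.argmax(probs)), probs

# Loading the fine-tuned weights (requires network access):
# from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
# repo = "zayuki/computer_generated_fake_review_detection"
# tokenizer = AutoTokenizer.from_pretrained(repo)
# model = TFAutoModelForSequenceClassification.from_pretrained(repo)
# label, probs = classify("This product changed my life!!!", model, tokenizer)
```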

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 7580, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
  • training_precision: float32
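With `power: 1.0` and `cycle: False`, the `PolynomialDecay` schedule above is a linear decay from 2e-05 to 0.0 over 7580 steps, clamped at the end. A minimal sketch of the learning-rate computation (a plain-Python rendering of that config, not the Keras implementation itself):

```python
def polynomial_decay(step, initial_lr=2e-05, end_lr=0.0,
                     decay_steps=7580, power=1.0):
    # Decay from initial_lr to end_lr over decay_steps; with power=1.0
    # this is linear. cycle=False means the rate stays at end_lr
    # once decay_steps is reached.
    step = min(step, decay_steps)
    frac = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * frac ** power + end_lr
```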

Training results

| Train Loss | Validation Loss | Train Accuracy | Train F1 | Epoch |
|:----------:|:---------------:|:--------------:|:--------:|:-----:|
| 0.1207     | 0.0677          | 0.9723         | 0.9726   | 0     |
| 0.0343     | 0.0736          | 0.9753         | 0.9756   | 1     |
| 0.0116     | 0.0731          | 0.9780         | 0.9781   | 2     |
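For reference, the accuracy and F1 columns above follow the standard binary definitions. A self-contained sketch (the `accuracy_and_f1` helper is illustrative and assumes label 1 marks the computer-generated class; the card does not state which metric library was used):

```python
def accuracy_and_f1(y_true, y_pred):
    # Binary accuracy and F1, with label 1 as the positive
    # (computer-generated) class.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, f1
```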

Framework versions

  • Transformers 4.33.1
  • TensorFlow 2.12.0
  • Datasets 2.14.5
  • Tokenizers 0.13.3
