# Dummy Model for Lab4

This model is a fine-tuned version of bert-base-uncased on the SST-2 dataset.

Results on the evaluation set:

- Accuracy: 0.64

This model was fine-tuned for personal research use, on 100 randomly selected training examples and 100 randomly selected evaluation examples from the SST-2 dataset.
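The training run itself is not shown in this card. Below is a minimal sketch of how such a run could be set up with `transformers` and `datasets`; the model name, dataset, subset sizes, and epoch count come from the card, while the shuffle seeds, output directory, and tokenization details are assumptions:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Load SST-2 and keep only small random subsets (subset sizes from the card;
# the seeds used here are placeholders, not the ones reported below).
raw = load_dataset("glue", "sst2")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True)

train_ds = raw["train"].shuffle(seed=42).select(range(100)).map(tokenize, batched=True)
eval_ds = raw["validation"].shuffle(seed=42).select(range(100)).map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# num_train_epochs matches the card; learning rate and batch sizes are
# unset there, so the library defaults apply.
args = TrainingArguments(output_dir="dummy-model-lab4", num_train_epochs=3.0)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_ds, eval_dataset=eval_ds,
                  tokenizer=tokenizer)
trainer.train()
```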


# Evaluation

```python
import numpy as np
import evaluate

# Run inference on the evaluation split (Resrt_eval is the tokenized
# 100-example SST-2 evaluation subset).
predictions = trainer.predict(Resrt_eval)
print(predictions.predictions.shape, predictions.label_ids.shape)
preds = np.argmax(predictions.predictions, axis=-1)

# Score the predictions with the GLUE SST-2 metric (accuracy).
metric = evaluate.load("glue", "sst2")
metric.compute(predictions=preds, references=predictions.label_ids)
```
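For reference, the GLUE SST-2 metric reduces to plain accuracy; a dependency-free equivalent (the function name here is illustrative, not part of the `evaluate` API):

```python
def accuracy(preds, labels):
    """Fraction of predictions that match the reference labels."""
    assert len(preds) == len(labels)
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# Example: 3 correct out of 4 predictions.
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```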

# Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: unset
- train_batch_size: unset
- eval_batch_size: unset
- seed of training dataset: 49282927487
- seed of evaluation dataset: 492829487
- lr_scheduler_type: linear
- num_epochs: 3.0
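The two dataset seeds above drive the random 100-example selection. A stdlib sketch of reproducible index sampling (the exact sampling procedure is an assumption; the split sizes 67349 and 872 are SST-2's train and validation sizes):

```python
import random

def pick_indices(seed, population, k=100):
    """Reproducibly sample k distinct indices from range(population)."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(population), k))

train_idx = pick_indices(49282927487, 67349)  # SST-2 train split size
eval_idx = pick_indices(492829487, 872)       # SST-2 validation split size
print(len(train_idx), len(eval_idx))  # 100 100
```

Fixing the seed makes the same subset recoverable in later runs, which is why the card records the seeds instead of the index lists.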

# Training results
![image/png](https://cdn-uploads.huggingface.co/production/uploads/65e812f328ebe0129dd9a2b4/mxsW8uXzCJrVbmamFFfqy.png)