# Dummy Model for Lab4

This model is a fine-tuned version of bert-base-uncased on the SST-2 dataset.

Results on the evaluation set:

Accuracy: 0.64

This model was fine-tuned for personal research use, with 100 randomly selected training examples and 100 evaluation examples from the SST-2 dataset.
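The 100-example subsets can be reproduced by sampling with the fixed seeds listed under the training hyperparameters below. A minimal sketch of that pattern (the `subsample` helper and the in-memory pools are illustrative stand-ins, not the actual SST-2 loading code):

```python
import random

def subsample(pool, k, seed):
    """Deterministically draw k items from pool using a fixed seed."""
    return random.Random(seed).sample(pool, k)

# Hypothetical stand-ins for the SST-2 splits; the real workflow
# samples from the loaded dataset instead of integer pools.
train_pool = list(range(1000))
eval_pool = list(range(1000))

train_subset = subsample(train_pool, 100, seed=49282927487)
eval_subset = subsample(eval_pool, 100, seed=492829487)
print(len(train_subset), len(eval_subset))  # 100 100
```

Because the seed is fixed, rerunning the sampling yields the same subset every time.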


# Evaluation

```python
import evaluate
import numpy as np

# Run inference on the evaluation split (`Resrt_eval`) with the trained Trainer.
predictions = trainer.predict(Resrt_eval)
print(predictions.predictions.shape, predictions.label_ids.shape)

# Convert logits to class predictions.
preds = np.argmax(predictions.predictions, axis=-1)

# Score with the GLUE SST-2 metric (accuracy).
metric = evaluate.load("glue", "sst2")
metric.compute(predictions=preds, references=predictions.label_ids)
```
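The `np.argmax(..., axis=-1)` step maps each row of logits to the index of the highest score, i.e. the predicted class. A tiny self-contained example with made-up logits for the two SST-2 classes:

```python
import numpy as np

# Made-up logit rows: [negative, positive] scores for two examples.
logits = np.array([[0.2, -0.1],
                   [-1.3, 0.8]])
preds = np.argmax(logits, axis=-1)
print(preds)  # [0 1]
```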

# Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: unset
- train_batch_size: unset
- eval_batch_size: unset
- seed of training dataset: 49282927487
- seed of evaluation dataset: 492829487
- lr_scheduler_type: linear
- num_epochs: 3.0
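Since learning_rate and the batch sizes were left unset, training fell back to the `transformers` defaults. A hedged sketch of a `TrainingArguments` configuration matching only the values recorded above (the output directory name is an assumption, not taken from this card):

```python
from transformers import TrainingArguments

# Sketch only: learning_rate / train_batch_size / eval_batch_size are omitted,
# so the library defaults (5e-5 and per-device batch size 8 at the time of
# writing) apply.
args = TrainingArguments(
    output_dir="dummy-model-lab4",  # assumed name, not from the card
    lr_scheduler_type="linear",     # from the card
    num_train_epochs=3.0,           # from the card
)
```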
# Training results

| Epoch | Training Loss | Validation Loss | Accuracy |
|------:|:-------------:|----------------:|---------:|
| 1     | No log        | 0.674658        | 0.480000 |
| 2     | No log        | 0.640980        | 0.600000 |
| 3     | No log        | 0.640266        | 0.640000 |