import torch
import torch.nn as nn
import os
from transformers import AutoModel


class Model(nn.Module):

    def __init__(self, model_dir, dropout=0.2, hidden_dim=768):
        """
        Initialize the model.
        :param model_name: the name of the model
        :param metric_names: the names of the metrics to use
        :param dropout: the dropout rate
        :param hidden_dim: the hidden dimension of the model
        """
        super(Model, self).__init__()
        self.metric_names = ['Happiness', 'Sadness', 'Anger', 'Disgust', 'Fear', 'Pride', 'Valence', 'Arousal']
        self.bert = AutoModel.from_pretrained(model_dir)

        # One rating head per metric: a hidden projection ('l_1_<name>')
        # followed by a single-unit output layer ('<name>').
        for name in self.metric_names:
            setattr(self, name, nn.Linear(hidden_dim, 1))
            setattr(self, 'l_1_' + name, nn.Linear(hidden_dim, hidden_dim))

        self.layer_norm = nn.LayerNorm(hidden_dim)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(dropout)
        self.sigmoid = nn.Sigmoid()

    def forward(self, input_id, mask):
        """
        Forward pass of the model.
        :param args: the inputs
        :return: the outputs
        """
        _, x = self.bert(input_ids = input_id, attention_mask=mask, return_dict=False)
        output = self.rate_embedding(x)
        return output

    def rate_embedding(self, x):
        # For each metric: a residual hidden layer (layer norm, dropout, ReLU)
        # followed by a sigmoid-activated scalar rating.
        output_ratings = []
        for name in self.metric_names:
            first_layer = self.relu(self.dropout(self.layer_norm(getattr(self, 'l_1_' + name)(x) + x)))
            second_layer = self.sigmoid(getattr(self, name)(first_layer))
            output_ratings.append(second_layer)

        return output_ratings

    def save_pretrained(self, save_directory):
        # Save the encoder (config + weights) plus the full model state dict,
        # which also contains the rating heads.
        self.bert.save_pretrained(save_directory)
        torch.save(self.state_dict(), f'{save_directory}/pytorch_model.bin')

    @classmethod
    def from_pretrained(cls, model_dir, dropout=0.2, hidden_dim=768):
        if not os.path.isdir(model_dir):
            raise ValueError(f"The provided model directory {model_dir} is not a valid directory.")

        model_path = os.path.join(model_dir, 'pytorch_model.bin')
        if not os.path.isfile(model_path):
            raise FileNotFoundError(f"The model file pytorch_model.bin was not found in the directory {model_dir}.")

        config_path = os.path.join(model_dir, 'config.json')
        if not os.path.isfile(config_path):
            raise FileNotFoundError(f"The configuration file config.json was not found in the directory {model_dir}.")

        # Rebuild the architecture from the saved encoder, then restore all weights.
        model = cls(model_dir, dropout, hidden_dim)
        state_dict = torch.load(model_path, map_location=torch.device('cpu'))
        model.load_state_dict(state_dict)
        return model
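

# --- Minimal usage sketch (illustrative, not part of the original module) ---
# Assumes `model_dir` points to a directory produced by `Model.save_pretrained`
# (encoder config.json and weights plus pytorch_model.bin) and that a matching
# tokenizer is available in the same directory; the path below is a placeholder.
#
# from transformers import AutoTokenizer
#
# model_dir = 'path/to/saved_model'
# tokenizer = AutoTokenizer.from_pretrained(model_dir)
# model = Model.from_pretrained(model_dir)
# model.eval()
#
# batch = tokenizer(['I am so happy today!'], padding=True, truncation=True,
#                   return_tensors='pt')
# with torch.no_grad():
#     ratings = model(batch['input_ids'], batch['attention_mask'])
#
# # `ratings` is a list of eight (batch_size, 1) tensors in [0, 1],
# # ordered as in model.metric_names.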