---
pipeline_tag: text-classification
widget:
- text: >-
    NairiSoft is looking for a highly qualified person with deep knowledge and
    practical experience in Java programming. The selected candidate will be
    involved in all stages of the development life cycle.
  example_title: Current Position Requirements 1
- text: >-
    Ogma Applications is seeking motivated Senior Developers to work on its
    worldwide projects. The projects are web applications utilizing latest
    technologies in video webcasting over internet for web browsers, Televisions
    and telephone systems. In order to succeed in this team, the incumbent must
    have the passion and energy to work in an entrepreneurial, and fast paced
    environment. In addition, the Senior Software Engineer must be an
    experienced senior architect and technical leader with in-depth knowledge of
    software development processes. As a senior member of the team in Armenia,
    Senior Software Engineer will be working closely with other developers and
    peers in the US and other teams around the globe, to analyze, design,
    develop, test and deliver the best in class software.
  example_title: Current Position Requirements 2
- text: >-
    Armeconombank OJSC is looking for a .Net Developer to join its team. The
    Software Developer will take part in design and development projects.
  example_title: Current Position Requirements 3
language:
- en
tags:
- albert
- text-classification
- recommendation
- job
- albert-base-v2
- IT
---


This repository contains an ALBERT model fine-tuned for text classification. The architecture is based on ALBERT Base v2 (`albert-base-v2`).

# Installation

```bash
pip install transformers
pip install sentencepiece
```

# Example
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# AutoModelForSequenceClassification loads the classification head;
# plain AutoModel would return only the base ALBERT encoder.
tokenizer = AutoTokenizer.from_pretrained('Apizhai/Albert-IT-JobRecommendation', use_fast=False)
model = AutoModelForSequenceClassification.from_pretrained('Apizhai/Albert-IT-JobRecommendation')
```
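When the checkpoint is loaded with a sequence-classification head, the forward pass yields one logit per class. A minimal, library-free sketch of the usual postprocessing (softmax, then argmax); the three-class logits below are made-up illustration values, since this card does not list the label set:

```python
import math

def top_prediction(logits):
    """Map a list of per-class logits to (class_index, probability) via softmax."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best, probs[best]

# Made-up logits for a hypothetical 3-class head:
idx, prob = top_prediction([0.2, 2.5, -1.0])
print(idx, round(prob, 3))  # index of the highest-scoring class and its probability
```

In practice you would pass `model(**tokenizer(text, truncation=True, max_length=128, return_tensors="pt")).logits[0].tolist()` into a helper like this.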

# Training hyperparameters
The following hyperparameters were used during training:
- max_seq_length: 128
- max_length: 128
- train_batch_size: 4
- eval_batch_size: 32
- num_train_epochs: 10
- evaluate_during_training: False
- evaluate_during_training_steps: 100
- use_multiprocessing: False
- fp16: True
- save_steps: -1
- save_eval_checkpoints: False
- save_model_every_epoch: False
- no_cache: True
- reprocess_input_data: True
- overwrite_output_dir: True
- preprocess_inputs: False
- num_return_sequences: 1 
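Most of the argument names above (`evaluate_during_training`, `reprocess_input_data`, `save_eval_checkpoints`, ...) match the simpletransformers `ClassificationArgs` API, so the training setup can plausibly be reconstructed as follows. This is an assumption: the card does not name the training library, and a few listed values (e.g. `num_return_sequences`) belong to other task types and are omitted here.

```python
from simpletransformers.classification import ClassificationModel, ClassificationArgs

# Assumed reconstruction of the training config from the hyperparameter list above.
args = ClassificationArgs(
    max_seq_length=128,
    train_batch_size=4,
    eval_batch_size=32,
    num_train_epochs=10,
    evaluate_during_training=False,
    evaluate_during_training_steps=100,
    use_multiprocessing=False,
    fp16=True,
    save_steps=-1,
    save_eval_checkpoints=False,
    save_model_every_epoch=False,
    no_cache=True,
    reprocess_input_data=True,
    overwrite_output_dir=True,
)
model = ClassificationModel("albert", "albert-base-v2", args=args)
```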

# Score
- F1-score: 0.85574
- Macro avg: 0.84748
- Weighted avg: 0.81575