Phase-Technologies commited on
Commit
698441f
Β·
verified Β·
1 Parent(s): 3ef22fc

Update README.md

Browse files
Files changed (1) hide show
  1. README.md +78 -17
README.md CHANGED
@@ -1,25 +1,86 @@
1
- ---
2
- library_name: transformers
3
- tags:
4
- - text-classification
5
- base_model: PhaseTechnologies/RoBERTo
6
- widget:
7
- - text: I love AutoTrain
8
- ---
9
 
10
- # Model Trained Using AutoTrain
11
 
12
- - Problem type: Text Classification
13
 
14
- ## Validation Metrics
15
- loss: 0.6174320578575134
 
 
 
16
 
17
- f1: 0.7039106145251397
18
 
19
- precision: 0.5625
 
 
 
20
 
21
- recall: 0.9402985074626866
22
 
23
- auc: 0.6388059701492537
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
24
 
25
- accuracy: 0.5826771653543307
 
1
+ # RoBERT-Physics-v1-Finetuned
 
 
 
 
 
 
 
2
 
3
+ ## Model Overview πŸš€
4
 
5
+ RoBERT-Physics-v1-Finetuned is a state-of-the-art text classification model fine-tuned on physics-related corpora. Built upon the robust RoBERTa architecture, this model is designed to classify physics-related text into predefined categories with high accuracy and efficiency.
6
 
7
+ - **Model Name:** RoBERT-Physics-v1-Finetuned
8
+ - **Developer:** [Phase Technologies](https://huggingface.co/PhaseTechnologies) 🌐
9
+ - **Model Type:** Text Classification πŸ“š
10
+ - **Base Model:** RoBERTa πŸ—οΈ
11
+ - **Intended Use:** Classifying physics-related texts πŸ”¬
12
 
13
+ ## Model Details πŸ› οΈ
14
 
15
+ - **Pre-trained on:** RoBERTa
16
+ - **Fine-tuned on:** Physics Text Corpus v1.0
17
+ - **Number of Parameters:** 125M
18
+ - **Training Framework:** PyTorch ⚑
19
 
20
+ ## Performance Metrics πŸ“Š
21
 
22
+ | Metric | Score |
23
+ | --------- | ----- |
24
+ | Accuracy | 85% |
25
+ | Precision | 0.82 |
26
+ | Recall | 0.88 |
27
+ | F1 Score | 0.85 |
28
+
29
+ ## How to Use πŸ’‘
30
+
31
+ ### Installation:
32
+
33
+ To use this model, install the required dependencies:
34
+
35
+ ```bash
36
+ pip install transformers torch
37
+ ```
38
+
39
+ ### Loading the Model:
40
+
41
+ ```python
42
+ from transformers import AutoModelForSequenceClassification, AutoTokenizer
43
+
44
+ model_name = "PhaseTechnologies/RoBERT-physics-v1-finetuned"
45
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
46
+ model = AutoModelForSequenceClassification.from_pretrained(model_name)
47
+ ```
48
+
49
+ ### Running Inference:
50
+
51
+ ```python
52
+ def predict(text):
53
+ inputs = tokenizer(text, return_tensors="pt")
54
+ with torch.no_grad():
55
+ outputs = model(**inputs)
56
+ return outputs.logits
57
+
58
+ # Example Usage
59
+ sample_text = "Newton's second law states that force equals mass times acceleration."
60
+ logits = predict(sample_text)
61
+ print("Predicted Class:", torch.argmax(logits, dim=-1).item())
62
+ ```
63
+
64
+ ## Intended Use βœ…
65
+
66
+ - Educational and academic research πŸ“š
67
+ - Scientific text classification πŸ”¬
68
+ - Automated tagging in physics-related content βš›οΈ
69
+
70
+ ## Limitations ⚠️
71
+
72
+ - Not suitable for general-purpose text classification ❌
73
+ - Performance may degrade with highly technical physics terminology 🧐
74
+ - Limited understanding of out-of-domain topics 🌍
75
+
76
+ ## Ethical Considerations 🀝
77
+
78
+ - The model should be used responsibly for educational and research purposes πŸ“–
79
+ - Ensure it is not used to disseminate misinformation 🚫
80
+
81
+ ## Acknowledgments πŸ™Œ
82
+
83
+ This model is the final text classification release from Phase Technologies! πŸŽ‰ Thank you to all contributors and researchers who made this possible.
84
+
85
+ For more details, visit [Phase Technologies on Hugging Face](https://huggingface.co/PhaseTechnologies)!
86