---
base_model:
- PhaseTechnologies/RoBERTo
tags:
- roberto
language:
- en
pipeline_tag: text-classification
library_name: transformers
---
# RoBERTo-Physics-v1-Finetuned
## Model Overview
RoBERTo-Physics-v1-Finetuned is a text classification model fine-tuned on physics-related corpora. Built on the RoBERTa architecture, it classifies physics-related text into predefined categories (see the performance metrics below).
- **Model Name:** RoBERTo-Physics-v1-Finetuned
- **Developer:** [Phase Technologies](https://huggingface.co/PhaseTechnologies)
- **Model Type:** Text Classification
- **Base Model:** RoBERTa
- **Intended Use:** Classifying physics-related texts
## Model Details
- **Base Architecture:** RoBERTa
- **Fine-tuned on:** Custom physics dataset
- **Number of Parameters:** 125M (see the sanity check below)
- **Training Framework:** PyTorch
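As a quick sanity check, the parameter count can be verified after loading the model. A minimal sketch (the total includes the embedding layers and classification head, so a figure close to 125M is expected):

```python
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "PhaseTechnologies/RoBERTo-physics-v1-finetuned"
)

# Sum the element counts of every parameter tensor in the model.
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.1f}M parameters")
```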
## Performance Metrics
| Metric    | Score |
| --------- | ----- |
| Accuracy  | 0.85  |
| Precision | 0.82  |
| Recall    | 0.88  |
| F1 Score  | 0.85  |
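Metrics of this kind are computed against a held-out labeled split. For reference, a minimal sketch using scikit-learn (the labels and predictions below are hypothetical placeholders, not the evaluation data behind the table above):

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Hypothetical gold labels and model predictions, for illustration only.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1]

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("F1 Score :", f1_score(y_true, y_pred))
```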
## How to Use
### Installation:
To use this model, install the required dependencies:
```bash
pip install transformers torch
```
### Loading the Model:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model_name = "PhaseTechnologies/RoBERT-physics-v1-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
```
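Once loaded, the label set can be inspected from the model configuration. This is a general `transformers` convention; if the repository's config does not define custom names, generic ones such as `LABEL_0` are returned:

```python
# Continuing from the snippet above: inspect the class-index-to-name
# mapping and the number of output classes stored in the config.
print(model.config.id2label)
print("Number of classes:", model.config.num_labels)
```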
### Demo:
For a demo, see the [RoBERTo-physics-v1-finetuned Colab notebook](https://colab.research.google.com/drive/1BRldXlVpnUufvC7NEi-_bf_ySeQPMbP_?usp=sharing)!
### Running Inference:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

model_name = "PhaseTechnologies/RoBERTo-physics-v1-finetuned"

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

def predict(text):
    inputs = tokenizer(text, return_tensors="pt")  # Convert text to model inputs
    with torch.no_grad():  # Inference only; no gradients needed
        outputs = model(**inputs)
    return outputs.logits  # Raw, unnormalized class scores

# Example physics-related input
sample_text = "Newton's second law states that force equals mass times acceleration."
logits = predict(sample_text)
print(logits)
```
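The `predict` helper above returns raw logits. To turn them into a probability distribution and a label name, apply a softmax; this sketch assumes the repository's config defines an `id2label` mapping (otherwise `transformers` falls back to generic names such as `LABEL_0`):

```python
import torch.nn.functional as F

# Continuing from the inference snippet above: normalize the logits
# into probabilities and look up the predicted label name.
probs = F.softmax(logits, dim=-1)
pred_id = int(probs.argmax(dim=-1))
print(model.config.id2label[pred_id], float(probs[0, pred_id]))
```

Alternatively, the `pipeline` API wraps tokenization, inference, and label mapping in a single call: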
```python
from transformers import pipeline
# Load the model
classifier = pipeline("text-classification", model="PhaseTechnologies/RoBERTo-physics-v1-finetuned")
# Perform inference
text = "Newton's second law states that force equals mass times acceleration."
result = classifier(text)
print(result)
```
## Intended Use
- Educational and academic research
- Scientific text classification
- Automated tagging in physics-related content (see the batch-tagging sketch below)
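For the tagging use case, the `pipeline` shown earlier also accepts a list of texts and returns one prediction per input. A minimal sketch (the example sentences are illustrative only):

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="PhaseTechnologies/RoBERTo-physics-v1-finetuned",
)

# Tag a small batch of documents in a single call; each result is a
# dict containing the predicted label and its confidence score.
texts = [
    "The electric field inside a conductor in electrostatic equilibrium is zero.",
    "The entropy of an isolated system never decreases.",
]
for text, result in zip(texts, classifier(texts)):
    print(f"{result['label']} ({result['score']:.3f}): {text}")
```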
## Limitations
- Not suitable for general-purpose text classification
- Performance may degrade with highly technical physics terminology
- Limited understanding of out-of-domain topics
## Ethical Considerations
- The model should be used responsibly for educational and research purposes
- Ensure it is not used to disseminate misinformation
## Acknowledgments
This model is the final text classification release from Phase Technologies! Thank you to all contributors and researchers who made this possible.
For more details, visit [Phase Technologies on Hugging Face](https://huggingface.co/PhaseTechnologies)! |