|
---
license: apache-2.0
base_model: bert-base-uncased
tags:
- text-classification
- bert
- english
model-index:
- name: BERT Classification
  results: []
language:
- en
pipeline_tag: text-classification
metrics:
- accuracy
---
|
|
|
# BERT Classification |
|
|
|
## Model Overview |
|
|
|
- **Model Name**: BERT Classification |
|
- **Model Type**: Text Classification |
|
- **Developer**: Mansoor Hamidzadeh |
|
- **Framework**: Transformers |
|
- **Language**: English |
|
- **License**: Apache-2.0 |
|
|
|
## Model Description |
|
|
|
This model is a BERT (Bidirectional Encoder Representations from Transformers) model fine-tuned from `bert-base-uncased` for text classification. It assigns input text to one of four product-category labels (the mapping can also be read programmatically, as sketched after the list):
|
|
|
- **Label 1**: Household |
|
- **Label 2**: Books |
|
- **Label 3**: Clothing & Accessories |
|
- **Label 4**: Electronics |
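
If you need the class names programmatically, they are expected to live in the checkpoint's configuration; this is a minimal sketch assuming the `id2label` field is populated with the four categories above:

```python
# Read the label mapping from the model configuration (assumes id2label is set)
from transformers import AutoConfig

config = AutoConfig.from_pretrained("mansoorhamidzadeh/bert_classification")
print(config.id2label)  # e.g. {0: 'Household', 1: 'Books', ...} if populated
```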
|
|
|
## Technical Details |
|
|
|
- **Model Size**: 109M parameters |
|
- **Tensor Type**: F32 |
|
- **File Format**: Safetensors |
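
The parameter count and tensor type above can be checked locally; the snippet below is a rough verification sketch (it assumes PyTorch and `transformers` are installed):

```python
# Quick sanity check of the reported model size and dtype
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("mansoorhamidzadeh/bert_classification")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters, dtype: {next(model.parameters()).dtype}")
```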
|
|
|
## How To Use |
|
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mansoorhamidzadeh/bert_classification")

text = "Wireless over-ear headphones with active noise cancellation."
print(pipe(text))  # -> [{'label': ..., 'score': ...}]
```
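
For more control over tokenization, batching, or access to raw logits, the checkpoint can also be loaded through the lower-level `AutoModelForSequenceClassification` API. The sketch below is illustrative and assumes the configuration's `id2label` mapping is populated:

```python
# Lower-level usage: tokenize a batch, run the model, and map logits to labels
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "mansoorhamidzadeh/bert_classification"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

texts = [
    "Stainless steel cookware set with non-stick coating.",
    "A paperback edition of a classic mystery novel.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

for text, idx in zip(texts, logits.argmax(dim=-1)):
    print(text, "->", model.config.id2label[idx.item()])
```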
|
## Usage |
|
|
|
The model is intended for categorizing English product descriptions, or similar short texts, into the four categories listed above.
|
|
|
|
|
## Citation |
|
|
|
If you use this model in your research or applications, please cite it as follows: |
|
|
|
```bibtex |
|
@misc{mansoorhamidzadeh_bert_classification,
  author = {mansoorhamidzadeh},
  title = {BERT Classification},
  year = {2024},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/mansoorhamidzadeh/bert_classification}},
}
```