SentenceTransformer based on vinai/phobert-base
This is a sentence-transformers model finetuned from vinai/phobert-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: vinai/phobert-base
- Maximum Sequence Length: 128 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
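The Pooling module above averages token embeddings (mean pooling) over non-padding positions. As a minimal sketch of what that computes, the snippet below reproduces the pipeline with plain transformers; it assumes the Hub repository exposes the underlying RobertaModel weights (standard for sentence-transformers checkpoints), and the sentence-transformers API shown in the Usage section remains the recommended path.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("trongvox/Phobert-Sentence-2")
model = AutoModel.from_pretrained("trongvox/Phobert-Sentence-2")

def mean_pool(last_hidden_state, attention_mask):
    # Average token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["Banh Duc", "Ca rot nuong"], padding=True, truncation=True,
                  max_length=128, return_tensors="pt")
with torch.no_grad():
    output = model(**batch)
embeddings = mean_pool(output.last_hidden_state, batch["attention_mask"])
print(embeddings.shape)  # torch.Size([2, 768])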
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("trongvox/Phobert-Sentence-2")
# Run inference
sentences = [
'Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai ke den hang Banh Duc Nong Thanh Tung. Banh o day hap dan o do deo dai cua bot, thit nam du day va nem nem vua mieng. Khi phuc vu, mon an nong sot toa ra mui huong thom lung tu bot, hanh phi, nuoc mam. Mon banh duc o day duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang.Mon banh duc o Banh Duc Nong Thanh Tung duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang. Cach an nay hoi giong voi mon banh gio chan nuoc mam thit bam o quan pho chua Lang Son gan cho Ban Co. La mon qua an nhe nhang, vua du lung lung bung, co ve dan da nen rat nhieu nguoi them them, nho nho. Banh duc nong Ha Noi o day khong bi pha them bot dau xanh nen van giu nguyen duoc huong vi dac trung. Dac biet, phan nhan con duoc tron them mot it cu dau xao tren ngon lua lon nen giu duoc do ngot gion.THONG TIN LIEN HE:Dia chi: 112 Truong Dinh, Quan Hai Ba Trung, Ha NoiGio mo cua: 10:00 - 21:00Dia diem chat luong: 4.7/5 (14 danh gia tren Google)\n Chi duong Danh gia Google',
'Banh Duc',
'Banh bi do',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
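Building on the example above, the similarity matrix can drive a small semantic-search style ranking. The sketch below scores the two candidate dish names against the review; the query string is truncated here for brevity and stands in for the full description.

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("trongvox/Phobert-Sentence-2")
# Stand-in for the full review text used above.
description = "Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai ke den hang Banh Duc Nong Thanh Tung."
candidates = ["Banh Duc", "Banh bi do"]

# Encode query and candidates, then rank candidates by cosine similarity.
query_embedding = model.encode([description])
candidate_embeddings = model.encode(candidates)
scores = model.similarity(query_embedding, candidate_embeddings)[0]  # shape [2]
for idx in scores.argsort(descending=True).tolist():
    print(candidates[idx], float(scores[idx]))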
Training Details
Training Dataset
Unnamed Dataset
- Size: 11,347 training samples
- Columns: sentence_0 and sentence_1
- Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 |
|---|---|---|
| type | string | string |
| details | min: 70 tokens, mean: 127.61 tokens, max: 128 tokens | min: 3 tokens, mean: 7.9 tokens, max: 20 tokens |
- Samples:

| sentence_0 | sentence_1 |
|---|---|
| Nhung cu ca rot tuoi ngon duoc tam uop mot lop gia vi chua chua, ngot ngot va dem nuong chung voi toi thom lung tao nen huong vi hap dan den kho long cuong lai, vi ngot tu nhien kich thich vi giac cua nguoi thuong thuc lam day. Ban co the lam mon ca rot nuong toi nay de an cung thit nuong hay dung lam mon an kem trong bua an rat tuyet nha.Cach che bien: Ban chi can mo lo nuong o 190 do C truoc 10 phut. Trong dau tron deu 1 muong dau olive, 2 muong bo va 2 muong giam Balsamic. Ca rot cat bo phan la xanh, giu nguyen vo, rua that sach, cat lam doi. Cho ca rot vao khay nuong, xep cho deu. Toi lot vo, bao mong. Sau do ruoi hon hop dau olive da chuan bi len ca rot. Sau do cho toi bao mong len cung voi ngo tay, muoi va tieu, tron deu len. Cho khay ca rot vao lo nuong 30 phut la ca rot chin. Lay ra dia va thuong thuc. | Ca rot nuong |
| Banh chung Bo Dau la mot trong nhung mon ngon noi tieng nhat cua Thai Nguyen. Lang banh chung Bo Dau thuoc xa Co Lung, huyen Phu Luong duoc coi la noi luu giu mon banh mang tinh hoa am thuc Viet. "Banh chung luoc nuoc gieng than, thom ngon mui vi co phan troi cho", co le cau ca dao nay da tu lau tro thanh niem tu hao cua nguoi dan noi day - mot trong 5 lang lam banh chung noi tieng nhat mien Bac. Banh chung Bo Dau phai duoc lam tu gao nep nuong thom ngon Dinh Hoa, thit lon sach cua nguoi dan toc va la dong rung duoc hai tai Na Ry, Bac Kan. Voi ban tay kheo leo day dan kinh nghiem lanh nghe cho ra nhung chiec banh dep mat. Co le vi the ma huong vi banh chung Bo Dau khong the tron lan voi cac loai khac. Do la thu dac san quanh nam khong chi dip Tet moi co, da keo chan biet bao du khach tu moi mien den thuong thuc. Huong vi cua troi dat, thien nhien va con nguoi giao hoa, hoa quyen va duoc ket tinh thanh thuc qua dac san noi tieng cua manh dat Thai Nguyen - banh chung Bo Dau. | Banh chung Bo Dau |
| Mi Ramen la mot trong nhung mon an ngon nuc tieng ma nguoi Nhat rat ua chuong va tu hao. Tham chi, nguoi Nhat da mo han mot bao tang mi Ramen voi rat nhieu nhung hien vat trung bay ve lich su ra doi, phat trien cua mon an nay. Phan mi cua Ramen thuong duoc lam tu lua mi, muoi va kansui, co mau vang sam rat hap dan. Linh hon cua mon mi Ramen chac han la phan nuoc dung chu yeu duoc ham tu xuong heo hoac xuong ga trong it nhat 10 tieng tao nen vi ngon ngot, dam da. Va khi thuong thuc, ban se an kem voi thit heo thai lat mong, rong bien, trung, cha ca Nhat, ngo va bap cai de huong vi tro nen hoan hao nhat. Vay con chan chu gi ma khong ghe ngay Nha Hang Tho Tuyet de co ngay mon mi ngon kho cuong nay nao! Nha Hang Tho Tuyet da tro thanh moi ruot cua nhieu thuc khach boi gia rat phai chang, menu khong co qua nhieu mon nhu may cho khac nhung hau nhu thu mon nao cung ngon. Mon Ramen Tho Tuyet Special ngon tuyet voi chac chan ban khong the bo lo. Trong do, an tuong nhat co le chinh la phan nuoc ... | Nha Hang Tho Tuyet |
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
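With this loss, each (sentence_0, sentence_1) pair is treated as a positive and the other sentence_1 values in the batch serve as in-batch negatives. A minimal construction sketch with the parameters above, assuming the sentence-transformers 3.x API:

from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("vinai/phobert-base")  # base checkpoint before finetuning
# scale=20.0 and cosine similarity match the parameters listed above.
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)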
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
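Taken together, a run with these non-default hyperparameters could be wired up roughly as follows. This is a sketch under the sentence-transformers 3.x Trainer API, not the author's exact script; the single-pair dataset stands in for the 11,347 training pairs and output_dir is a hypothetical path.

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("vinai/phobert-base")

# Illustrative stand-in for the (sentence_0, sentence_1) training pairs.
train_dataset = Dataset.from_dict({
    "sentence_0": ["Banh chung Bo Dau la mot trong nhung mon ngon noi tieng nhat cua Thai Nguyen."],
    "sentence_1": ["Banh chung Bo Dau"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="phobert-sentence-2",  # hypothetical output path
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    multi_dataset_batch_sampler="round_robin",  # only relevant when training on multiple datasets
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()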
Training Logs
| Epoch | Step | Training Loss |
|---|---|---|
| 0.7042 | 500 | 0.9125 |
| 1.4085 | 1000 | 0.2277 |
| 2.1127 | 1500 | 0.1527 |
| 2.8169 | 2000 | 0.1009 |
| 0.7042 | 500 | 0.1098 |
| 1.4085 | 1000 | 0.0842 |
| 2.1127 | 1500 | 0.0553 |
| 2.8169 | 2000 | 0.0356 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu121
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}