Error loading model using huggingface transformers
#12 opened about 1 month ago by yuemning
Setting max model length to a reasonable number / max_pos_encodings, e.g. 8192
#11 opened 3 months ago by michaelfeil

Suggestions for setting the compress_ratios/compress_layer/cutoff_layers parameters?
5 comments · #9 opened 5 months ago by hulianxue

Finetuning script
1 comment · #8 opened 5 months ago by jsmolen

Data Discrepancies: Inconsistent and Unreasonable Evaluation Results
1 comment · #7 opened 6 months ago by AwesomeGPTs

Question about the technical report
2 comments · #6 opened 6 months ago by dingguofeng

Example of using the model with a prompt
1 comment · #5 opened 6 months ago by djstrong

How to load bge-reranker-v2.5-gemma2-lightweight on multiple GPUs
6 comments · #4 opened 6 months ago by dingguofeng

Cannot find gemma_config.py
1 comment · #2 opened 7 months ago by dunwu