FactCG for Detecting Ungrounded Hallucinations in Large Language Models

This is a fact-checking model from our work:

📃 FactCG: Enhancing Fact Checkers with Graph-Based Multi-Hop Data (NAACL 2025, GitHub Repo)

You can load our model with the following example code:

from transformers import AutoConfig, AutoModelForSequenceClassification, AutoTokenizer

model_name = "yaxili96/FactCG-DeBERTa-v3-Large"

# Two-way sequence classifier: grounded vs. ungrounded claims.
config = AutoConfig.from_pretrained(model_name, num_labels=2, finetuning_task="text-classification", cache_dir="./cache")
config.problem_type = "single_label_classification"

tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=True, cache_dir="./cache")
model = AutoModelForSequenceClassification.from_pretrained(model_name, config=config, cache_dir="./cache")

If you find the repository or FactCG helpful, please cite the following paper:

@inproceedings{lei2025factcg,
  title={FactCG: Enhancing Fact Checkers with Graph-Based Multi-Hop Data},
  author={Lei, Deren and Li, Yaxi and Li, Siyao and Hu, Mengya and Xu, Rui and Archer, Ken and Wang, Mingyu and Ching, Emily and Deng, Alex},
  booktitle={NAACL},
  year={2025}
}
