
RadBERT was continuously pre-trained on radiology reports from a BioBERT initialization.
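
Since RadBERT is a BERT-style encoder, it can be used as a feature extractor for radiology text. The following is a minimal sketch, assuming the checkpoint loads through the standard `transformers` `AutoTokenizer`/`AutoModel` interface (the example sentence is illustrative, not from the training data):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Model ID on the Hugging Face Hub (from this card).
model_id = "StanfordAIMI/RadBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Illustrative radiology-report sentence.
report = "No focal consolidation, pleural effusion, or pneumothorax."

inputs = tokenizer(report, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings from the final encoder layer.
embeddings = outputs.last_hidden_state
print(embeddings.shape)  # (batch, sequence_length, hidden_size)
```

Pooling these token embeddings (e.g. mean pooling, or taking the `[CLS]` position) yields a report-level representation suitable for downstream fine-tuning.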

Citation

@article{chambon_cook_langlotz_2022,
  title={Improved fine-tuning of in-domain transformer model for inferring COVID-19 presence in multi-institutional radiology reports},
  author={Chambon, Pierre and Cook, Tessa S. and Langlotz, Curtis P.},
  journal={Journal of Digital Imaging},
  doi={10.1007/s10278-022-00714-8},
  year={2022}
}