Error with the custom BERT config
#1 by shreyansh26 - opened
When trying the code from the README:
from transformers import AutoModelForMaskedLM, BertTokenizer, pipeline
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
mlm = AutoModelForMaskedLM.from_pretrained('mosaicml/mosaic-bert-base-seqlen-1024', trust_remote_code=True)
classifier = pipeline('fill-mask', model=mlm, tokenizer=tokenizer)
print(classifier("I [MASK] to the store yesterday."))
I get the following error:
ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.mosaicml.mosaic-bert-base-seqlen-1024.e4c6e6558dd1d88dbc0e14246e6339e394f95a3a.configuration_bert.BertConfig'>. Fix one of those so they match!
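Reading the message, it looks like the remote BertForMaskedLM declares the stock transformers BertConfig as its config_class, while AutoModelForMaskedLM resolves the repo's own configuration_bert.BertConfig via the auto_map, so the consistency check fails. One possible workaround (just a sketch, not verified against this checkpoint, and any fields defined only in the repo's custom config may not be honored) would be to pass a plain BertConfig explicitly so both classes match:

from transformers import AutoModelForMaskedLM, BertConfig, BertTokenizer, pipeline

repo = 'mosaicml/mosaic-bert-base-seqlen-1024'

# Load the stock transformers BertConfig so its class matches the config_class
# declared by the remote modeling code (the first class shown in the error above).
# Note: attributes that exist only in the repo's configuration_bert.py may be lost.
config = BertConfig.from_pretrained(repo)

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
mlm = AutoModelForMaskedLM.from_pretrained(repo, config=config, trust_remote_code=True)

classifier = pipeline('fill-mask', model=mlm, tokenizer=tokenizer)
print(classifier("I [MASK] to the store yesterday."))

Does that make sense, or is updating the config_class in the remote modeling code the intended fix?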