
How can I run it in oobabooga / text-generation-webui

#18
by victorx98 - opened

Is there any easy way to test-run it and try it out?

Thanks

We haven't tried it ourselves, but this video https://www.youtube.com/watch?v=QVVb6Md6huA walks through running the model in oobabooga. Let us know if it works, and we'd be happy to add the video link to our community section for others who have the same question!
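If you just want a quick way to test the model outside the webui, here is a minimal sketch using Hugging Face Transformers directly. The repo id `mosaicml/mpt-7b` is a placeholder assumption (substitute this model's actual id); the key point is that MPT ships custom model code, so `trust_remote_code=True` is required.

```python
# Minimal sketch (placeholder repo id, not the exact steps from the video):
# load an MPT checkpoint with Transformers and generate a short completion.
import torch
import transformers

model_name = "mosaicml/mpt-7b"  # hypothetical id -- replace with this model's repo id

# MPT uses custom modeling code, so trust_remote_code=True is required.
model = transformers.AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)

inputs = tokenizer("Here is a short test prompt:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```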

Just added that video to the Model Card, so I'll close this issue for now.

atrott changed discussion status to closed
