
Math-RoBERTa for NLP tasks in math learning environments

This model was fine-tuned from RoBERTa-large on over 3,000,000 posts and replies written by students and instructors in Algebra Nation (https://www.mathnation.com/). It can provide a strong starting point for NLP tasks in similar math learning environments.

Here is how to use it with the Hugging Face Transformers library:

```python
from transformers import RobertaTokenizer, RobertaModel

# Load the fine-tuned tokenizer and model from the Hugging Face Hub
tokenizer = RobertaTokenizer.from_pretrained('uf-aice-lab/math-roberta')
model = RobertaModel.from_pretrained('uf-aice-lab/math-roberta')

# Encode any text and run it through the model
text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)  # output.last_hidden_state holds per-token embeddings
```
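The model returns one embedding per token. A common way to obtain a single vector per post (e.g. for clustering or similarity search over forum replies) is to mean-pool the token embeddings while ignoring padding. The `mean_pool` helper below is an illustrative sketch, not part of this repository; it is shown here on dummy tensors shaped like RoBERTa-large output so it runs without downloading the model. In practice you would pass `output.last_hidden_state` and `encoded_input['attention_mask']`.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padded positions."""
    mask = attention_mask.unsqueeze(-1).float()     # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)  # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)        # avoid division by zero
    return summed / counts

# Dummy tensors shaped like RoBERTa-large output: batch=2, seq_len=5, hidden=1024
hidden = torch.randn(2, 5, 1024)
mask = torch.tensor([[1, 1, 1, 0, 0],   # first sequence has 2 padding tokens
                     [1, 1, 1, 1, 1]])
embeddings = mean_pool(hidden, mask)
print(embeddings.shape)  # torch.Size([2, 1024])
```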