Note: This model may not perfectly replicate the numbers reported in the paper (\cite{chopra-ghosh-2021-term}), as unlike the original it was trained with smaller batch sizes and for fewer epochs.
The source code for training will soon be made available at https://github.com/sohomghosh/FinSim_Financial_Hypernym_detection
```bibtex
@inproceedings{chopra-ghosh-2021-term,
    title = "Term Expansion and {F}in{BERT} fine-tuning for Hypernym and Synonym Ranking of Financial Terms",
    author = "Chopra, Ankush and
      Ghosh, Sohom",
    booktitle = "Proceedings of the Third Workshop on Financial Technology and Natural Language Processing (FinNLP@IJCAI 2021)",
    month = aug,
    year = "2021",
    address = "Online",
    url = "https://aclanthology.org/2021.finnlp-1.8",
    pages = "46--51",
}
```
Use the following code to import this model in Transformers:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sohomghosh/LIPI_FinSim3_Hypernym")
model = AutoModel.from_pretrained("sohomghosh/LIPI_FinSim3_Hypernym")
```
Or load it with SentenceTransformers:

```python
from sentence_transformers import SentenceTransformer

model_finlipi = SentenceTransformer('sohomghosh/LIPI_FinSim3_Hypernym')
```