---
library_name: transformers
tags:
  - tapas
  - table
  - question
license: mit
language:
  - en
base_model:
  - google/tapas-base-finetuned-wtq
pipeline_tag: table-question-answering
---

This is an experimental model, fine-tuned from google/tapas-base-finetuned-wtq on balance sheets collected from financial services. The fine-tuning process was designed to adapt TAPAS to the large numeric values and complex financial data structures commonly found in balance sheets.

## How to Get Started with the Model

Use the code below to get started with the model.

```python
# !pip install sugardata

from transformers import TapasTokenizer, TapasForQuestionAnswering
from sugardata.utility.tapas import generate_financial_balance_sheet, get_real_tapas_answer

# generate a financial balance sheet and ask a question
table = generate_financial_balance_sheet()
question = "What was the reported value of Total Debt in 2021?"

# load the fine-tuned model and tokenizer
model_name = "yeniguno/tapas-base-wtq-balance-sheet-tuned"
model = TapasForQuestionAnswering.from_pretrained(model_name)
tokenizer = TapasTokenizer.from_pretrained(model_name)

# tokenize the table together with the question
inputs = tokenizer(table=table, queries=[question], padding="max_length", return_tensors="pt")

# decode the predicted answer
answer = get_real_tapas_answer(table, model, tokenizer, inputs)
print(answer)  # 8873000.0
```
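If you would rather not install sugardata, the generic table-question-answering pipeline from transformers should also work with this checkpoint. The mini balance sheet and figures below are purely illustrative; note that TAPAS expects every table cell as a string.

```python
import pandas as pd
from transformers import pipeline

qa = pipeline(
    "table-question-answering",
    model="yeniguno/tapas-base-wtq-balance-sheet-tuned",
)

# an illustrative mini balance sheet; all cells must be strings
table = pd.DataFrame(
    {
        "Item": ["Total Debt", "Cash and Equivalents", "Total Equity"],
        "2020": ["7500000", "1200000", "4300000"],
        "2021": ["8873000", "1500000", "4900000"],
    }
)

result = qa(table=table, query="What was the reported value of Total Debt in 2021?")
print(result)  # dict with 'answer', 'coordinates', 'cells', 'aggregator'
```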

## Training Details

| Epoch | Train Loss | Val Loss |
|-------|------------|----------|
| 1/5   | 0.1514     | 0.0107   |
| 2/5   | 0.0135     | 0.0098   |
| 3/5   | 0.0116     | 0.0081   |
| 4/5   | 0.0081     | 0.0071   |
| 5/5   | 0.0049     | 0.0043   |
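For reference, a single supervised step with TapasForQuestionAnswering can be sketched as below. This is a minimal illustration under assumptions, not the actual training script: the table, question, answer coordinates, float answer, and optimizer settings are hypothetical stand-ins for examples produced by the balance-sheet generator.

```python
import torch
import pandas as pd
from transformers import TapasTokenizer, TapasForQuestionAnswering

base_model = "google/tapas-base-finetuned-wtq"
tokenizer = TapasTokenizer.from_pretrained(base_model)
model = TapasForQuestionAnswering.from_pretrained(base_model)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# hypothetical training example; the real data came from generated balance sheets
table = pd.DataFrame(
    {
        "Item": ["Total Debt", "Cash and Equivalents"],
        "2021": ["8873000", "1500000"],
    }
)
queries = ["What was the reported value of Total Debt in 2021?"]
answer_coordinates = [[(0, 1)]]   # (row, column) of the answer cell
answer_text = [["8873000"]]
float_answer = torch.tensor([8873000.0])

# the tokenizer turns the weak supervision into labels and numeric-value tensors
encoding = tokenizer(
    table=table,
    queries=queries,
    answer_coordinates=answer_coordinates,
    answer_text=answer_text,
    truncation=True,
    padding="max_length",
    return_tensors="pt",
)

outputs = model(
    input_ids=encoding["input_ids"],
    attention_mask=encoding["attention_mask"],
    token_type_ids=encoding["token_type_ids"],
    labels=encoding["labels"],
    numeric_values=encoding["numeric_values"],
    numeric_values_scale=encoding["numeric_values_scale"],
    float_answer=float_answer,
)

# one optimization step on the combined cell-selection / aggregation loss
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```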