PATTARA TIPAKSORN committed
Commit 25bc5ad
1 Parent(s): 0a24280

Update README.md

Files changed (1): README.md (+4 −1)
README.md CHANGED
@@ -61,11 +61,14 @@ with torch.no_grad():
 )
 print(response[0])
 ```
+## Evaluation Performance
+Additional details are required.
+
 ## Limitations and Future Work
 At present, our model remains in the experimental research phase and is not yet fully suitable for practical applications as an assistant. Future work will focus on upgrading the language model to a newer version ([OpenThaiLLM-DoodNiLT-V1.0.0-Beta-7B](https://huggingface.co/nectec/OpenThaiLLM-DoodNiLT-V1.0.0-Beta-7B)), and curating more refined and robust datasets to improve performance. Additionally, we aim to address and prioritize the safety and reliability of the model's outputs.
 
 ## Citation
-More information needed
+Additional details are required.
 
 ## Acknowledgements
 We are grateful to ThaiSC, also known as NSTDA Supercomputer Centre, for providing the LANTA that was utilised for model training and finetuning. Additionally, we would like to express our gratitude to the SALMONN team for making their code publicly available, and to Typhoon Audio at SCB 10X for making available the huggingface project, source code, and technical paper, which served as a valuable guide for us. Many other open-source projects have contributed valuable information, code, data, and model weights; we are grateful to them all.