---
license: apache-2.0
tags:
- jamba
datasets:
- openbmb/UltraInteract_pair
base_model: ai21labs/Jamba-v0.1
---
This Jamba model has been pruned down to roughly 1B parameters, then fine-tuned for instruction following on the first 50k examples of the UltraInteract_pair dataset.

Initial tests show the model works, but outputs may be inconsistent. More information and examples will be posted later.
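A minimal sketch of running the model with 🤗 Transformers. The repo id and the instruction template below are placeholders (the exact prompt format used during fine-tuning is not documented here), so adjust both to match this model's actual Hub id and training setup:

```python
# Sketch: loading a pruned Jamba checkpoint for instruction-style generation.
# NOTE: MODEL_ID is a placeholder -- substitute this model's actual Hub id.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-namespace/jamba-1b-ultrainteract"  # placeholder, not the real id


def build_prompt(instruction: str) -> str:
    """Simple instruction template; the template used in training is an assumption."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Explain what a state-space model is."))
```

Jamba support requires a recent version of `transformers`; install `mamba-ssm` and `causal-conv1d` for the optimized Mamba kernels, or pass `use_mamba_kernels=False` to run without them.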
## Training
- 50k examples
- 6 hours on an A100