_______       ___      .___  ___.   ______   .______      .______    __    __  
|       \     /   \     |   \/   |  /  __  \  |   _  \     |   _  \  |  |  |  | 
|  .--.  |   /  ^  \    |  \  /  | |  |  |  | |  |_)  |    |  |_)  | |  |__|  | 
|  |  |  |  /  /_\  \   |  |\/|  | |  |  |  | |      /     |   ___/  |   __   | 
|  '--'  | /  _____  \  |  |  |  | |  `--'  | |  |\  \----.|  |      |  |  |  | 
|_______/ /__/     \__\ |__|  |__|  \______/  | _| `._____|| _|      |__|  |__| 
                                                                               

DA-MIXED-LLAMA3.2

An experimental model built on the LLaMA-3.2 architecture, combining morphological and BPE tokenization strategies. This model investigates the effects of mixed tokenization on Danish language understanding and generation.
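The model card does not publish the actual tokenizer pipeline, but the idea it describes can be sketched: try a morphological split first and fall back to subword pieces when no known affix matches. Everything below is an illustrative assumption, not this model's implementation: the Danish suffix list, the helper names, and the greedy longest-match fallback (a stand-in for real BPE).

```python
# Illustrative sketch only: the suffix inventory and fallback rule are
# assumptions for demonstration, not the tokenizer this model actually uses.

# A few common Danish suffixes (definite articles, plural endings).
DANISH_SUFFIXES = ("erne", "ene", "er", "en", "et", "e")

def morph_split(word):
    """Strip one known Danish suffix if the remaining stem is long enough."""
    for suffix in DANISH_SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return [word[: -len(suffix)], "##" + suffix]
    return None

def bpe_fallback(word, vocab):
    """Greedy longest-match segmentation, standing in for a trained BPE model."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            # Take the longest known piece; single characters always match.
            if piece in vocab or j == i + 1:
                pieces.append(piece if i == 0 else "##" + piece)
                i = j
                break
    return pieces

def mixed_tokenize(text, vocab):
    """Prefer a morphological split; otherwise emit BPE-style pieces."""
    tokens = []
    for word in text.lower().split():
        tokens.extend(morph_split(word) or bpe_fallback(word, vocab))
    return tokens

vocab = {"hus", "bog", "kat"}
print(mixed_tokenize("husene bogen", vocab))  # → ['hus', '##ene', 'bog', '##en']
```

The design question such a mix probes is whether linguistically motivated splits ("hus" + definite plural "ene") give the model better Danish morphology signals than purely frequency-driven BPE merges.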

Model details:
- Format: Safetensors
- Model size: 1.24B params
- Tensor type: FP16
- Downloads last month: 177
