---
license: apache-2.0
tags:
- merge
- mergekit
- vilm/vinallama-7b-chat
---

# vinallama-chat-merge-3

vinallama-chat-merge-3 is a passthrough merge made with [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing). It stacks eight layer slices taken from a single base model:

* [vilm/vinallama-7b-chat](https://huggingface.co/vilm/vinallama-7b-chat)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [0, 16]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [8, 16]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [8, 16]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [16, 24]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [16, 24]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [24, 28]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [24, 28]
  - sources:
      - model: vilm/vinallama-7b-chat
        layer_range: [28, 32]
merge_method: passthrough
dtype: bfloat16
```
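
Because `merge_method: passthrough` only stacks layers and averages nothing, the eight overlapping slices (16 + 8 + 8 + 8 + 8 + 4 + 4 + 4) should give the merged model 60 transformer layers versus the 32 of the base model.

## 💻 Usage

The card does not include a usage snippet, so the following is a minimal inference sketch using 🤗 Transformers. The repository id `your-username/vinallama-chat-merge-3` is a placeholder for wherever the merged weights are actually uploaded, and the snippet assumes the tokenizer copied from vilm/vinallama-7b-chat defines a chat template; if it does not, format the prompt manually in the base model's chat format.

```python
# Minimal inference sketch for the merged model.
# "your-username/vinallama-chat-merge-3" is a placeholder repo id, not a published checkpoint.
import torch
import transformers
from transformers import AutoTokenizer

model_id = "your-username/vinallama-chat-merge-3"  # placeholder: point this at the actual upload

tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [{"role": "user", "content": "Xin chào! Bạn là ai?"}]
# Assumes the tokenizer ships a chat template; otherwise build the prompt string by hand.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(outputs[0]["generated_text"])
```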