---
license: apache-2.0
tags:
- mixtral
- llamafile
- llm
- moe
---


# Mixtral 8X7B Instruct v0.1 - Llamafile 🦙

## Overview
This model card describes the `mixtral-8x7b-instruct-v0.1.Q3_K_M.llamafile`, a single-file executable version of the Mixtral 8X7B Instruct v0.1 model. <br>
It is built upon the original work by TheBloke and Mistral AI, repackaged for ease of use as a standalone application.  <br>
See [TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF](https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF) for the source GGUF files this build is based on.

Like many of you, I am GPU poor. The goal behind this approach is easy access to a good open-source model with limited GPU resources, such as a MacBook Pro M1 with 32 GB of RAM.  <br>
This Q3_K_M quantization is not the full-precision model, but it is the most feasible option given those resource constraints; see [here](https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF#provided-files) for notes on performance.


## Usage
Because the model is packaged as a `llamafile`, it can be executed on any OS with no additional installation required. Read more about llamafile [here](https://github.com/Mozilla-Ocho/llamafile). <br>
To use this model, ensure the file has execution permissions set, then run it:

```bash
chmod +x mixtral-8x7b-instruct-v0.1.Q3_K_M.llamafile
./mixtral-8x7b-instruct-v0.1.Q3_K_M.llamafile
```
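
Since the llamafile embeds llama.cpp, the usual llama.cpp runtime flags can be passed on the command line. As a rough sketch (the right number of layers to offload depends on your hardware and may need tuning), Apple Silicon users can offload layers to the GPU via Metal with `-ngl`:

```bash
# Sketch: offload as many layers as will fit to the GPU (Metal on Apple Silicon).
# A large value like 9999 effectively means "offload everything that fits";
# lower it if you run into memory limits.
./mixtral-8x7b-instruct-v0.1.Q3_K_M.llamafile -ngl 9999
```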

See the [llama.cpp server README](https://github.com/Mozilla-Ocho/llamafile/blob/6423228b5ddd4862a3ab3d275a168692dadf4cdc/llama.cpp/server/README.md) for details on the built-in local API server.
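When the llamafile is running in server mode, it exposes the llama.cpp HTTP API on localhost (port 8080 by default). As a minimal sketch, assuming default settings and the Mixtral instruct prompt format, you can query the `/completion` endpoint with curl:

```bash
# Minimal sketch: query the built-in llama.cpp server (default port 8080 assumed).
# Mixtral Instruct expects prompts wrapped in [INST] ... [/INST].
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "[INST] Explain what a llamafile is in one sentence. [/INST]",
    "n_predict": 128
  }'
```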

## Credits and Acknowledgements
This executable is a derivative of TheBloke's GGUF quantization of Mistral AI's Mixtral model, repackaged for easier deployment. It is licensed under the same terms as TheBloke's model.

## Limitations
As with the original Mixtral model, this executable does not include moderation mechanisms and should be used with consideration for its capabilities and limitations.

## Additional Information
For more detailed instructions and insights, please refer to the original model documentation provided by TheBloke and Mistral AI.