How to run Sentient’s Dobby-Mini locally with Ollama

Community Article · Published January 30, 2025

✌️ What is Dobby-mini?

Sentient just released two demo models for our upcoming Dobby model family:

  • Dobby-Mini-Leashed-Llama-3.1-8B
  • Dobby-Mini-Unhinged-Llama-3.1-8B

These models are fine-tuned from Llama-3.1-8B-Instruct and are pro-personal-freedom, pro-decentralization, and pro-all-things-crypto. Both models have very strong personalities (especially the Unhinged version) and can be used to build your craziest AI ideas.

👉 Pre-Requisites

  1. Download the models from Sentient's Hugging Face page
  2. Follow along with my companion video on YouTube

✍️ Instructions

1. Download the model

The models are available in GGUF and safetensors formats. Safetensors is the model equivalent of a RAW photo: great for editing (fine-tuning) but unnecessary for typical use. GGUF is the model equivalent of a JPEG: compressed (quantized) and better suited to everyday use. We will use the GGUF format.


Once you’ve picked between Leashed and Unhinged, navigate to Files and versions. Several quantization levels are available, measured in bits per weight. A lower bit-width means the model is more compressed, requires fewer resources to run, and has a smaller file size, at some cost in output quality. Even heavily quantized models are several gigabytes, so the download may take a few minutes. We will use the 4-bit quantization because it is the most lightweight. Download it using the download button.
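As a back-of-the-envelope check on download size: a 4-bit quantization of an 8-billion-parameter model stores roughly 4–5 bits per weight once quantization metadata is included (the 4.5 bits/weight figure below is an approximation for illustration, not an exact spec):

```shell
# Rough size estimate: 8B parameters at ~4.5 bits per weight, converted to GB
python3 -c "print(round(8e9 * 4.5 / 8 / 1e9, 1), 'GB')"
```

This is in the same ballpark as the multi-gigabyte file you will see on the model page.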


2. Download Ollama

To run the model locally, we suggest downloading Ollama from the official Ollama website. Once downloaded, open the Ollama application to install its command-line interface (CLI).

3. Create Modelfile

We need to create a Modelfile so that Ollama can run the model. We can do that from the terminal using vim. The following command creates a new, empty Modelfile and opens it in vim. Run it from the directory that contains the model:

vim Modelfile

Now we need to add the FROM keyword followed by the relative path to the GGUF file. If you are using Dobby-Mini Unhinged and you are in the same directory that the model is in, that will look as follows:

FROM ./dobby-8b-unhinged-q4_k_m.gguf

To save your changes and exit vim, first press the Esc key, then type :x and press Enter.
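If you would rather not use vim, the same one-line Modelfile can be written directly from the shell. This assumes the Unhinged Q4_K_M file name shown above; adjust the path to match the file you actually downloaded:

```shell
# Write a minimal Modelfile pointing Ollama at the local GGUF weights
printf 'FROM ./dobby-8b-unhinged-q4_k_m.gguf\n' > Modelfile

# Confirm the contents
cat Modelfile
```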

4. Run the model

Create and name the model using the Modelfile:

ollama create dobby-unhinged -f Modelfile

Run the model:

ollama run dobby-unhinged  

Now you are ready to query away!
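Besides the interactive prompt, Ollama also serves a local REST API (on port 11434 by default) that you can query from scripts. A minimal sketch with curl; the prompt text is just a placeholder:

```shell
# Build a request body for Ollama's /api/generate endpoint.
# "stream": false asks for one complete JSON response
# instead of a stream of partial tokens.
cat > payload.json <<'EOF'
{"model": "dobby-unhinged", "prompt": "What do you think of decentralization?", "stream": false}
EOF

# Send it to the local Ollama server
# (prints a notice instead of failing if the server is not running).
curl -s http://localhost:11434/api/generate -d @payload.json || echo "Ollama server not reachable"
```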

🧠 About Sentient

Sentient is a research organization focused on making AI open, transparent, and aligned with the communities it serves. Our goal of Loyal AI represents this vision—AI that is community-built, community-owned, community-aligned, and community-controlled.

We want to ensure that when AGI is created, it is Loyal—not to corporations, but to humanity.

Follow us for the latest research updates and releases!
