
Monstral 123B

A Mistral-Large merge

This model is a slerp merge of Behemoth and Magnum V4. The intention was to moisten up Behemoth a bit and give it some of that Claude flavor, but without being nearly as thirsty as Magnum. I feel it succeeds in both areas.

Mergefuel:

  • TheDrummer/Behemoth-123B-v1
  • anthracite-org/magnum-v4-123b

See recipe.txt for full details.
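For anyone unfamiliar with the term, "slerp" is spherical linear interpolation: each tensor is blended along the arc between the two parent tensors rather than along a straight line, which tends to preserve weight magnitudes better than plain averaging. Below is a rough per-tensor sketch of the operation for illustration only; the actual merge settings are in recipe.txt, and the interpolation factor `t` here is a placeholder.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (e.g. Behemoth), t=1 returns v1 (e.g. Magnum);
    values in between blend along the arc between the two directions.
    """
    a, b = v0.ravel(), v1.ravel()
    # Angle between the two tensors, treated as flat vectors.
    cos_omega = np.dot(a, b) / max(np.linalg.norm(a) * np.linalg.norm(b), eps)
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.sin(omega) < eps:
        # Nearly colinear: fall back to ordinary linear interpolation.
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1
```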

This model is uncensored and perfectly capable of generating objectionable material. It is far less likely to return NSFW content for SFW prompts than Magnum V4, but you should still exercise caution. As with any LLM, no factual claims made by the model should be taken at face value. You know that boilerplate safety disclaimer that most professional models have? Assume this has it too. This model is for entertainment purposes only.

GGUFs: https://huggingface.co/MarsupialAI/Monstral-123B_iMat_GGUF

EXL2: https://huggingface.co/MarsupialAI/Monstral-123B_4.0bpw_EXL2
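For local use, one option is llama-cpp-python with one of the GGUF quants linked above. A minimal sketch, assuming you've already downloaded a quant file; the filename below is a placeholder, so substitute whichever quant level fits your hardware.

```python
from llama_cpp import Llama

# Placeholder filename: replace with the actual quant downloaded from the GGUF repo.
llm = Llama(
    model_path="Monstral-123B.Q4_K_M.gguf",
    n_ctx=8192,        # context window; adjust to fit your hardware
    n_gpu_layers=-1,   # offload all layers to GPU if they fit, otherwise set a smaller number
)

prompt = "[INST] Write a short scene set in a haunted lighthouse. [/INST]"
out = llm(prompt, max_tokens=512, temperature=0.8)
print(out["choices"][0]["text"])
```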

Prompt Format

Mistral or Metharme
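If you're building prompts by hand rather than relying on a frontend preset, the two formats look roughly like this. This is a sketch of the commonly used Mistral instruct and Metharme templates, not a template shipped with this model, so verify against your frontend's presets.

```python
def mistral_prompt(system: str, user: str) -> str:
    # Mistral-style instruct formatting; many backends prepend the <s> BOS token
    # automatically, so check whether yours does before adding it yourself.
    return f"[INST] {system}\n\n{user} [/INST]"

def metharme_prompt(system: str, user: str) -> str:
    # Metharme/Pygmalion-style role tags; the model continues after <|model|>.
    return f"<|system|>{system}<|user|>{user}<|model|>"
```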
