---
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
- natong19/Mistral-Nemo-Instruct-2407-abliterated
library_name: transformers
tags:
- mergekit
- peft
---
# Untitled LoRA Model (1)
This is a LoRA extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).
## LoRA Details
This LoRA adapter was extracted from [natong19/Mistral-Nemo-Instruct-2407-abliterated](https://huggingface.co/natong19/Mistral-Nemo-Instruct-2407-abliterated) and uses [mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) as a base.
### Parameters
The following command was used to extract this LoRA adapter:
```sh
mergekit-extract-lora natong19/Mistral-Nemo-Instruct-2407-abliterated mistralai/Mistral-Nemo-Instruct-2407 OUTPUT_PATH --rank=64
```
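Since this card tags `peft`, the extracted adapter can be applied on top of the base model at load time. The sketch below is a minimal, untested example of that pattern; the adapter repo id is hypothetical (substitute this repository's actual path), and the loading is wrapped in a function so nothing is downloaded on import.

```python
def load_merged_model():
    """Load the base model and apply the extracted LoRA adapter via peft.

    Calling this downloads the full base-model weights, so it is only
    defined here, not invoked.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "mistralai/Mistral-Nemo-Instruct-2407"
    adapter_id = "OUTPUT_PATH"  # hypothetical: path or repo id of this adapter

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
    model = PeftModel.from_pretrained(base, adapter_id)
    return tokenizer, model
```

`PeftModel.from_pretrained` keeps the adapter weights separate; call `model.merge_and_unload()` afterwards if a single merged checkpoint is needed for inference.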