---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
  - llama-3
  - 70b
  - smaug
  - lumimaid
  - tess
  - arimas
  - breadcrumbs
---

This is really a follow-up to, and improvement of, my original Lumi-Tess model.

I have noticed that the quantized versions have a very poor context window, so I made a newer version with Gradient instead of Giraffe; the context window will hopefully hold up much better at lower quant sizes: https://huggingface.co/ryzen88/Llama-3-70b-Arimas-story-RP-V1.5

## Model

A large-context (128K), uncensored Llama 3 Instruct model focused on story & RP. I found the Smaug version of Llama very impressive, except for a couple of quirks and the default context window. This merge uses the Giraffe instruct model for the long context window and is basically a Smaug - Lumi-Tess merger. I am planning to do the same with a Gradient model and compare it to this Giraffe version. Breadcrumbs_ties really is awesome.

This is a merge of pre-trained language models created using mergekit. A big thanks to the creators of the models used in this merge.

## Merge Details

### Merge Method

This model was merged using the breadcrumbs_ties merge method with Z:\Llama-3-Giraffe-70B-Instruct as the base. Thanks to Giraffe, this model has an effective context length of approximately 128K.

### Models Merged

The following models were included in the merge:

  • \Smaug-Llama-3-70B-Instruct
  • \Llama-3-Lumimaid-70B-v0.1-alt
  • \Tess-2.0-Llama-3-70B-v0.2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: \Llama-3-Giraffe-70B-Instruct
    parameters:
      weight: 0.25
      density: 0.90
      gamma: 0.01
  - model: \Smaug-Llama-3-70B-Instruct
    parameters:
      weight: 0.30
      density: 0.90
      gamma: 0.01
  - model: \Tess-2.0-Llama-3-70B-v0.2
    parameters:
      weight: 0.15
      density: 0.90
      gamma: 0.01
  - model: \Llama-3-Lumimaid-70B-v0.1-alt
    parameters:
      weight: 0.30
      density: 0.90
      gamma: 0.01
merge_method: breadcrumbs_ties
base_model: \Llama-3-Giraffe-70B-Instruct
dtype: bfloat16
```
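As a side note, the four per-model weights in the configuration above sum to 1.0 (0.25 + 0.30 + 0.15 + 0.30). A minimal sanity-check sketch; the dictionary below simply mirrors the weights from the YAML and is illustrative, not part of the merge pipeline:

```python
# Mirror of the per-model weights from the YAML configuration above.
weights = {
    "Llama-3-Giraffe-70B-Instruct": 0.25,
    "Smaug-Llama-3-70B-Instruct": 0.30,
    "Tess-2.0-Llama-3-70B-v0.2": 0.15,
    "Llama-3-Lumimaid-70B-v0.1-alt": 0.30,
}

# The weights should sum to 1.0 so the merged parameters stay on scale.
total = round(sum(weights.values()), 10)
print(total)  # 1.0
```

The configuration itself can be run with mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./merged`), assuming mergekit is installed.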