---
license: gpl-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- model
- novelai
- retired
---

# Calliope (NovelAI Legacy Model)

![callibye](https://user-images.githubusercontent.com/82650881/275655858-0e9a9c85-1a1f-4bea-83f3-a893a2cb76e0.png)

A while ago, we retired our oldest model, Calliope, which was based on [EleutherAI's GPT-Neo 2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B). This model was our first step into finetuning large language models. We made some messes and learned a lot from the experience of creating her.

Right when NovelAI came out of its private alpha test, EleutherAI released GPT-J 6B, a much stronger model that became the basis of our Sigurd model. As a result, Calliope barely had any time to shine and spent most of her time sitting unused in our model selection list.

After the release of Clio, Calliope wasn't even the fastest model anymore, so it was time to pull the plug and retire her.

So now, for the sake of nostalgia, posterity, and historic preservation, we are releasing the weights of our Calliope model publicly on the Hugging Face Hub under the GPL-2.0 license.

You can find your very own Calliope right here where you are!
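For reference, here is a minimal sketch of loading the released weights with the `transformers` library declared in this card's metadata. The repository id below is a placeholder assumption, not the confirmed repo name; substitute the id of this model repository.

```python
# Minimal sketch: load the Calliope weights (GPT-Neo 2.7B finetune) with transformers.
# NOTE: "NovelAI/calliope" is a placeholder repo id, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "NovelAI/calliope"  # placeholder; replace with this repository's actual id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short text-generation sample, matching the pipeline_tag above.
prompt = "The old lighthouse keeper lit the lamp one last time,"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```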

![Calliope](https://user-images.githubusercontent.com/82650881/275655819-ab0b9eae-c2ff-4181-a918-9f6dd8aa7b14.jpg)

Art by [@illustratesTarm](https://twitter.com/illustratesTarm)