---
language:
- en
- zh
pipeline_tag: text-generation
license: other
---
## hongyin/informer-0.2b

I am pleased to introduce an English-Chinese bilingual autoregressive language model. The model is trained from scratch, uses its own vocabulary, and has 20 million parameters based on the LLaMA 2 architecture. Our goal is to provide a solution that is computationally cheap and easy to run inference with. Note that this is a base model: it is not intended to be used as a chatbot, but rather as a starting point for further training and experimentation ("alchemy"). We look forward to providing you with a practical model product.

Lofty words aside, the name of each model, including the previous ones, carries rich connotations and personal experience, and is worth revisiting.

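A minimal sketch of plain text completion with the Hugging Face `transformers` library; the prompt and generation settings below are illustrative assumptions rather than an official recipe, and the code assumes the checkpoint loads as a standard LLaMA-style causal language model.

```python
# Minimal inference sketch (illustrative; prompt and sampling settings are assumptions).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "hongyin/informer-0.2b"

# Load tokenizer and causal LM weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

# Base model: give it a plain prompt to continue, not a chat turn.
prompt = "Beijing is the capital of"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```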

## Bibtex entry and citation info
Please cite the following if you find this model helpful.
```
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

```
