---
license: apache-2.0
---

by David, Fernando and Eric

Sponsored by: [VAGO Solutions](https://vago-solutions.de) and [HyperSpace.Ai](https://hyperspace.computer/)

Join our Discord! https://discord.gg/cognitivecomputations
A function calling version of [cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser)
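
Below is a minimal inference sketch, not an official usage guide: it assumes this repository's model id (replace it with the actual repo id), the ChatML chat template inherited from the dolphin-2.6 base model, and an illustrative system prompt; the `get_weather` function schema is purely hypothetical.

```python
# Minimal inference sketch. The model id and the function schema in the system
# prompt are assumptions; adapt both to your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/fc-dolphin-2.6-mistral-7b-dpo-laser"  # assumed; use this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {
        "role": "system",
        "content": (
            "You are a helpful assistant with access to the function "
            "get_weather(city: str). When appropriate, respond with a JSON function call."
        ),
    },
    {"role": "user", "content": "What's the weather in Berlin?"},
]

# Relies on the chat template shipped with the tokenizer (ChatML for dolphin-2.6).
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```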
It follows the laserRMT implementation at https://github.com/cognitivecomputations/laserRMT and a novel training technique: we partially freeze the model according to a laser-like analysis (official paper coming soon), which effectively prevents the significant problem of the model forgetting previously acquired knowledge. This is particularly crucial when teaching the model a specific skill, such as function calling.
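
To illustrate the idea (this is a sketch, not the actual training script), partial freezing can look like the snippet below; it assumes the set of trainable layer indices has already been chosen by a laserRMT-style signal-to-noise analysis, and the indices shown are placeholders.

```python
# Partial-freezing sketch. The layer indices are placeholders standing in for
# the result of a laserRMT-style analysis; the selection step is not shown.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser"
)

trainable_layers = {28, 29, 30, 31}  # hypothetical layers flagged as safe to train

for name, param in model.named_parameters():
    # Train only the selected decoder layers; freeze everything else so the
    # fine-tune is less likely to overwrite previously acquired knowledge.
    param.requires_grad = any(f"model.layers.{i}." in name for i in trainable_layers)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```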
We intend this to be the first in a family of experiments being carried out at Cognitive Computations.