---
license: apache-2.0
---
![](https://huggingface.co/cognitivecomputations/fc-dolphin-2.6-mistral-7b-dpo-laser/resolve/main/fc-dolphin.jpg)
Sponsored by: VAGO Solutions and HyperSpace.Ai
Join our Discord! https://discord.gg/cognitivecomputations
A function-calling version of cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser.
It follows the laserRMT implementation at https://github.com/cognitivecomputations/laserRMT and a novel training technique: we partially freeze the model according to a laser-like analysis (official paper forthcoming). This effectively mitigates the significant problem of language models forgetting previously acquired knowledge, which is particularly crucial when teaching the model a specific skill such as function calling.
We intend this to be the first in a family of experiments being carried out at Cognitive Computations.
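As a rough sketch of how such a model might be prompted for function calling (the exact tool schema this checkpoint was trained on is not specified here; the ChatML template used by Dolphin models and the `get_weather` tool definition below are assumptions for illustration):

```python
import json

# Hypothetical tool definition (assumption: the model sees a JSON
# description of available functions in the system prompt).
tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}]

def build_prompt(user_message: str) -> str:
    # Dolphin models use the ChatML format; embedding the tool list in
    # the system message is an assumed convention for this sketch.
    system = (
        "You are Dolphin, a helpful assistant with access to these functions:\n"
        + json.dumps(tools, indent=2)
    )
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt("What is the weather in Paris?")
print(prompt)
```

The resulting string would then be passed to the model, e.g. via `transformers`' `AutoModelForCausalLM.generate`; consult the repository for the canonical function-calling format.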