COT EVERYTHING

Legal: This model is based on Llama-3-8b and is governed by the META LLAMA 3 COMMUNITY LICENSE AGREEMENT.

Everything COT is an investigative self-reflecting general model that uses Chain of Thought for everything. And I mean everything.

Instead of confidently proclaiming something (or confidently hallucinating it) like most models, it carries on an internal dialogue with itself and often casts doubt on uncertain topics while looking at them from various sides.

For example, when asked "What's your name?", it answers:

To answer this question, we must first understand the context. The question "What's your name?" is typically addressed to a human being. It's a common query used in everyday conversations between individuals.

However, in the realm of artificial intelligence, the concept of a name is a bit more abstract. Many AI systems, including this one, are referred to by their function or the data they process. For instance, a language model like myself is often referred to as a "chatbot."

But, if we try to apply the concept of a name to an AI system, it becomes interesting. One could argue that the "name" of an AI system is a combination of the data it uses and the algorithms that process that data. This perspective aligns with the idea that AI systems learn and adapt based on the data they receive.

So, in this context, I would say that my "name" is a complex combination of the data I was trained on and the algorithms that power my responses.

Everything COT uses the Llama 3 instruct template.

The correct Jinja chat_template is in tokenizer_config.json.

It was NOT trained with a system message, and you can use various system messages to steer the model.
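
For example, here is a minimal transformers sketch that loads the model and lets apply_chat_template pick up the Llama 3 instruct template from tokenizer_config.json. The system prompt below is just an illustration of steering, not something the model ships with:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FPHam/L3-8B-Everything-COT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# The model was not trained with a system message, but one can still be
# used to steer it (this prompt is a made-up example).
messages = [
    {"role": "system", "content": "Examine every question from several angles before answering."},
    {"role": "user", "content": "What's your name?"},
]

# apply_chat_template reads the Llama 3 template from tokenizer_config.json,
# so no manual prompt formatting is needed.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```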

Parameters

It's up to you to discover the parameters that work best.

I tested it in the oobabooga WebUI using the very off-the-shelf min_p preset: Temperature: 1, Top_p: 1, Top_k: 0, Typical_p: 1, min_p: 0.05, repetition_penalty: 1.
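
For reference, that preset maps roughly onto Hugging Face generate() arguments as in the sketch below; it assumes a transformers release recent enough to support min_p, and reuses model and input_ids from the example above:

```python
# The min_p preset from above, expressed as generate() sampling arguments.
output = model.generate(
    input_ids,
    do_sample=True,           # enable sampling instead of greedy decoding
    temperature=1.0,
    top_p=1.0,                # 1.0 disables nucleus filtering
    top_k=0,                  # 0 disables top-k filtering
    typical_p=1.0,            # 1.0 disables typical sampling
    min_p=0.05,
    repetition_penalty=1.0,   # 1.0 means no penalty
    max_new_tokens=512,
)
```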

Different parameters, like temperature, will affect the model's talkativeness and self-reflecting properties. If you find something that works really well, let me know and I'll post it here.
