
Adonalsium-Mistral-Adapters

Basic Information

Overview

This model was fine-tuned to generate text that mirrors the complex narrative style and character interactions of Brandon Sanderson's Cosmere series, with the goal of supporting creative storytelling and academic research in narrative analysis.

Data Source

The model is trained on a simple dataset derived from the entire Cosmere series, accompanied by dynamic visualizations that highlight the complex interplay of relationships and interactions within the Cosmere universe.

Technical Details

  • Environment: Google Colab, PEFT 0.8.2, Tesla T4 GPU
  • Architecture: Mistral instruct, tailored for Cosmere narratives
  • Adapters: PEFT adapter weights applied to a Mistral base model (see the sketches below)
  • Model size: 7.24B parameters, FP16
  • Training data: the Cosmere series

  Step    Training Loss
  500     0.018500
  1000    0.000000

Further technical details, including architecture choices, hyperparameters, and training methodology, are documented in the accompanying training and generation notebook.
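
The card identifies the released weights only as PEFT adapters and does not name the adapter type. As a rough illustration, the sketch below shows how LoRA-style adapters (an assumption) are commonly attached to a Mistral instruct base model with PEFT; the base model id and every hyperparameter shown are placeholders rather than the values used for this model, which are in the notebook.

```python
# Illustrative only: attaching LoRA adapters to a Mistral base with PEFT.
# The adapter type, base model id, and all hyperparameters below are assumptions;
# the actual configuration is documented in the training and generation notebook.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_id = "mistralai/Mistral-7B-Instruct-v0.1"  # assumed base model

base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.float16,   # half precision, as reported on the card
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,                                 # placeholder rank
    lora_alpha=32,                        # placeholder scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # a common choice for Mistral attention
    task_type="CAUSAL_LM",
)

# Wrap the frozen base model; only the small adapter matrices receive gradients.
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```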

Access and Usage

You can test and experiment with this model in the text-generation-webui Colab. Usage details are provided in the training and generation notebook.
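
Beyond the web UI, the adapters can also be loaded programmatically. The following is a minimal sketch, assuming the base model is mistralai/Mistral-7B-Instruct-v0.1 and that this repository (JakeTurner616/Adonalsium-Mistral-7b-v0.1) contains standard PEFT adapter files; the exact base model, prompt format, and generation settings used for this model are documented in the training and generation notebook, not here.

```python
# A minimal load-and-generate sketch. Base model id and generation settings
# are assumptions; the authoritative usage code is in the notebook.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.1"        # assumed base model
adapter_id = "JakeTurner616/Adonalsium-Mistral-7b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.float16,   # FP16, matching the tensor type reported on the card
    device_map="auto",
)

# Attach the Cosmere adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Illustrative prompt using the Mistral instruct [INST] format; the sampling
# settings are placeholders, not values taken from the notebook.
prompt = "[INST] Write a short scene in which Kaladin and Syl argue about an oath. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )

# Decode only the newly generated continuation.
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```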
