sleepdeprived3 committed
Commit 0517358 · verified · Parent: 49ae439

Update README.md

Files changed (1)
  1. README.md +42 -35
README.md CHANGED
@@ -7,52 +7,59 @@ tags:
  - Theology
  - Jesus
  - Seminary
- - llama-cpp
- - gguf-my-repo
  pipeline_tag: text-generation
- base_model: sleepdeprived3/Reformed-Christian-Bible-Expert-12B
  ---

- # sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF
- This model was converted to GGUF format from [`sleepdeprived3/Reformed-Christian-Bible-Expert-12B`](https://huggingface.co/sleepdeprived3/Reformed-Christian-Bible-Expert-12B) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
- Refer to the [original model card](https://huggingface.co/sleepdeprived3/Reformed-Christian-Bible-Expert-12B) for more details on the model.

- ## Use with llama.cpp
- Install llama.cpp through brew (works on Mac and Linux)

- ```bash
- brew install llama.cpp
- ```
- Invoke the llama.cpp server or the CLI.

- ### CLI:
- ```bash
- llama-cli --hf-repo sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF --hf-file reformed-christian-bible-expert-12b-q5_k_s.gguf -p "The meaning to life and the universe is"
- ```

- ### Server:
- ```bash
- llama-server --hf-repo sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF --hf-file reformed-christian-bible-expert-12b-q5_k_s.gguf -c 2048
  ```

- Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
-
- Step 1: Clone llama.cpp from GitHub.
  ```
- git clone https://github.com/ggerganov/llama.cpp
  ```

- Step 2: Move into the llama.cpp folder and build it with `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux).
- ```
- cd llama.cpp && LLAMA_CURL=1 make
- ```

- Step 3: Run inference through the main binary.
- ```
- ./llama-cli --hf-repo sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF --hf-file reformed-christian-bible-expert-12b-q5_k_s.gguf -p "The meaning to life and the universe is"
- ```
- or
- ```
- ./llama-server --hf-repo sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF --hf-file reformed-christian-bible-expert-12b-q5_k_s.gguf -c 2048
- ```
+ # Reformed Christian Bible Expert

+ A specialized language model fine-tuned for Reformed theology and biblical studies. Based on `mistralai/Mistral-Nemo-Instruct-2407`, it offers strong theological reasoning and a **128k token context window**.

+ ## Features

+ - 🕊️ Answers theological questions from a Reformed/Calvinist perspective
+ - ✝️ Explains biblical passages with historical-grammatical hermeneutics
+ - 🎓 Assists with seminary studies and sermon preparation
+ - 💬 Can roleplay as a pastor for counseling scenarios
+ - 📜 Inherits the 128k context window from the base model

+ ## Usage

+ **Chat Template:** Mistral V3 Tekken

+ **Recommended Settings:**
+ ```python
+ {
+     "temperature": 0,
+     "top_k": 1,
+     "top_p": 0,
+     "min_p": 0,
+     "repetition_penalty": 1.18
+ }
  ```
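+ Since temperature 0 with top_k 1 and top_p 0 amounts to greedy decoding, the closest 🤗 `transformers` equivalent is simply to disable sampling and keep the repetition penalty. The sketch below is illustrative and not part of the original card; `max_new_tokens` is an assumed value.
+ ```python
+ # Hypothetical mapping of the recommended sampler settings onto transformers.
+ from transformers import GenerationConfig
+
+ generation_config = GenerationConfig(
+     do_sample=False,          # greedy decoding (temperature 0, top_k 1, top_p 0)
+     repetition_penalty=1.18,  # as recommended above
+     max_new_tokens=512,       # illustrative value, not specified by the card
+ )
+ ```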

+ **Example Prompt:**
  ```
+ [INST] Explain the doctrine of justification by faith alone from Romans 3:28 [/INST]
  ```
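
+ A minimal end-to-end sketch with 🤗 `transformers` follows. The repository id comes from this card; the dtype, device placement, and `max_new_tokens` are illustrative assumptions, and `apply_chat_template` is expected to pick up the Mistral V3 Tekken template shipped with the tokenizer.
+ ```python
+ # Illustrative sketch: load the full-precision 12B model and run the example prompt.
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ repo = "sleepdeprived3/Reformed-Christian-Bible-Expert-12B"
+ tokenizer = AutoTokenizer.from_pretrained(repo)
+ model = AutoModelForCausalLM.from_pretrained(
+     repo, torch_dtype=torch.bfloat16, device_map="auto"
+ )
+
+ messages = [{"role": "user", "content": "Explain the doctrine of justification by faith alone from Romans 3:28"}]
+ input_ids = tokenizer.apply_chat_template(
+     messages, add_generation_prompt=True, return_tensors="pt"
+ ).to(model.device)
+
+ output = model.generate(
+     input_ids,
+     do_sample=False,           # greedy decoding per the recommended settings
+     repetition_penalty=1.18,
+     max_new_tokens=512,        # illustrative
+ )
+ print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
+ ```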

+ ## Quantized Formats

+ - **EXL2 Collection**: [Reformed-Christian-Bible-Expert EXL2 Models](https://huggingface.co/collections/sleepdeprived3/reformed-christian-bible-expert-exl2-67ace8acd900c8cadd4c2a4e)
+ - **GGUF Collection**: [Reformed-Christian-Bible-Expert GGUF Models](https://huggingface.co/collections/sleepdeprived3/reformed-christian-bible-expert-gguf-67ace8b70d16eec807037c6e)
+
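+ For local inference on the GGUF quants, one option is the `llama-cpp-python` bindings. The sketch below is illustrative: the repo id and filename are the Q5_K_S build that the previous revision of this card linked, and `n_ctx` is an assumed value; substitute whichever quant from the collection fits your hardware.
+ ```python
+ # Illustrative: download a GGUF quant from the Hub with llama-cpp-python.
+ from llama_cpp import Llama
+
+ llm = Llama.from_pretrained(
+     repo_id="sleepdeprived3/Reformed-Christian-Bible-Expert-12B-Q5_K_S-GGUF",
+     filename="reformed-christian-bible-expert-12b-q5_k_s.gguf",
+     n_ctx=8192,  # assumed context allocation; the model itself supports far more
+ )
+
+ out = llm.create_chat_completion(
+     messages=[{"role": "user", "content": "Explain the doctrine of justification by faith alone from Romans 3:28"}],
+     temperature=0,
+     repeat_penalty=1.18,
+ )
+ print(out["choices"][0]["message"]["content"])
+ ```
+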
+ ## Training Details
+
+ - **Base Model**: `mistralai/Mistral-Nemo-Instruct-2407` (128k context)
+ - **Fine-Tuning**: QLoRA on curated Reformed theological texts (see the sketch below)
+ - **License**: Apache 2.0
+
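+ The training configuration is not published, so the snippet below is only a generic illustration of what a QLoRA setup looks like (a 4-bit-quantized frozen base plus LoRA adapters via `peft`); every hyperparameter is a placeholder, not the recipe actually used for this model.
+ ```python
+ # Generic QLoRA skeleton; all values are placeholders, not this model's recipe.
+ import torch
+ from transformers import AutoModelForCausalLM, BitsAndBytesConfig
+ from peft import LoraConfig, get_peft_model
+
+ bnb = BitsAndBytesConfig(
+     load_in_4bit=True,                      # frozen base weights in 4-bit NF4
+     bnb_4bit_quant_type="nf4",
+     bnb_4bit_compute_dtype=torch.bfloat16,
+ )
+ base = AutoModelForCausalLM.from_pretrained(
+     "mistralai/Mistral-Nemo-Instruct-2407",
+     quantization_config=bnb,
+     device_map="auto",
+ )
+
+ lora = LoraConfig(                          # small trainable adapters
+     r=16, lora_alpha=32, lora_dropout=0.05,
+     target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
+     task_type="CAUSAL_LM",
+ )
+ model = get_peft_model(base, lora)
+ model.print_trainable_parameters()
+ ```
+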
+ ## Ethical Considerations
+
+ This model is designed to:
+ - Affirm the authority of Scripture (2 Tim 3:16)
+ - Uphold the Westminster Standards
+ - Avoid speculative theology
+
+ *Soli Deo Gloria*