limcheekin committed
Commit 5b3f8fc · 1 Parent(s): 12be50e

docs: updated readme

Files changed (1): README.md (+18 -4)
README.md CHANGED
@@ -1,12 +1,26 @@
 ---
-title: Test
-emoji: 👀
+title: orca_mini_v3_13B-GGML (q5_K_S)
 colorFrom: purple
 colorTo: blue
 sdk: docker
+models: TheBloke/orca_mini_v3_13B-GGML
+tags:
+- inference api
+- openai-api compatible
+- llama-cpp-python
+- orca_mini_v3_13B
+- ggml
 pinned: false
 ---
 
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+# orca_mini_v3_13B-GGML (q5_K_S)
 
-https://limcheekin-test.hf.space/v1/
+Powered by the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) package, this Space hosts the GGML model in Hugging Face Docker Spaces and exposes it through an OpenAI-compatible API. The Space includes comprehensive API documentation for seamless integration.
+
+The API endpoint:
+https://limcheekin-orca_mini_v3_13B-GGML.hf.space/v1
+
+Explore the API through the documentation at:
+https://limcheekin-orca_mini_v3_13B-GGML.hf.space/docs
+
+If you find this resource valuable, please consider starring the Space. Your support strengthens the application for a community GPU grant, which would enhance the Space's capabilities and accessibility.
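The new README advertises an OpenAI-compatible endpoint served by llama-cpp-python's built-in server. As an illustration, here is a minimal sketch of querying it with only the Python standard library; the request fields (`prompt`, `max_tokens`, `temperature`) follow the standard OpenAI `/v1/completions` schema, and `build_completion_request`/`complete` are hypothetical helper names, not part of the Space itself.

```python
# Sketch of calling the Space's OpenAI-compatible completions endpoint.
# BASE_URL comes from the README; helper names below are illustrative.
import json
import urllib.request

BASE_URL = "https://limcheekin-orca_mini_v3_13B-GGML.hf.space/v1"


def build_completion_request(prompt: str, max_tokens: int = 64) -> dict:
    """Build a JSON payload for POST {BASE_URL}/completions."""
    return {"prompt": prompt, "max_tokens": max_tokens, "temperature": 0.7}


def complete(prompt: str) -> str:
    """Send the request to the running Space; requires network access."""
    req = urllib.request.Request(
        f"{BASE_URL}/completions",
        data=json.dumps(build_completion_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI completions response shape.
    return body["choices"][0]["text"]


# Example (requires the Space to be up):
# print(complete("What is an orca?"))
```

The same endpoint can also be used with any OpenAI client library by pointing its base URL at the Space.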