---
title: orca_mini_v3_13B-GGML (q5_K_S)
colorFrom: purple
colorTo: blue
sdk: docker
app_file: index.html
models:
  - TheBloke/orca_mini_v3_13B-GGML
tags:
  - inference api
  - openai-api compatible
  - llama-cpp-python
  - orca_mini_v3_13B
  - ggml
pinned: false
---

# orca_mini_v3_13B-GGML (q5_K_S)

This Space hosts the GGML model in a Hugging Face Docker Space, served with the llama-cpp-python package and exposed through an OpenAI-compatible API. The Space also includes full API documentation to make integration straightforward.
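As a minimal sketch of how a client might call the OpenAI-compatible endpoint, assuming the Space URL, model name, and placeholder API key shown below (none of these values are confirmed by this README):

```python
# Sketch: querying the OpenAI-compatible API served by llama-cpp-python
# using the official openai Python client (openai >= 1.0).
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-space-url>/v1",  # assumption: replace with this Space's URL
    api_key="sk-no-key-required",            # placeholder; the server does not validate it by default
)

response = client.chat.completions.create(
    model="orca_mini_v3_13B-GGML",  # assumption: the served model name may differ
    messages=[
        {"role": "user", "content": "Briefly explain what GGML quantization is."}
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

The same request can be made with any OpenAI-compatible client or a plain HTTP POST to `/v1/chat/completions`; only the base URL needs to point at the Space.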

If you find this Space useful, please consider starring it. Stars help support the application for a community GPU grant, which would improve the Space's capabilities and accessibility.