---
title: orca_mini_v3_13B-GGML (q5_K_S)
colorFrom: purple
colorTo: blue
sdk: docker
models: TheBloke/orca_mini_v3_13B-GGML
tags:
  - inference api
  - openai-api compatible
  - llama-cpp-python
  - orca_mini_v3_13B
  - ggml
pinned: false
---

# orca_mini_v3_13B-GGML (q5_K_S)

Using the llama-cpp-python package, this Space serves the GGML model from Hugging Face Docker Spaces through an OpenAI-compatible API. Interactive API documentation is included to make integration straightforward.

The API endpoint: https://limcheekin-orca_mini_v3_13B-GGML.hf.space/v1

Explore the API through the following documentation: https://limcheekin-orca_mini_v3_13B-GGML.hf.space/docs
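Because the endpoint follows the OpenAI chat-completions wire format, any OpenAI-compatible client should work against it. Below is a minimal sketch using only the Python standard library; the prompt, `max_tokens`, and `temperature` values are illustrative assumptions, not settings taken from this Space.

```python
# Minimal sketch of calling the Space's OpenAI-compatible API.
# Model name and sampling parameters are illustrative assumptions.
import json
from urllib import request

BASE_URL = "https://limcheekin-orca_mini_v3_13B-GGML.hf.space/v1"


def build_chat_request(prompt, max_tokens=64, temperature=0.7):
    """Build the JSON body for a /chat/completions call."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def chat(prompt, **kwargs):
    """POST the request to the Space and return the assistant's reply."""
    body = json.dumps(build_chat_request(prompt, **kwargs)).encode()
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Name three uses of a paper clip."))
```

Equivalently, an OpenAI SDK client can be pointed at the Space by setting its `base_url` to the endpoint above.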

If you find this resource valuable, please consider starring the Space. Community support strengthens the application for a community GPU grant, which would improve the Space's performance and availability.