Latest commit: de2b038 "Change format sent"

gradio_cached_examples   Show iteration count and time used
20 Bytes                 Assert RunPod env vars are setup before trying to use them
1.72 kB                  Comment out llama-cpp-python installation command in Docker for HuggingFace Space
1.07 kB                  Initial commit
6.1 kB                   Readme: Note on Mistral API used, serverless backend for reliability
3.57 kB                  Moves logging from app.py to chill.py
8.39 kB                  Add log_to_jsonl function to data.py and remove duplicate function from utils.py
1.26 kB                  Change format sent
233 Bytes                Expose json typed LLM interface for RunPod
3.5 kB                   Move prompt strings and types to own file, reorder code a bit
1.17 kB                  Rename serverless test file, set default model to Phi 2 for test, removed jq install, and env vars that are set the same in utils.py already, ignore .cache in git
1.71 kB                  Avoid unneeded imports, make serverless output more sensible, removing some debugging and comments
4.11 kB                  Documents serverless motivation and testing instructions
1.16 kB                  Rename serverless test file, set default model to Phi 2 for test, removed jq install, and env vars that are set the same in utils.py already, ignore .cache in git
3.21 kB                  Add system map and worker architecture details
9.62 kB                  Add log_to_jsonl function to data.py and remove duplicate function from utils.py
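
Two of the commits above mention a log_to_jsonl helper consolidated into data.py. For orientation only, a minimal sketch of what such a JSONL logging helper typically looks like follows; the signature, path argument, and example field names are assumptions for illustration, not the Space's actual code.

import json
from pathlib import Path
from typing import Any


def log_to_jsonl(path: str, record: dict[str, Any]) -> None:
    # Append one record as a single JSON line to a .jsonl file.
    # Hypothetical sketch: the real data.py may use a different signature.
    out = Path(path)
    out.parent.mkdir(parents=True, exist_ok=True)
    with out.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")


# Example usage (hypothetical file name and fields):
# log_to_jsonl("logs/chat.jsonl", {"prompt": "hello", "response": "hi"})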