import gradio as gr
import spaces
import transformers_gradio
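# NOTE (assumption): transformers_gradio here refers to the community
# "transformers-gradio" integration package, which exposes a `registry`
# loader for gr.load(). It must be installed (and listed in the Space's
# requirements.txt) for this import to work.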
# Load *only* your model's interface.
#
# The original snippet loaded three models:
# demo = gr.load(name="deepseek-ai/DeepSeek-R1-Distill-Qwen-32B", src=transformers_gradio.registry)
# demo = gr.load(name="deepseek-ai/DeepSeek-R1", src=transformers_gradio.registry)
# demo = gr.load(name="deepseek-ai/DeepSeek-R1-Zero", src=transformers_gradio.registry)
#
# Here we want the same UI, but backed by *your* model on Hugging Face, so we make a single gr.load(...) call:
#
# IMPORTANT:
# 1) "name" should be the exact repository you want to load.
# 2) If your UI code was stored as a "Space" with an 'app.py' or 'api' in your "wuhp/myr1" repo,
# this approach should pull that same Gradio interface.
# 3) If "transformers_gradio.registry" is correct for your space, keep it.
# Otherwise, you might need "src='spaces'" or a different source, depending on how your space is set up.
demo = gr.load(
    name="wuhp/myr1",
    src=transformers_gradio.registry,  # pass the registry object itself, not a string
)
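# Alternative (untested sketch): if "wuhp/myr1" is actually a Space rather than a
# model repo, Gradio can load the Space's own interface directly instead of going
# through transformers_gradio. The repo id is assumed to stay the same.
# demo = gr.load(name="wuhp/myr1", src="spaces")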
# If you want GPU usage (like the original snippet):
demo.fn = spaces.GPU()(demo.fn)
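# Note (assumption): on ZeroGPU hardware the decorator also accepts a duration
# in seconds if generation needs a longer GPU allocation, e.g.:
# demo.fn = spaces.GPU(duration=120)(demo.fn)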
# Remove API names (like the original snippet):
for fn in demo.fns.values():
    fn.api_name = False
if __name__ == "__main__":
    demo.launch()