icefog72 committed (verified) · Commit c312548 · 1 Parent(s): 162593c

Update Summary for ST Beginners.md

Files changed (1)
  1. Summary for ST Beginners.md +1 -1
Summary for ST Beginners.md CHANGED
@@ -28,7 +28,7 @@ The fun or sad part starts when choosing a backend (the thing that will run the LLM)
 If you have:
 
 * An NVIDIA card newer than a 20xx RTX, then it's a no-brainer - use exl2 models.
- From the get-go, do this - open NVIDIA Control Panel -> Manage 3D Settings -> CUDA - Sysmem Fallback Policy: Prefer No Sysmem Fallback. If you want to know what it does -> Google it, or just do it on a "just trust me bro" basis (never do something like this, never blindly trust a "trust me bro" never).
+ From the get-go, do this - open NVIDIA Control Panel -> Manage 3D Settings -> CUDA - Sysmem Fallback Policy: Prefer No Sysmem Fallback. If you want to know what it does -> Google it, or just do it on a "just trust me bro" basis (never do something like this, never blindly trust a "trust me bro", never).
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/63407b719dbfe0d48b2d763b/nOSAGiBAXFdnizAXp1fuK.png)
 
 Next, we can choose Ooba or TabbyAPI. For me, they are the same in use, but for some people, TabbyAPI is more stable. Ooba is easier to install.
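
The last context line mentions picking Ooba or TabbyAPI as the backend; both expose an OpenAI-compatible HTTP API that SillyTavern then connects to. Below is a minimal sketch for checking that the backend is reachable before pointing SillyTavern at it. The host/port (127.0.0.1:5000) and the optional API key are assumptions - use whatever your backend prints on startup (TabbyAPI generates a key; Ooba usually does not need one).

```python
# Minimal reachability check for a local OpenAI-compatible backend
# (Ooba with its API enabled, or TabbyAPI). Assumed defaults below;
# adjust them to what your backend reports when it starts.
import json
import urllib.request

BASE_URL = "http://127.0.0.1:5000"  # assumed default port
API_KEY = ""                        # set if your backend requires one

def list_models(base_url: str = BASE_URL, api_key: str = API_KEY) -> list[str]:
    """Return the model IDs the backend reports via the standard /v1/models route."""
    req = urllib.request.Request(f"{base_url}/v1/models")
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

if __name__ == "__main__":
    try:
        print("Backend is up, models:", list_models())
    except Exception as exc:  # connection refused usually means the backend isn't running
        print("Could not reach the backend:", exc)
```

If this prints a model ID, the same URL is what you enter as the API endpoint in SillyTavern's connection settings.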