database connection? #4
opened by nobitha
Potentially yes, using something like LangChain. But you would need to investigate how to use a GPTQ model with LangChain; I've not looked into that myself yet.
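The database side would probably go through LangChain's SQLDatabase and SQLDatabaseChain. A rough, untested sketch of what that could look like (the sqlite connection string and the question are placeholders, llm would be the GPTQ model wrapped as a LangChain LLM as in the reply below, and in newer LangChain versions SQLDatabaseChain has moved to langchain_experimental.sql):

from langchain import SQLDatabase, SQLDatabaseChain

# llm is assumed to be a LangChain LLM object, e.g. a HuggingFacePipeline around a GPTQ model
db = SQLDatabase.from_uri("sqlite:///example.db")  # placeholder connection string
db_chain = SQLDatabaseChain.from_llm(llm=llm, db=db, verbose=True)
db_chain.run("How many rows are in the users table?")  # placeholder question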
Hi @nobitha, I didn't connect to a database, but I did do the first step of @TheBloke's idea (using a GPTQ model within Langchain); see below for a rough code snippet (no parameters, device or model name set, you can do that as described in the readme). Maybe you can continue from there.
# model_name_or_path and model_basename as described in the readme
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer, pipeline
from langchain import HuggingFacePipeline

# load the quantized model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path, model_basename=model_basename)

# wrap the model in a transformers pipeline and hand it to LangChain
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
llm = HuggingFacePipeline(pipeline=pipe)
The llm object can then be used in any Langchain function which needs an llm as input, e.g. LLMChain or ConversationChain. Maybe that helps, or someone has a more direct way?
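For example, a minimal LLMChain on top of the llm object from the snippet above (the prompt template and the question are just placeholders):

from langchain import LLMChain, PromptTemplate

# simple single-variable prompt fed to the quantized model
prompt = PromptTemplate(input_variables=["question"], template="Question: {question}\nAnswer:")
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run(question="What does GPTQ quantization do?"))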