I am getting this error while using load_qa_chain:

ValidationError: 1 validation error for StuffDocumentsChain root document_variable_name context was not found in llm_chain input_variables: ['chat_history', 'user_query', 'relevant_context'] (type=value_error)

I have searched for this error but didn't find anything related to it. Can anyone tell me what I am missing here?
I don't know if HuggingFaceEndpoint is the correct class to use. I'm replacing OpenAI code with a Hugging Face one, and it runs as an app in a Hugging Face Space.
I'm using Hugging Face, and this is the code:
def chat_engine():
    retriever = PDF_KnowledgeBase('./KnowledgeBase').return_retriever_from_persistant_vector_db()
    huggingface_hub = HuggingFaceEndpoint(
        endpoint_url="http://localhost:8010/",
        max_new_tokens=512,
        top_k=10,
        top_p=0.95,
        typical_p=0.95,
        temperature=0.01,
        repetition_penalty=1.03,
        huggingfacehub_api_token="mytoken"
    )
    template = """
    <template content>
    """
    PROMPT = PromptTemplate(template=template, input_variables=['question'])
    document_variable_name = "context"
    # Set up question-answering chain
    qa_chain = RetrievalQA.from_chain_type(
        llm=huggingface_hub,
        chain_type='stuff',
        retriever=retriever,
        chain_type_kwargs={"prompt": PROMPT},
        return_source_documents=False
    )
    return qa_chain
I tried changing all the parameters, but without success.
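For what it's worth, the error itself says the chain's document_variable_name ("context") must appear among the prompt's input_variables. My understanding is that the 'stuff' chain stuffs the retrieved documents into a {context} placeholder, so a prompt declaring only ['question'] can't accept them. A minimal sketch of what a compatible template would have to look like (the wording of the template is my own; only the placeholder names matter):

```python
from string import Formatter

# Hypothetical template: it must declare a {context} placeholder (matching
# document_variable_name) alongside {question} for a 'stuff' chain to work.
template = """Use the following context to answer the question.

Context: {context}

Question: {question}

Answer:"""

# The placeholder names a PromptTemplate would need as input_variables:
variables = [name for _, name, _, _ in Formatter().parse(template) if name]
print(variables)  # ['context', 'question']
```

If this is right, the fix would be passing input_variables=['context', 'question'] to PromptTemplate with a template that actually contains both placeholders, but I haven't confirmed this against my setup.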