Building a YouTube Channel Q&A Bot with the GPT-3.5 Model
After the ChatGPT API (i.e. the GPT-3.5-Turbo model) came out, it quickly became popular for being cheap and capable, so LangChain added dedicated chains and chat models for it. Let's walk through this example to see how to use them.
```python
from langchain.document_loaders import YoutubeLoader
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

# Load the YouTube video
loader = YoutubeLoader.from_youtube_url('https://www.youtube.com/watch?v=Dj60HHy-Kqk')

# Convert the data into documents
documents = loader.load()

# Initialize the text splitter
text_splitter = RecursiveCharacterTextSplitter(
    chunk_size=1000,
    chunk_overlap=20
)

# Split the YouTube documents
documents = text_splitter.split_documents(documents)

# Initialize OpenAI embeddings
embeddings = OpenAIEmbeddings()

# Store the data in the vector store
vector_store = Chroma.from_documents(documents, embeddings)

# Create a retriever from the vector store
retriever = vector_store.as_retriever()

system_template = """Use the following context to answer the user's question.
If you don't know the answer, say you don't, don't try to make it up. And answer in Chinese.
-----------
{question}
-----------
{chat_history}
"""

# Build the initial messages list; think of this as the messages
# parameter passed to the OpenAI API
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    HumanMessagePromptTemplate.from_template('{question}')
]

# Initialize the prompt object
prompt = ChatPromptTemplate.from_messages(messages)

# Initialize the question-answering chain
qa = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(temperature=0.1, max_tokens=2048),
    retriever,
    condense_question_prompt=prompt,
)

chat_history = []
while True:
    question = input('Question: ')
    # Ask the question; chat_history is a required argument,
    # used to store the conversation history
    result = qa({'question': question, 'chat_history': chat_history})
    chat_history.append((question, result['answer']))
    print(result['answer'])
```
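The splitter settings above matter: `chunk_size` caps each chunk's length, while `chunk_overlap` repeats a small tail of each chunk at the start of the next so that sentences straddling a boundary are not cut off from their context. A simplified, character-based sketch of the idea (the real `RecursiveCharacterTextSplitter` additionally tries to split on separators such as paragraphs and sentences first, so this is only an illustration):

```python
def split_with_overlap(text: str, chunk_size: int, chunk_overlap: int) -> list:
    """Naive fixed-size splitter: each chunk starts chunk_size - chunk_overlap
    characters after the previous one, so adjacent chunks share an overlap."""
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = split_with_overlap("abcdefghij" * 3, chunk_size=10, chunk_overlap=2)
# The last 2 characters of one chunk reappear at the start of the next
print(chunks[0][-2:], chunks[1][:2])  # both print "ij"
```

A larger overlap preserves more cross-boundary context at the cost of storing and embedding more duplicated text.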
As you can see, it can answer questions about this YouTube video quite accurately.

Using streaming responses is also convenient:
```python
from langchain.callbacks.base import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

chat = ChatOpenAI(
    streaming=True,
    callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
    verbose=True,
    temperature=0,
)
resp = chat(chat_prompt_with_values.to_messages())
```
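Under the hood, streaming works through the callback handler's `on_llm_new_token` hook, which fires once per generated token; `StreamingStdOutCallbackHandler` simply prints each token as it arrives. A dependency-free sketch of that pattern, with a hypothetical `fake_stream` standing in for the model:

```python
class CollectingHandler:
    """Minimal stand-in for a streaming callback handler: prints each
    token as it arrives and also accumulates the full text."""
    def __init__(self):
        self.text = ""

    def on_llm_new_token(self, token: str) -> None:
        print(token, end="", flush=True)  # show the token immediately
        self.text += token

def fake_stream(handler, tokens):
    # A real chat model would invoke the handler as tokens are generated
    for token in tokens:
        handler.on_llm_new_token(token)

handler = CollectingHandler()
fake_stream(handler, ["Hel", "lo", ", ", "world"])
assert handler.text == "Hello, world"
```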
The content above is copyrighted by liaokongVFX or its affiliates. To follow or sponsor the content or its related open-source project, please visit liaokongVFX.