## Update 2023-09-08

This PR now supports further models in addition to Llama-2 chat models. See [this comment](#issuecomment-1668988543) for further details. The title of this PR has been updated accordingly.

## Original PR description

This PR adds a generic `Llama2Chat` model, a wrapper for LLMs able to serve Llama-2 chat models (like `LlamaCPP`, `HuggingFaceTextGenInference`, ...). It implements `BaseChatModel`, converts a list of chat messages into the [required Llama-2 chat prompt format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and forwards the formatted prompt as a `str` to the wrapped `LLM`.

Usage example:

```python
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceTextGenInference
from langchain.memory import ConversationBufferMemory
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
)
from langchain.schema import SystemMessage
from langchain_experimental.chat_models import Llama2Chat

# Uses a locally hosted Llama-2 chat model.
llm = HuggingFaceTextGenInference(
    inference_server_url="http://127.0.0.1:8080/",
    max_new_tokens=512,
    top_k=50,
    temperature=0.1,
    repetition_penalty=1.03,
)

# Wrap llm to support the Llama-2 chat prompt format.
# The resulting model is a chat model.
model = Llama2Chat(llm=llm)

messages = [
    SystemMessage(content="You are a helpful assistant."),
    MessagesPlaceholder(variable_name="chat_history"),
    HumanMessagePromptTemplate.from_template("{text}"),
]
prompt = ChatPromptTemplate.from_messages(messages)

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chain = LLMChain(llm=model, prompt=prompt, memory=memory)

# Use the chat model in a conversation
# ...
```

Also part of this PR are tests and a demo notebook.

- Tag maintainer: @hwchase17
- Twitter handle: `@mrt1nz`

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
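For reference, the Llama-2 chat prompt format that `Llama2Chat` targets (described in the Hugging Face post linked above) wraps the system prompt in `<<SYS>>` tags and each user turn in `[INST]` tags. The sketch below is purely illustrative of that format and is not the actual `Llama2Chat` implementation; the helper name `to_llama2_prompt` is hypothetical.

```python
from typing import List

from langchain.schema import AIMessage, BaseMessage, HumanMessage, SystemMessage

B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"


def to_llama2_prompt(messages: List[BaseMessage]) -> str:
    """Illustrative conversion of chat messages to the Llama-2 chat prompt format.

    Simplified sketch only; the real wrapper handles more cases
    (validation, multi-turn histories, etc.).
    """
    system = ""
    if messages and isinstance(messages[0], SystemMessage):
        # The system prompt is folded into the first user instruction.
        system = B_SYS + messages[0].content + E_SYS
        messages = messages[1:]

    parts = []
    for msg in messages:
        if isinstance(msg, HumanMessage):
            parts.append(f"<s>{B_INST} {system}{msg.content} {E_INST}")
            system = ""  # only prepended to the first user turn
        elif isinstance(msg, AIMessage):
            parts.append(f" {msg.content} </s>")
    return "".join(parts)


print(to_llama2_prompt([
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]))
# <s>[INST] <<SYS>>
# You are a helpful assistant.
# <</SYS>>
#
# What is the capital of France? [/INST]
```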