# Custom LLM Agent (with a ChatModel)

This notebook goes through how to create your own custom agent based on a chat model.

An LLM chat agent consists of four parts:

- PromptTemplate: the prompt template that instructs the language model on what to do
- ChatModel: the chat model that powers the agent
- `stop` sequence: instructs the LLM to stop generating as soon as this string is found
- OutputParser: determines how to parse the LLM output into an `AgentAction` or `AgentFinish` object (a minimal parser sketch follows after the import below)

import Example from "@snippets/modules/agents/how_to/custom_llm_chat_agent.mdx"
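As a rough sketch of the last piece, a custom output parser might look like the following. This assumes LangChain's `AgentOutputParser` base class and the `AgentAction`/`AgentFinish` schema objects, and it assumes a ReAct-style prompt that emits `Action:`/`Action Input:` lines and a `Final Answer:` marker; the actual prompt and parser for this agent live in the snippet imported above.

```python
import re
from typing import Union

from langchain.agents import AgentOutputParser
from langchain.schema import AgentAction, AgentFinish


class CustomOutputParser(AgentOutputParser):
    """Sketch of a parser for a ReAct-style chat prompt (assumed format)."""

    def parse(self, llm_output: str) -> Union[AgentAction, AgentFinish]:
        # If the model signals it is done, wrap the answer in an AgentFinish
        if "Final Answer:" in llm_output:
            return AgentFinish(
                return_values={"output": llm_output.split("Final Answer:")[-1].strip()},
                log=llm_output,
            )
        # Otherwise extract the tool name and its input from the generation
        regex = r"Action\s*\d*\s*:(.*?)\nAction\s*\d*\s*Input\s*\d*\s*:[\s]*(.*)"
        match = re.search(regex, llm_output, re.DOTALL)
        if not match:
            raise ValueError(f"Could not parse LLM output: `{llm_output}`")
        action = match.group(1).strip()
        action_input = match.group(2).strip(" ").strip('"')
        return AgentAction(tool=action, tool_input=action_input, log=llm_output)
```

The `stop` sequence (for example `"\nObservation:"` in a ReAct-style setup) is what keeps the chat model from hallucinating a tool's observation, so the parser above only ever sees the text up to the point where a tool should run.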