LLMs & Prompts
==============

The examples here all highlight how to work with LLMs and prompts.

**LLMs**

`LLM Functionality `_: A walkthrough of all the functionality the standard LLM interface exposes.

`LLM Serialization `_: A walkthrough of how to serialize LLMs to and from disk.

`LLM Caching `_: Covers different types of caches, and how to use a cache to save the results of LLM calls.

`Custom LLM `_: How to create and use a custom LLM class, in case you have an LLM not from one of the standard providers (including one that you host yourself).

**Specific LLM Integrations**

`Huggingface Hub `_: Covers how to connect to LLMs hosted on the Hugging Face Hub.

`Azure OpenAI `_: Covers how to connect to Azure-hosted OpenAI models.

**Prompts**

`Prompt Management `_: A walkthrough of all the functionality LangChain supports for working with prompts.

`Prompt Serialization `_: A walkthrough of how to serialize prompts to and from disk.

`Few Shot Examples `_: How to include examples in the prompt.

`Generate Examples `_: How to use existing examples to generate more examples.

`Custom Example Selector `_: How to create and use a custom ExampleSelector (the class responsible for choosing which examples to use in a prompt).

`Custom Prompt Template `_: How to create and use a custom PromptTemplate, the logic that decides how input variables get formatted into a prompt.

.. toctree::
   :maxdepth: 1
   :glob:
   :hidden:

   prompts/*
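Two of the ideas above, prompt templates and LLM caching, can be illustrated with a minimal sketch in plain Python. This is a conceptual illustration only, not the LangChain API: the names ``format_prompt``, ``cached_llm_call``, and ``fake_llm`` are hypothetical, and the stand-in model exists so the sketch runs offline.

```python
# Conceptual sketch: a prompt template is a string with named input
# variables, and an LLM cache stores results keyed on the exact prompt
# so repeated calls skip the model. Names here are illustrative, not
# LangChain APIs.

TEMPLATE = "What is a good name for a company that makes {product}?"

def format_prompt(product: str) -> str:
    # Format the input variable into the template.
    return TEMPLATE.format(product=product)

_cache: dict[str, str] = {}

def cached_llm_call(prompt: str, llm) -> str:
    # Return the cached result for an identical prompt instead of
    # calling the model a second time.
    if prompt not in _cache:
        _cache[prompt] = llm(prompt)
    return _cache[prompt]

calls = 0
def fake_llm(prompt: str) -> str:
    # Stand-in for a real model so the sketch is runnable offline.
    global calls
    calls += 1
    return f"response to: {prompt}"

first = cached_llm_call(format_prompt("colorful socks"), fake_llm)
second = cached_llm_call(format_prompt("colorful socks"), fake_llm)
print(first == second, calls)  # cache hit: the model was invoked once
```

The linked walkthroughs cover the real implementations: LangChain's prompt classes also handle validation, serialization, and example selection, and its caches can be backed by stores other than an in-process dict.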