diff --git a/docs/docs/guides/local_llms.ipynb b/docs/docs/guides/local_llms.ipynb
index 02011fd8cb..06e087b4f9 100644
--- a/docs/docs/guides/local_llms.ipynb
+++ b/docs/docs/guides/local_llms.ipynb
@@ -69,7 +69,7 @@
     "\n",
     "[`Ollama`](https://ollama.ai/) is one way to easily run inference on macOS.\n",
     " \n",
-    "The instructions [here](docs/integrations/llms/ollama) provide details, which we summarize:\n",
+    "The instructions [here](https://github.com/jmorganca/ollama?tab=readme-ov-file#ollama) provide details, which we summarize:\n",
     " \n",
     "* [Download and run](https://ollama.ai/download) the app\n",
     "* From command line, fetch a model from this [list of options](https://github.com/jmorganca/ollama): e.g., `ollama pull llama2`\n",
@@ -197,10 +197,10 @@
     "\n",
     "### Ollama\n",
     "\n",
-    "With [Ollama](docs/integrations/llms/ollama), fetch a model via `ollama pull <model family>:<tag>`:\n",
+    "With [Ollama](https://github.com/jmorganca/ollama), fetch a model via `ollama pull <model family>:<tag>`:\n",
     "\n",
     "* E.g., for Llama-7b: `ollama pull llama2` will download the most basic version of the model (e.g., smallest # parameters and 4 bit quantization)\n",
-    "* We can also specify a particular version from the [model list](https://github.com/jmorganca/ollama), e.g., `ollama pull llama2:13b`\n",
+    "* We can also specify a particular version from the [model list](https://github.com/jmorganca/ollama?tab=readme-ov-file#model-library), e.g., `ollama pull llama2:13b`\n",
     "* See the full set of parameters on the [API reference page](https://api.python.langchain.com/en/latest/llms/langchain.llms.ollama.Ollama.html)"
    ]
   },
@@ -608,7 +608,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.10.1"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/chat/ollama.ipynb b/docs/docs/integrations/chat/ollama.ipynb
index 7f069112e6..9c7db896d2 100644
--- a/docs/docs/integrations/chat/ollama.ipynb
+++ b/docs/docs/integrations/chat/ollama.ipynb
@@ -533,7 +533,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.16"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/llms/ollama.ipynb b/docs/docs/integrations/llms/ollama.ipynb
index aadc992f2a..969f99ff52 100644
--- a/docs/docs/integrations/llms/ollama.ipynb
+++ b/docs/docs/integrations/llms/ollama.ipynb
@@ -440,7 +440,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.16"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/providers/ollama.mdx b/docs/docs/integrations/providers/ollama.mdx
new file mode 100644
index 0000000000..da10174a16
--- /dev/null
+++ b/docs/docs/integrations/providers/ollama.mdx
@@ -0,0 +1,55 @@
+# Ollama
+
+>[Ollama](https://ollama.ai/) allows you to run open-source large language models,
+> such as LLaMA2, locally.
+>
+>`Ollama` bundles model weights, configuration, and data into a single package, defined by a Modelfile.
+>It optimizes setup and configuration details, including GPU usage.
+>For a complete list of supported models and model variants, see the [Ollama model library](https://ollama.ai/library).
+
+See [this guide](https://python.langchain.com/docs/guides/local_llms#quickstart) for more details
+on how to use `Ollama` with LangChain.
+
+## Installation and Setup
+
+Follow [these instructions](https://github.com/jmorganca/ollama?tab=readme-ov-file#ollama)
+to set up and run a local Ollama instance.
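+
+As a quick check that the local instance is reachable from LangChain, the
+following is a minimal sketch; it assumes the server is running on its default
+port and that a model has already been fetched with `ollama pull llama2`:
+
+```python
+from langchain.llms import Ollama
+
+# Connects to the local Ollama server (default base_url: http://localhost:11434).
+llm = Ollama(model="llama2")
+
+# LLM objects are callable: pass a prompt string, get the completion back.
+print(llm("Why is the sky blue? Answer in one sentence."))
+```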
+
+Ollama serves a local REST API on `http://localhost:11434` by default, and the
+integrations below connect to that endpoint, so no API keys or environment
+variables are required. If your Ollama server runs elsewhere, pass its address
+via the `base_url` parameter.
+
+## LLM
+
+```python
+from langchain.llms import Ollama
+```
+
+See the notebook example [here](/docs/integrations/llms/ollama).
+
+## Chat Models
+
+### Chat Ollama
+
+```python
+from langchain.chat_models import ChatOllama
+```
+
+See the notebook example [here](/docs/integrations/chat/ollama).
+
+### Ollama functions
+
+```python
+from langchain_experimental.llms.ollama_functions import OllamaFunctions
+```
+
+See the notebook example [here](/docs/integrations/chat/ollama_functions).
+
+## Embedding models
+
+```python
+from langchain.embeddings import OllamaEmbeddings
+```
+
+See the notebook example [here](/docs/integrations/text_embedding/ollama).
diff --git a/docs/docs/integrations/text_embedding/ollama.ipynb b/docs/docs/integrations/text_embedding/ollama.ipynb
index eee011fc73..7481214f17 100644
--- a/docs/docs/integrations/text_embedding/ollama.ipynb
+++ b/docs/docs/integrations/text_embedding/ollama.ipynb
@@ -215,7 +215,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.11.5"
+   "version": "3.10.12"
   },
   "vscode": {
    "interpreter": {