langchain-ollama

This package contains the LangChain integration with Ollama.

Installation

pip install -U langchain-ollama

You will also need to run the Ollama server locally. You can download it from https://ollama.com/download.
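As a quick sketch of a typical local setup (assuming the `ollama` CLI from that download is on your PATH), you start the server and pull a model before calling it from LangChain:

```shell
# Start the Ollama server (it listens on http://localhost:11434 by default)
ollama serve

# In another terminal, pull a model so it is available locally
ollama pull llama3
```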

Chat Models

The ChatOllama class exposes chat models from Ollama.

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")

Embeddings

The OllamaEmbeddings class exposes embeddings from Ollama.

from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")

LLMs

The OllamaLLM class exposes LLMs from Ollama.

from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")