# langchain-ollama

This package contains the LangChain integration with Ollama.

## Installation

```bash
pip install -U langchain-ollama
```

You will also need to run the Ollama server locally. You can download it [here](https://ollama.com/download).

## Chat Models

`ChatOllama` class exposes chat models from Ollama.

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
```

## Embeddings

`OllamaEmbeddings` class exposes embeddings from Ollama.

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
```
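
To embed several texts at once (for example, when indexing documents into a vector store), the standard `embed_documents` method is also available. A rough sketch, assuming the `llama3` model is available locally:

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")

# Embed a batch of documents; returns one vector per input text.
vectors = embeddings.embed_documents(
    [
        "LangChain is a framework for building LLM applications.",
        "Ollama runs open models locally.",
    ]
)
print(len(vectors), len(vectors[0]))
```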

## LLMs

`OllamaLLM` class exposes LLMs from Ollama.

```python
from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")
```