# langchain-together

This package contains the LangChain integration for Together's generative models.

## Installation

```bash
pip install -U langchain-together
```
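To authenticate, you need a Together API key. A minimal setup sketch, assuming the integration reads the key from the `TOGETHER_API_KEY` environment variable (it can also be passed directly via the `together_api_key` parameter, as shown in the LLM example below):

```python
import getpass
import os

# Assumption: the classes in this package pick up the API key from the
# TOGETHER_API_KEY environment variable when it is not passed explicitly.
if "TOGETHER_API_KEY" not in os.environ:
    os.environ["TOGETHER_API_KEY"] = getpass.getpass("Together API key: ")
```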

## Embeddings

You can use Together's embedding models through the `TogetherEmbeddings` class.

```python
from langchain_together import TogetherEmbeddings

embeddings = TogetherEmbeddings(
    model="togethercomputer/m2-bert-80M-8k-retrieval",
)
embeddings.embed_query("What is a large language model?")
```
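The same class can embed several texts in one call through the standard LangChain `embed_documents` method. A short sketch (the input texts are illustrative):

```python
from langchain_together import TogetherEmbeddings

embeddings = TogetherEmbeddings(
    model="togethercomputer/m2-bert-80M-8k-retrieval",
)

# embed_documents returns one embedding vector per input text
vectors = embeddings.embed_documents(
    [
        "Large language models are trained on massive text corpora.",
        "Retrieval-augmented generation combines search with generation.",
    ]
)
print(len(vectors), len(vectors[0]))
```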

## LLMs

You can use Together's generative AI models as LangChain LLMs:

```python
from langchain_together import Together
from langchain_core.prompts import PromptTemplate

llm = Together(
    model="togethercomputer/RedPajama-INCITE-7B-Base",
    temperature=0.7,
    max_tokens=64,
    top_k=1,
    # together_api_key="..."
)

template = """Question: {question}
Answer: """
prompt = PromptTemplate.from_template(template)

chain = prompt | llm

question = "Who was the president in the year Justin Bieber was born?"
print(chain.invoke({"question": question}))
```
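Because the chain is a standard LangChain runnable, you can also stream the completion instead of waiting for the full answer. A brief sketch reusing the chain from above:

```python
# Stream the model output chunk by chunk as it is generated
for chunk in chain.stream({"question": question}):
    print(chunk, end="", flush=True)
```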