langchain/libs/partners/together
Yash Mathur c42ec58578
together[minor]: Update endpoint to non deprecated version (#19649)
- **Updating Together.ai Endpoint**: "langchain_together: Updated
Deprecated endpoint for partner package"

- Description: Together's inference API is deprecated, so it was replaced
with the completions endpoint and the corresponding changes were made.
- Twitter handle: @dev_yashmathur

---------

Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
2024-03-31 21:21:46 +00:00
Name                Last commit message                                                   Last commit date
langchain_together  together[minor]: Update endpoint to non deprecated version (#19649)   2024-03-31 21:21:46 +00:00
scripts             infra: add print rule to ruff (#16221)                                 2024-02-09 16:13:30 -08:00
tests               infra: add print rule to ruff (#16221)                                 2024-02-09 16:13:30 -08:00
.gitignore          together: package and embedding model (#14936)                         2023-12-19 18:48:32 -08:00
LICENSE             together: package and embedding model (#14936)                         2023-12-19 18:48:32 -08:00
Makefile            together: package and embedding model (#14936)                         2023-12-19 18:48:32 -08:00
poetry.lock         partners: version constraints (#17492)                                 2024-02-14 08:57:46 -08:00
pyproject.toml      partners: version constraints (#17492)                                 2024-02-14 08:57:46 -08:00
README.md           docs: update Together README.md (#18004)                               2024-03-29 00:02:32 +00:00

langchain-together

This package contains the LangChain integration for Together's generative models.

Installation

pip install -U langchain-together
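
You will also need a Together API key. It can be passed explicitly via the together_api_key parameter (shown in the LLM example below), and, assuming the integration follows the usual LangChain convention of reading a TOGETHER_API_KEY environment variable, you can set it like this:

import os

# Assumed environment variable name; the key itself comes from your Together account.
os.environ["TOGETHER_API_KEY"] = "your-api-key"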

Embeddings

You can use Together's embedding models through the TogetherEmbeddings class.

from langchain_together import TogetherEmbeddings

embeddings = TogetherEmbeddings(
    model="togethercomputer/m2-bert-80M-8k-retrieval"
)
embeddings.embed_query("What is a large language model?")
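
To embed several texts in one call, the standard LangChain Embeddings interface also exposes embed_documents. A minimal sketch, reusing the embeddings object created above:

# Embed a batch of texts; returns one vector per input string.
vectors = embeddings.embed_documents(
    [
        "Together hosts open-source models.",
        "LangChain provides the integration layer.",
    ]
)
print(len(vectors), len(vectors[0]))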

LLMs

You can use Together's generative AI models as LangChain LLMs:

from langchain_together import Together
from langchain_core.prompts import PromptTemplate

llm = Together(
    model="togethercomputer/RedPajama-INCITE-7B-Base",
    temperature=0.7,
    max_tokens=64,
    top_k=1,
    # together_api_key="..."
)

template = """Question: {question}
Answer: """
prompt = PromptTemplate.from_template(template)

chain = prompt | llm

question = "Who was the president in the year Justin Beiber was born?"
print(chain.invoke({"question": question}))
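
Because the LLM is a standard LangChain Runnable, it also exposes stream(); if the integration does not stream natively, the base class falls back to yielding the full completion as a single chunk. A minimal sketch, reusing the llm object from above:

# Print chunks as they arrive (or the whole completion at once if streaming is not supported).
for chunk in llm.stream("List three open-source language models."):
    print(chunk, end="", flush=True)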