# Baseten

Learn how to use LangChain with models deployed on Baseten.

## Installation and setup

- Create a Baseten account and API key.
- Install the Baseten Python client with `pip install baseten`.
- Use your API key to authenticate by running `baseten login`, or log in from Python as shown in the sketch below this list.
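
As an alternative to the CLI, you can authenticate from a script or notebook. This is a minimal sketch, assuming the `baseten.login` helper from the Baseten client and a placeholder API key:

```python
import baseten

# Authenticate this session with your Baseten API key.
# Replace the placeholder with the key from your account settings.
baseten.login("YOUR_API_KEY")
```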

## Invoking a model

Baseten integrates with LangChain through the LLM module, which provides a standardized and interoperable interface for models deployed on your Baseten workspace.

You can deploy foundation models like WizardLM and Alpaca with one click from the Baseten model library, or deploy your own model by following Baseten's deployment tutorial.

In this example, we'll work with WizardLM. Deploy WizardLM from the Baseten model library and follow along using the deployed model's version ID.

```python
from langchain.llms import Baseten

# Replace MODEL_VERSION_ID with your deployed model's version ID
wizardlm = Baseten(model="MODEL_VERSION_ID", verbose=True)

# Call the model directly with a prompt
wizardlm("What is the difference between a Wizard and a Sorcerer?")
```