# Deployments

So you've made a really cool chain - now what? How do you deploy it and make it easily shareable with the world?

This section covers several options for that. Note that these are meant as quick deployment options for prototypes and demos, not for production systems. If you are looking for help deploying a production system, please contact us directly.

What follows is a list of template GitHub repositories that are intended to be easy to fork and modify to use with your chain. This is far from an exhaustive list of options, and we are EXTREMELY open to contributions here.

## Streamlit

This repo serves as a template for how to deploy a LangChain app with Streamlit. It implements a chatbot interface. It also contains instructions for how to deploy this app on the Streamlit platform.
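
As a rough sketch of what such an app can look like (an illustration under assumptions, not the template's actual code; it assumes an `OPENAI_API_KEY` environment variable and the `streamlit` and `langchain` packages):

```python
# Minimal sketch, not the template's code: a Streamlit chat UI wrapping a
# LangChain ConversationChain. Assumes OPENAI_API_KEY is set in the environment.
import streamlit as st
from langchain.chains import ConversationChain
from langchain.llms import OpenAI

st.title("LangChain chatbot")

# Streamlit reruns the script on every interaction, so keep the chain (and its
# conversation memory) in session_state.
if "chain" not in st.session_state:
    st.session_state.chain = ConversationChain(llm=OpenAI(temperature=0))

user_input = st.text_input("You:")
if user_input:
    st.write(st.session_state.chain.run(user_input))
```

Saved as `app.py`, this runs locally with `streamlit run app.py`.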

## Gradio (on Hugging Face)

This repo serves as a template for how to deploy a LangChain app with Gradio. It implements a chatbot interface, with a "Bring-Your-Own-Token" approach (nice for not racking up big bills). It also contains instructions for how to deploy this app on the Hugging Face platform. This is heavily influenced by James Weaver's excellent examples.
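
The "Bring-Your-Own-Token" idea can be sketched roughly like this (an illustration under assumptions, not the template's code; the function names and prompt are made up for the example): the visitor pastes their own OpenAI key, so the demo host never pays for their usage.

```python
# Rough sketch of a Bring-Your-Own-Token Gradio app (not the template's code).
# The user's key is passed per request and never stored server-side.
import gradio as gr
from langchain.chains import ConversationChain
from langchain.llms import OpenAI

def chat(api_key: str, message: str) -> str:
    # A fresh chain per call keeps the demo stateless; the visitor's key pays for it.
    chain = ConversationChain(llm=OpenAI(temperature=0, openai_api_key=api_key))
    return chain.run(message)

demo = gr.Interface(
    fn=chat,
    inputs=[
        gr.Textbox(label="Your OpenAI API key", type="password"),
        gr.Textbox(label="Message"),
    ],
    outputs=gr.Textbox(label="Response"),
)

if __name__ == "__main__":
    demo.launch()
```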

## Beam

This repo serves as a template for how to deploy a LangChain app with Beam.

It implements a Question Answering app and contains instructions for deploying the app as a serverless REST API.
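
Beam's own decorators and deployment config are best taken from the repo itself, but the question-answering chain such an endpoint wraps typically looks something like this (a sketch assuming LangChain's `load_qa_chain` helper, not the template's code):

```python
# Sketch of the question-answering chain a serverless endpoint might expose
# (an assumption for illustration, not the Beam template's code).
from langchain.chains.question_answering import load_qa_chain
from langchain.docstore.document import Document
from langchain.llms import OpenAI

qa_chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff")

def answer(question: str, texts: list[str]) -> str:
    # The "stuff" chain type simply stuffs the provided documents into the prompt.
    docs = [Document(page_content=t) for t in texts]
    return qa_chain.run(input_documents=docs, question=question)
```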

## Vercel

A minimal example of how to run LangChain on Vercel using Flask.
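
For reference, a Flask app of that shape looks roughly like the sketch below (illustrative only; the route name and prompt are assumptions, not taken from the example):

```python
# Minimal Flask + LangChain sketch (illustrative, not the example's code).
from flask import Flask, jsonify, request
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

app = Flask(__name__)
chain = LLMChain(
    llm=OpenAI(temperature=0),
    prompt=PromptTemplate(
        input_variables=["question"],
        template="Answer the question concisely: {question}",
    ),
)

@app.route("/ask", methods=["POST"])
def ask():
    question = request.get_json()["question"]
    return jsonify({"answer": chain.run(question)})
```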

## DigitalOcean App Platform

A minimal example of how to deploy LangChain to DigitalOcean App Platform.

## Google Cloud Run

A minimal example of how to deploy LangChain to Google Cloud Run.
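
The Cloud Run-specific detail worth knowing (shown here as a sketch, not the example's code) is that the container must listen on whatever port Cloud Run passes in via the `PORT` environment variable:

```python
# Sketch only: Cloud Run injects the port to bind via the PORT env var
# (default 8080), so the server should read it rather than hard-code one.
import os

from flask import Flask

app = Flask(__name__)
# ... register LangChain-backed routes here, e.g. as in the Vercel sketch above ...

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```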

## Steamship

This repository contains LangChain adapters for Steamship, enabling LangChain developers to rapidly deploy their apps on Steamship. This includes production-ready endpoints, horizontal scaling across dependencies, persistent storage of app state, multi-tenancy support, and more.

## Langchain-serve

This repository allows users to serve local chains and agents as RESTful, gRPC, or WebSocket APIs, thanks to Jina. Deploy your chains & agents with ease and enjoy independent scaling, serverless and autoscaling APIs, as well as a Streamlit playground on Jina AI Cloud.
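
A rough sketch of the usage pattern, assuming the project's `lcserve` decorator API (the repo is the authoritative, up-to-date reference):

```python
# Rough sketch assuming langchain-serve's lcserve decorator API (check the repo
# for the current interface). The decorated function is exposed as an endpoint.
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from lcserve import serving

@serving
def ask(question: str) -> str:
    chain = LLMChain(
        llm=OpenAI(temperature=0),
        prompt=PromptTemplate(input_variables=["question"], template="{question}"),
    )
    return chain.run(question)
```

The decorated app is then deployed locally or to Jina AI Cloud with the project's CLI, per the repo's instructions.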

## BentoML

This repository provides an example of how to deploy a LangChain application with BentoML. BentoML is a framework that enables the containerization of machine learning applications as standard OCI images. BentoML also allows for the automatic generation of OpenAPI and gRPC endpoints. With BentoML, you can integrate models from all popular ML frameworks and deploy them as microservices running on optimal hardware and scaling independently.
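
A sketch of what wrapping a chain as a BentoML service can look like, assuming BentoML's 1.x `Service` / `@svc.api` API (the linked example is the authoritative version; the service name and prompt here are placeholders):

```python
# Sketch of a BentoML 1.x service wrapping a LangChain chain (illustrative,
# not the linked example's code). Saved as service.py and served locally with
# `bentoml serve service:svc`.
import bentoml
from bentoml.io import Text
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

svc = bentoml.Service("langchain-demo")

chain = LLMChain(
    llm=OpenAI(temperature=0),
    prompt=PromptTemplate(input_variables=["question"], template="{question}"),
)

@svc.api(input=Text(), output=Text())
def ask(question: str) -> str:
    # Each API call runs the chain once and returns the raw completion text.
    return chain.run(question)
```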