
# PromptLayer

This page covers how to use PromptLayer within LangChain. It is broken into two parts: installation and setup, and then references to specific PromptLayer wrappers.

## Installation and Setup

If you want to work with PromptLayer:

- Install the `promptlayer` python library: `pip install promptlayer`
- Create a PromptLayer account
- Create an API token and set it as an environment variable (`PROMPTLAYER_API_KEY`)
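
The wrappers read the key from the environment. A minimal sketch of the last step, done from Python instead of the shell (the key value below is a placeholder, not a real token):

```python
import os

# Placeholder value -- substitute the API token from your PromptLayer account.
os.environ["PROMPTLAYER_API_KEY"] = "pl_test_placeholder"

print(os.environ["PROMPTLAYER_API_KEY"])
```

In practice you would usually export the variable in your shell profile or deployment config rather than hard-code it in source.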

## Wrappers

### LLM

There exists a PromptLayer OpenAI LLM wrapper, which you can access with

```python
from langchain.llms import PromptLayerOpenAI
```

To tag your requests, use the argument `pl_tags` when instantiating the LLM:

```python
from langchain.llms import PromptLayerOpenAI
llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"])
```

To get the PromptLayer request id, use the argument `return_pl_id` when instantiating the LLM:

```python
from langchain.llms import PromptLayerOpenAI
llm = PromptLayerOpenAI(return_pl_id=True)
```

This will add the PromptLayer request ID in the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate`.

For example:

```python
llm_results = llm.generate(["hello world"])
for res in llm_results.generations:
    print("pl request id: ", res[0].generation_info["pl_request_id"])
```

You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. Read more about it here.
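
As a sketch of that step, the `promptlayer` python library exposes a `track` module for attaching feedback to a logged request; the request id below is a hypothetical value standing in for one captured from `generation_info` (assumes the library's `track.score` call, per PromptLayer's own documentation):

```python
import promptlayer

# Hypothetical request id, captured from generation_info as shown above.
pl_request_id = 12345

# Attach a score (0-100) to the logged request. This call comes from the
# promptlayer library itself, not from LangChain.
promptlayer.track.score(request_id=pl_request_id, score=100)
```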

This LLM is identical to the OpenAI LLM, except that

- all your requests will be logged to your PromptLayer account
- you can add `pl_tags` when instantiating to tag your requests on PromptLayer
- you can add `return_pl_id` when instantiating to return a PromptLayer request id to use while tracking requests.
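
The two arguments compose. A hedged sketch combining them (assumes valid `OPENAI_API_KEY` and `PROMPTLAYER_API_KEY` environment variables are set, so this will issue a real API call when run):

```python
from langchain.llms import PromptLayerOpenAI

# Tag the requests on PromptLayer and also return the request id.
llm = PromptLayerOpenAI(
    pl_tags=["langchain-requests", "chatbot"],
    return_pl_id=True,
)

llm_results = llm.generate(["hello world"])
for res in llm_results.generations:
    print("pl request id:", res[0].generation_info["pl_request_id"])
```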

PromptLayer also provides native wrappers for `PromptLayerChatOpenAI` and `PromptLayerOpenAIChat`.