# PromptLayer
This page covers how to use [PromptLayer](https://www.promptlayer.com) within LangChain.
It is broken into two parts: installation and setup, and then references to specific PromptLayer wrappers.
## Installation and Setup
If you want to work with PromptLayer:
- Install the promptlayer Python library: `pip install promptlayer`
- Create a PromptLayer account
- Create an API token and set it as an environment variable (`PROMPTLAYER_API_KEY`), as shown below
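
For example, a minimal way to set the key from Python (the value shown is a placeholder for your own key):

```python
import os

# Make the PromptLayer API key available to the wrappers below.
# Replace the placeholder with your own key from the PromptLayer dashboard.
os.environ["PROMPTLAYER_API_KEY"] = "<your-promptlayer-api-key>"
```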
## Wrappers
### LLM
There exists a PromptLayer OpenAI LLM wrapper, which you can access with
```python
from langchain.llms import PromptLayerOpenAI
```
To tag your requests, use the argument `pl_tags` when instantiating the LLM
```python
from langchain.llms import PromptLayerOpenAI
llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"])
```
To get the PromptLayer request id, use the argument `return_pl_id` when instantiating the LLM
```python
from langchain.llms import PromptLayerOpenAI
llm = PromptLayerOpenAI(return_pl_id=True)
```
This will add the PromptLayer request ID to the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate`.
For example:
```python
llm_results = llm.generate(["hello world"])
for res in llm_results.generations:
    print("pl request id: ", res[0].generation_info["pl_request_id"])
```
You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
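
As a minimal sketch, the `promptlayer` library's `track` helpers can attach this information to a logged request (the score value and metadata keys below are illustrative):

```python
import promptlayer
from langchain.llms import PromptLayerOpenAI

llm = PromptLayerOpenAI(return_pl_id=True)
llm_results = llm.generate(["hello world"])

# Grab the PromptLayer request ID from the first generation.
pl_request_id = llm_results.generations[0][0].generation_info["pl_request_id"]

# Attach a score (0-100) to the logged request.
promptlayer.track.score(request_id=pl_request_id, score=100)

# Attach arbitrary metadata to the logged request (keys/values are illustrative).
promptlayer.track.metadata(request_id=pl_request_id, metadata={"user_id": "example-user"})
```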
This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai.html) LLM, except that
- all your requests will be logged to your PromptLayer account
- you can add `pl_tags` when instantiating to tag your requests on PromptLayer
- you can add `return_pl_id` when instantiating to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/integrations/chat/promptlayer_chatopenai.html) and `PromptLayerOpenAIChat`.
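
As a minimal sketch, the chat wrapper accepts the same `pl_tags` and `return_pl_id` arguments as the LLM wrapper (the tags and message below are illustrative):

```python
from langchain.chat_models import PromptLayerChatOpenAI
from langchain.schema import HumanMessage

# Tag chat requests on PromptLayer, just like the LLM wrapper above.
chat = PromptLayerChatOpenAI(pl_tags=["langchain-requests", "chatbot"])
chat([HumanMessage(content="Hello, how are you?")])
```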