From a05230a4ba4dee591d3810440ce65e16860956ae Mon Sep 17 00:00:00 2001 From: Leonid Ganeline Date: Thu, 7 Dec 2023 15:48:10 -0800 Subject: [PATCH] docs[patch]: `promptlayer` pages update (#14416) Updated the provider page by adding LLM and chat model references; removed content duplicated from the referenced LLM page. Updated the callback page --- .../integrations/callbacks/promptlayer.ipynb | 9 +-- .../integrations/providers/promptlayer.mdx | 58 +++++++++---------- 2 files changed, 34 insertions(+), 33 deletions(-) diff --git a/docs/docs/integrations/callbacks/promptlayer.ipynb b/docs/docs/integrations/callbacks/promptlayer.ipynb index 6493839666..229b387486 100644 --- a/docs/docs/integrations/callbacks/promptlayer.ipynb +++ b/docs/docs/integrations/callbacks/promptlayer.ipynb @@ -7,12 +7,13 @@ "source": [ "# PromptLayer\n", "\n", + ">[PromptLayer](https://docs.promptlayer.com/introduction) is a platform for prompt engineering. It also helps with LLM observability, letting you visualize requests, version prompts, and track usage.\n", + ">\n", + ">While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g. [`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), using a callback is the recommended way to integrate `PromptLayer` with LangChain.\n", "\n", - ">[PromptLayer](https://promptlayer.com) is a an LLM observability platform that lets you visualize requests, version prompts, and track usage. In this guide we will go over how to setup the `PromptLayerCallbackHandler`. \n", + "In this guide, we will go over how to set up the `PromptLayerCallbackHandler`. \n", "\n", - "While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g. 
[`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), this callback is the recommended way to integrate PromptLayer with LangChain.\n", - "\n", - "See [our docs](https://docs.promptlayer.com/languages/langchain) for more information." + "See [PromptLayer docs](https://docs.promptlayer.com/languages/langchain) for more information." ] }, { diff --git a/docs/docs/integrations/providers/promptlayer.mdx b/docs/docs/integrations/providers/promptlayer.mdx index 3f2a74ffc0..44c724d550 100644 --- a/docs/docs/integrations/providers/promptlayer.mdx +++ b/docs/docs/integrations/providers/promptlayer.mdx @@ -1,49 +1,49 @@ # PromptLayer -This page covers how to use [PromptLayer](https://www.promptlayer.com) within LangChain. -It is broken into two parts: installation and setup, and then references to specific PromptLayer wrappers. +>[PromptLayer](https://docs.promptlayer.com/introduction) is a platform for prompt engineering. > It also helps with LLM observability, letting you visualize requests, version prompts, and track usage. > +>While `PromptLayer` does have LLMs that integrate directly with LangChain (e.g. +> [`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), +> using a callback is the recommended way to integrate `PromptLayer` with LangChain. 
## Installation and Setup -If you want to work with PromptLayer: -- Install the promptlayer python library `pip install promptlayer` -- Create a PromptLayer account +To work with `PromptLayer`, you need to: - Create a `PromptLayer` account - Create an api token and set it as an environment variable (`PROMPTLAYER_API_KEY`) -## Wrappers +Install the Python package: -### LLM - -There exists an PromptLayer OpenAI LLM wrapper, which you can access with -```python -from langchain.llms import PromptLayerOpenAI +```bash +pip install promptlayer ``` -To tag your requests, use the argument `pl_tags` when initializing the LLM + +## Callback + +See a [usage example](/docs/integrations/callbacks/promptlayer). + ```python -from langchain.llms import PromptLayerOpenAI -llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"]) +import promptlayer # Don't forget this import! +from langchain.callbacks import PromptLayerCallbackHandler ``` -To get the PromptLayer request id, use the argument `return_pl_id` when initializing the LLM + +## LLM + +See a [usage example](/docs/integrations/llms/promptlayer_openai). + ```python from langchain.llms import PromptLayerOpenAI -llm = PromptLayerOpenAI(return_pl_id=True) ``` -This will add the PromptLayer request ID in the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate` -For example: -```python -llm_results = llm.generate(["hello world"]) -for res in llm_results.generations: - print("pl request id: ", res[0].generation_info["pl_request_id"]) -``` -You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9). 
-This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai) LLM, except that -- all your requests will be logged to your PromptLayer account -- you can add `pl_tags` when instantiating to tag your requests on PromptLayer -- you can add `return_pl_id` when instantiating to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9). +## Chat Models +See a [usage example](/docs/integrations/chat/promptlayer_chatopenai). + +```python +from langchain.chat_models import PromptLayerChatOpenAI +``` -PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/integrations/chat/promptlayer_chatopenai) and `PromptLayerOpenAIChat`
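The callback-based integration this patch documents follows a plain observer pattern: a handler object is passed to the LLM and notified around each request, which is how PromptLayer logs requests for observability. The following stdlib-only sketch illustrates that pattern under stated assumptions; `RecordingHandler` and `FakeLLM` are hypothetical stand-ins, and the real handler is `PromptLayerCallbackHandler` from `langchain.callbacks` as shown in the diff above.

```python
# Sketch of the callback pattern described in the updated docs.
# RecordingHandler and FakeLLM are hypothetical stand-ins, not LangChain
# or PromptLayer classes; the real integration passes
# PromptLayerCallbackHandler(pl_tags=[...]) in the LLM's callbacks list.

class RecordingHandler:
    """Records every request with its tags, analogous to how
    PromptLayer tracks usage and lets you tag requests."""

    def __init__(self, tags=None):
        self.tags = tags or []
        self.requests = []

    def on_llm_start(self, prompt):
        # Called before the model runs: log the outgoing prompt.
        self.requests.append({"prompt": prompt, "tags": list(self.tags)})

    def on_llm_end(self, prompt, response):
        # Called after the model runs: attach the response to the log entry.
        self.requests[-1]["response"] = response


class FakeLLM:
    """Stand-in LLM that notifies its callbacks around each call."""

    def __init__(self, callbacks=()):
        self.callbacks = list(callbacks)

    def generate(self, prompt):
        for cb in self.callbacks:
            cb.on_llm_start(prompt)
        response = f"echo: {prompt}"  # a real LLM would call the model API here
        for cb in self.callbacks:
            cb.on_llm_end(prompt, response)
        return response


handler = RecordingHandler(tags=["langchain", "example"])
llm = FakeLLM(callbacks=[handler])
llm.generate("hello world")
print(handler.requests[0]["prompt"])  # -> hello world
```

Because the handler sits outside the LLM class, the same observability hook works for any model wrapper, which is why the docs recommend the callback over the model-specific `PromptLayerOpenAI` wrapper.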