langchain/docs/modules/llms
Jonathan Pedoeem 606605925d
Adding ability to return_pl_id to all PromptLayer Models in LangChain (#1699)
PromptLayer now supports [several different tracking
features](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
To use any of these features, you need a request id associated with the
request.

In this PR we add a boolean argument, `return_pl_id`, which adds
`pl_request_id` to the `generation_info` dictionary associated with a
generation.

We also updated the relevant documentation.
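A minimal sketch of how the new flag surfaces the request id, based on the PR description. The `llm.generate(...)` call is shown only in comments because it requires real PromptLayer/OpenAI credentials; the `generation_info` value below just mimics the returned shape, and the id `42` is illustrative.

```python
# Hedged sketch: with return_pl_id=True, each generation's
# generation_info dict carries the PromptLayer request id.

# from langchain.llms import PromptLayerOpenAI
# llm = PromptLayerOpenAI(return_pl_id=True)
# result = llm.generate(["Tell me a joke"])
# generation_info = result.generations[0][0].generation_info

generation_info = {"pl_request_id": 42}  # illustrative value only

# The pl_request_id can then be passed to PromptLayer's tracking features.
pl_request_id = generation_info.get("pl_request_id")
print(pl_request_id)
```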
2023-03-16 17:05:23 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| examples | Add Qdrant named arguments (#1386) | 2023-03-02 07:05:14 -08:00 |
| integrations | Adding ability to return_pl_id to all PromptLayer Models in LangChain (#1699) | 2023-03-16 17:05:23 -07:00 |
| async_llm.ipynb | (rfc) chat models (#1424) | 2023-03-06 08:34:24 -08:00 |
| generic_how_to.rst | Harrison/fake llm (#990) | 2023-02-11 15:12:35 -08:00 |
| getting_started.ipynb | Enable streaming for OpenAI LLM (#986) | 2023-02-14 15:06:14 -08:00 |
| how_to_guides.rst | docs: add missing links to toc (#1163) | 2023-02-19 21:15:11 -08:00 |
| integrations.rst | Minor grammatical fixes (#1325) | 2023-03-01 21:18:09 -08:00 |
| key_concepts.md | Minor grammatical fixes (#1325) | 2023-03-01 21:18:09 -08:00 |
| streaming_llm.ipynb | (rfc) chat models (#1424) | 2023-03-06 08:34:24 -08:00 |