# Log10
This page covers how to use [Log10](https://log10.io) within LangChain.
## What is Log10?
Log10 is an [open source](https://github.com/log10-io/log10) proxiless LLM data management and application development platform that lets you log, debug and tag your Langchain calls.
## Quick start
1. Create your free account at [log10.io](https://log10.io)
2. Add your `LOG10_TOKEN` and `LOG10_ORG_ID`, from the Settings and Organization tabs respectively, as environment variables.
3. Also add `LOG10_URL=https://log10.io` and your usual LLM API key (e.g. `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`) to your environment, as sketched below.
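
A minimal sketch of setting these variables from Python rather than your shell, using placeholder values you would replace with your own credentials:

```python
import os

# Placeholder values: copy your real token and organization ID from the
# Settings and Organization tabs at log10.io.
os.environ["LOG10_TOKEN"] = "<your-log10-token>"
os.environ["LOG10_ORG_ID"] = "<your-log10-org-id>"
os.environ["LOG10_URL"] = "https://log10.io"

# Your usual LLM API key, e.g. for OpenAI or Anthropic.
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"
# os.environ["ANTHROPIC_API_KEY"] = "<your-anthropic-api-key>"
```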
## How to enable Log10 data management for Langchain
Integrating with Log10 is a simple one-line `log10_callback` setup, as shown below:
```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

from log10.langchain import Log10Callback
from log10.llm import Log10Config

log10_callback = Log10Callback(log10_config=Log10Config())

messages = [
    HumanMessage(content="You are a ping pong machine"),
    HumanMessage(content="Ping?"),
]

llm = ChatOpenAI(model_name="gpt-3.5-turbo", callbacks=[log10_callback])
```
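
With the callback registered, calls made through `llm` are captured by Log10. For example (mirroring the tagging example below), a completion can be requested as usual:

```python
completion = llm.predict_messages(messages)
print(completion)
```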
[Log10 + Langchain + Logs docs](https://github.com/log10-io/log10/blob/main/logging.md#langchain-logger)
[More details + screenshots](https://log10.io/docs/logs), including instructions for self-hosting logs.
## How to use tags with Log10
```python
from langchain import OpenAI
from langchain.chat_models import ChatAnthropic
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

from log10.langchain import Log10Callback
from log10.llm import Log10Config

log10_callback = Log10Callback(log10_config=Log10Config())

messages = [
    HumanMessage(content="You are a ping pong machine"),
    HumanMessage(content="Ping?"),
]

# Tags can be attached when the model is constructed...
llm = ChatOpenAI(model_name="gpt-3.5-turbo", callbacks=[log10_callback], temperature=0.5, tags=["test"])
# ...and again on each individual call.
completion = llm.predict_messages(messages, tags=["foobar"])
print(completion)

llm = ChatAnthropic(model="claude-2", callbacks=[log10_callback], temperature=0.7, tags=["baz"])
completion = llm.predict_messages(messages)
print(completion)

llm = OpenAI(model_name="text-davinci-003", callbacks=[log10_callback], temperature=0.5)
completion = llm.predict("You are a ping pong machine.\nPing?\n")
print(completion)
```
You can also intermix direct OpenAI calls and Langchain LLM calls:
```python
import os

from log10.load import log10, log10_session
import openai
from langchain import OpenAI

log10(openai)

with log10_session(tags=["foo", "bar"]):
    # Log a direct OpenAI call
    response = openai.Completion.create(
        model="text-ada-001",
        prompt="Where is the Eiffel Tower?",
        temperature=0,
        max_tokens=1024,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0,
    )
    print(response)

    # Log a call via Langchain
    llm = OpenAI(model_name="text-ada-001", temperature=0.5)
    response = llm.predict("You are a ping pong machine.\nPing?\n")
    print(response)
```
## How to debug Langchain calls
[Example of debugging](https://log10.io/docs/prompt_chain_debugging)
[More Langchain examples](https://github.com/log10-io/log10/tree/main/examples#langchain)