groq[patch]: Make stream robust to ToolMessage (#20417)

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent, tool
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_groq import ChatGroq


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        MessagesPlaceholder("agent_scratchpad"),
    ]
)

model = ChatGroq(model_name="mixtral-8x7b-32768", temperature=0)

@tool
def magic_function(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2

tools = [magic_function]


agent = create_tool_calling_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

agent_executor.invoke({"input": "what is the value of magic_function(3)?"})
```
```
> Entering new AgentExecutor chain...

Invoking: `magic_function` with `{'input': 3}`


5The value of magic\_function(3) is 5.

> Finished chain.
{'input': 'what is the value of magic_function(3)?',
 'output': 'The value of magic\\_function(3) is 5.'}
```
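The failure mode this patch guards against can also show up when streaming `ChatGroq` directly over a history that already contains a tool result: the model's final answer is plain text, so its `additional_kwargs` carries no `"tool_calls"` key. The sketch below is illustrative only (it is not part of the PR) and reuses the `model` and `tools` defined in the example above:

```python
from langchain_core.messages import AIMessage, HumanMessage, ToolMessage

# Illustrative reproduction: the history already contains a tool call and its
# ToolMessage result, so the streamed response is plain text with no tool calls.
model_with_tools = model.bind_tools(tools)
messages = [
    HumanMessage("what is the value of magic_function(3)?"),
    AIMessage(
        content="",
        tool_calls=[{"name": "magic_function", "args": {"input": 3}, "id": "call_1"}],
    ),
    ToolMessage("5", tool_call_id="call_1"),
]
for chunk in model_with_tools.stream(messages):
    print(chunk.content, end="", flush=True)
```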

The change swaps the direct `["tool_calls"]` lookup for `.get("tool_calls", [])` in both places where `ChatGroq` builds `tool_call_chunks` while streaming:

```diff
@@ -287,7 +287,7 @@ class ChatGroq(BaseChatModel):
                     "id": rtc.get("id"),
                     "index": rtc.get("index"),
                 }
-                for rtc in message.additional_kwargs["tool_calls"]
+                for rtc in message.additional_kwargs.get("tool_calls", [])
             ]
             chunk_ = ChatGenerationChunk(
                 message=AIMessageChunk(
@@ -358,7 +358,7 @@ class ChatGroq(BaseChatModel):
                     "id": rtc.get("id"),
                     "index": rtc.get("index"),
                 }
-                for rtc in message.additional_kwargs["tool_calls"]
+                for rtc in message.additional_kwargs.get("tool_calls", [])
             ]
             chunk_ = ChatGenerationChunk(
                 message=AIMessageChunk(
```
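In isolation, the guarded lookup means a response without tool calls now yields an empty `tool_call_chunks` list instead of raising a `KeyError`. A minimal sketch of the difference (the variable names here are illustrative, not the actual `ChatGroq` internals):

```python
# A plain-text response (e.g. the final answer after a ToolMessage) has no
# "tool_calls" entry in additional_kwargs.
additional_kwargs: dict = {}

# Before the patch: additional_kwargs["tool_calls"] raises KeyError.
# After the patch: the lookup falls back to an empty list, so the
# comprehension that builds tool_call_chunks simply produces nothing.
tool_call_chunks = [rtc for rtc in additional_kwargs.get("tool_calls", [])]
print(tool_call_chunks)  # -> []
```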
