Bagatur 9514bc4d67
core[minor], ...: add tool calls message (#18947)
core[minor], langchain[patch], openai[minor], anthropic[minor], fireworks[minor], groq[minor], mistralai[minor]

```python
# Defined in langchain_core.messages (typing imports shown for context):
from typing import Any, Dict, List, Optional
from typing_extensions import TypedDict


class ToolCall(TypedDict):
    name: str
    args: Dict[str, Any]
    id: Optional[str]

class InvalidToolCall(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    error: Optional[str]

class ToolCallChunk(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    index: Optional[int]


class AIMessage(BaseMessage):
    ...
    tool_calls: List[ToolCall] = []
    invalid_tool_calls: List[InvalidToolCall] = []
    ...


class AIMessageChunk(AIMessage, BaseMessageChunk):
    ...
    tool_call_chunks: Optional[List[ToolCallChunk]] = None
    ...
```
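The split between `ToolCall.args: Dict` and `InvalidToolCall.args: Optional[str]` reflects the parsing fallback: when a provider returns an argument string that is not valid JSON, the raw string and the error are preserved rather than discarded. A minimal stdlib-only sketch of that fallback (`parse_tool_call` is a hypothetical helper for illustration, not langchain code):

```python
import json
from typing import Any, Dict, Optional, TypedDict, Union


class ToolCall(TypedDict):
    name: str
    args: Dict[str, Any]
    id: Optional[str]


class InvalidToolCall(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    error: Optional[str]


def parse_tool_call(
    name: str, raw_args: str, id: Optional[str]
) -> Union[ToolCall, InvalidToolCall]:
    """Parse a provider's raw argument string; fall back to InvalidToolCall."""
    try:
        args = json.loads(raw_args)
    except json.JSONDecodeError as e:
        # Keep the unparseable string and the error message for debugging.
        return InvalidToolCall(name=name, args=raw_args, id=id, error=str(e))
    return ToolCall(name=name, args=args, id=id)


print(parse_tool_call("get_weather", '{"city": "SF"}', "call_1"))
print(parse_tool_call("get_weather", '{"city": ', "call_2"))
```

This way a malformed call still surfaces on the message (in `invalid_tool_calls`) instead of raising mid-stream.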
Important considerations:
- Parsing logic occurs within different providers;
- ~~Changing output type is a breaking change for anyone doing explicit type checking;~~
- ~~LangSmith rendering will need to be updated: https://github.com/langchain-ai/langchainplus/pull/3561~~
- ~~LangServe will need to be updated~~
- Adding chunks:
  - ~~AIMessage + ToolCallsMessage = ToolCallsMessage if either has non-null .tool_calls.~~
  - Tool call chunks are appended, merging when they have equal values of `index`.
  - additional_kwargs accumulate the normal way.
- During streaming:
  - ~~Messages can change types (e.g., from AIMessageChunk to AIToolCallsMessageChunk)~~
  - Output parsers parse additional_kwargs (during .invoke they read off tool calls).
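The chunk-merging rule above (append, but merge chunks that share an `index`, concatenating their partial `args` strings) can be sketched as follows. This is an illustrative stdlib-only version, not the actual `langchain_core` implementation:

```python
from typing import List, Optional, TypedDict


class ToolCallChunk(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    index: Optional[int]


def merge_tool_call_chunks(
    left: List[ToolCallChunk], right: List[ToolCallChunk]
) -> List[ToolCallChunk]:
    """Append chunks, merging any pair with equal non-null `index` values."""
    merged = [dict(c) for c in left]
    for chunk in right:
        match = next(
            (m for m in merged
             if m["index"] is not None and m["index"] == chunk["index"]),
            None,
        )
        if match is None:
            merged.append(dict(chunk))
            continue
        # Partial JSON argument strings are concatenated; name and id keep
        # the first non-null value seen.
        match["args"] = (match["args"] or "") + (chunk["args"] or "")
        match["name"] = match["name"] or chunk["name"]
        match["id"] = match["id"] or chunk["id"]
    return merged


chunks = merge_tool_call_chunks(
    [{"name": "get_weather", "args": '{"city"', "id": "call_1", "index": 0}],
    [{"name": None, "args": ': "SF"}', "id": None, "index": 0}],
)
print(chunks)  # the two partial args strings accumulate into complete JSON
```

Once the stream ends, the accumulated `args` string can be JSON-parsed to populate `tool_calls` (or `invalid_tool_calls` if it never became valid JSON).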

Packages outside of `partners/`:
- https://github.com/langchain-ai/langchain-cohere/pull/7
- https://github.com/langchain-ai/langchain-google/pull/123/files

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
2024-04-09 18:41:42 -05:00

langchain-groq

Welcome to Groq! 🚀

At Groq, we've developed the world's first Language Processing Unit™, or LPU. The Groq LPU has a deterministic, single-core streaming architecture that sets the standard for GenAI inference speed, with predictable and repeatable performance for any given workload.

Beyond the architecture, our software is designed to empower developers like you with the tools you need to create innovative, powerful AI applications. With Groq as your engine, you can:

  • Achieve uncompromised low latency and performance for real-time AI and HPC inference 🔥
  • Know the exact performance and compute time for any given workload 🔮
  • Take advantage of our cutting-edge technology to stay ahead of the competition 💪

Want more Groq? Check out our website for more resources and join our Discord community to connect with our developers!

Installation and Setup

Install the integration package:

```shell
pip install langchain-groq
```

Request an API key and set it as an environment variable:

```shell
export GROQ_API_KEY=gsk_...
```

Chat Model

See a usage example.
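A minimal usage sketch, assuming the `GROQ_API_KEY` environment variable is set as above (the model name below is an example — check Groq's documentation for currently available models):

```python
from langchain_groq import ChatGroq

# Reads the GROQ_API_KEY environment variable set above.
llm = ChatGroq(model="mixtral-8x7b-32768", temperature=0)
response = llm.invoke("Explain the Groq LPU in one sentence.")
print(response.content)
```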

Development

To develop the langchain-groq package, you'll need to follow these instructions:

Install dev dependencies

```shell
poetry install --with test,test_integration,lint,codespell
```

Build the package

```shell
poetry build
```

Run unit tests

Unit tests live in tests/unit_tests and SHOULD NOT require an internet connection or a valid API KEY. Run unit tests with:

```shell
make tests
```

Run integration tests

Integration tests live in tests/integration_tests and require a connection to the Groq API and a valid API KEY.

```shell
make integration_tests
```

Lint & Format

Run additional tests and linters to ensure your code is up to standard.

```shell
make lint spell_check check_imports
```