# Proxy Fix for Groq Class 🐛 🚀

## Description

This PR fixes a bug in the proxy handling of the `Groq` class, allowing users to connect to LangChain services through a proxy.

## Changes Made

- ✅ Fixed support for specifying proxy settings in the `Groq` class.
- ✅ Resolved the bug causing issues with proxy settings.
- ❌ Did not include unit tests or documentation updates.
- ❌ Did not run `make format`, `make lint`, or `make test` to verify code quality and functionality; I don't program in Python and couldn't get `ruff` to run.
- ❔ Ensured that the changes are backwards compatible.
- ✅ No additional dependencies were added to `pyproject.toml`.

### Error Before Fix

```python
Traceback (most recent call last):
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/main.py", line 9, in <module>
    chat = ChatGroq(
           ^^^^^^^^^
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/venv310/lib/python3.11/site-packages/langchain_core/load/serializable.py", line 120, in __init__
    super().__init__(**kwargs)
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/venv310/lib/python3.11/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for ChatGroq
__root__
  Invalid `http_client` argument; Expected an instance of `httpx.AsyncClient` but got <class 'httpx.Client'> (type=type_error)
```

### Example usage after fix

```python
import os

import httpx
from langchain_core.prompts import ChatPromptTemplate
from langchain_groq import ChatGroq

chat = ChatGroq(
    temperature=0,
    groq_api_key=os.environ.get("GROQ_API_KEY"),
    model_name="mixtral-8x7b-32768",
    http_client=httpx.Client(
        proxies="socks5://127.0.0.1:1080",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
    http_async_client=httpx.AsyncClient(
        proxies="socks5://127.0.0.1:1080",
        # The async client needs the async transport class.
        transport=httpx.AsyncHTTPTransport(local_address="0.0.0.0"),
    ),
)

system = "You are a helpful assistant."
human = "{text}"
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])
chain = prompt | chat
out = chain.invoke({"text": "Explain the importance of low latency LLMs"})
print(out)
```

---------

Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
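For readers who want the shape of the fix without reading the diff: here is a minimal sketch of the separation the error message implies. `GroqClientsSketch` is a hypothetical stand-in, not the actual `ChatGroq` code; it only illustrates the sync and async clients being kept in separate fields and each validated against its own `httpx` type, rather than the sync client being checked against `httpx.AsyncClient` as in the pre-fix behavior.

```python
# Hypothetical sketch (not the actual patch): separate fields for the
# sync and async httpx clients, each validated against its own type.
from typing import Optional

import httpx


class GroqClientsSketch:
    """Illustrative stand-in for ChatGroq's client handling."""

    def __init__(
        self,
        http_client: Optional[httpx.Client] = None,
        http_async_client: Optional[httpx.AsyncClient] = None,
    ) -> None:
        # Validate the sync client against the sync type...
        if http_client is not None and not isinstance(http_client, httpx.Client):
            raise TypeError("http_client must be an httpx.Client")
        # ...and the async client against the async type. Checking the
        # sync client against httpx.AsyncClient is what produced the
        # ValidationError shown above.
        if http_async_client is not None and not isinstance(
            http_async_client, httpx.AsyncClient
        ):
            raise TypeError("http_async_client must be an httpx.AsyncClient")
        self.http_client = http_client
        self.http_async_client = http_async_client
```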