# Jina
This page covers how to use the Jina ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Jina wrappers.
## Installation and Setup
- Install the Python SDK with `pip install jina`
- Get a Jina AI Cloud auth token from [here](https://cloud.jina.ai/settings/tokens) and set it as an environment variable (`JINA_AUTH_TOKEN`)
## Wrappers
### Embeddings
There exists a Jina Embeddings wrapper, which you can access with
```python
from langchain.embeddings import JinaEmbeddings
```
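Below is a minimal usage sketch. The constructor arguments (`jina_auth_token`, `model_name`) and the model identifier are assumptions that may vary between versions; see the notebook linked below for the exact parameters.

```python
import os

from langchain.embeddings import JinaEmbeddings

# Assumed configuration: the argument names and model identifier below are
# illustrative and may differ by version (see the notebook linked below).
embeddings = JinaEmbeddings(
    jina_auth_token=os.environ["JINA_AUTH_TOKEN"],
    model_name="ViT-B-32::openai",
)

# embed_query and embed_documents are the standard LangChain embeddings methods.
query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["LangChain is a framework for LLM-powered apps."])
```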
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/jina.html)
## Deployment
[Langchain-serve](https://github.com/jina-ai/langchain-serve), powered by Jina, helps take LangChain apps to production with easy-to-use REST/WebSocket APIs and Slack bots.
### Usage
Install the package from PyPI.
```bash
pip install langchain-serve
```
Wrap your LangChain app with the `@serving` decorator.
```python
# app.py
from lcserve import serving


@serving
def ask(input: str) -> str:
    from langchain import LLMChain, OpenAI
    from langchain.agents import AgentExecutor, ZeroShotAgent

    tools = [...]  # list of tools
    prompt = ZeroShotAgent.create_prompt(
        tools, input_variables=["input", "agent_scratchpad"],
    )
    llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
    agent = ZeroShotAgent(
        llm_chain=llm_chain, allowed_tools=[tool.name for tool in tools]
    )
    agent_executor = AgentExecutor.from_agent_and_tools(
        agent=agent,
        tools=tools,
        verbose=True,
    )
    return agent_executor.run(input)
```
Deploy on Jina AI Cloud with `lc-serve deploy jcloud app`. Once deployed, you can send a POST request to the API endpoint to get a response.
```bash
curl -X 'POST' 'https://<your-app>.wolf.jina.ai/ask' \
  -d '{
    "input": "Your question here?",
    "envs": {
      "OPENAI_API_KEY": "sk-***"
    }
  }'
```
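The same request can also be sent from Python with `requests`; the sketch below mirrors the curl payload above, with `<your-app>` left as a placeholder for your deployment's hostname.

```python
import requests

# Mirrors the curl example above; replace <your-app> with your deployment's hostname.
response = requests.post(
    "https://<your-app>.wolf.jina.ai/ask",
    json={
        "input": "Your question here?",
        "envs": {"OPENAI_API_KEY": "sk-***"},
    },
)
print(response.text)
```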
You can also self-host the app on your own infrastructure with Docker Compose or Kubernetes. See [here](https://github.com/jina-ai/langchain-serve#-self-host-llm-apps-with-docker-compose-or-kubernetes) for more details.

Langchain-serve also lets you deploy apps with WebSocket APIs and Slack bots, both on [Jina AI Cloud](https://cloud.jina.ai/) and on self-hosted infrastructure.