diff --git a/libs/cli/langchain_cli/integration_template/README.md b/libs/cli/langchain_cli/integration_template/README.md
index e1f3e35247..19c73c4a5b 100644
--- a/libs/cli/langchain_cli/integration_template/README.md
+++ b/libs/cli/langchain_cli/integration_template/README.md
@@ -1 +1,45 @@
 # __package_name__
+
+This package contains the LangChain integration with __ModuleName__
+
+## Installation
+
+```bash
+pip install -U __package_name__
+```
+
+And you should configure credentials by setting the following environment variables:
+
+* TODO: fill this out
+
+## Chat Models
+
+`Chat__ModuleName__` class exposes chat models from __ModuleName__.
+
+```python
+from __module_name__ import Chat__ModuleName__
+
+llm = Chat__ModuleName__()
+llm.invoke("Sing a ballad of LangChain.")
+```
+
+## Embeddings
+
+`__ModuleName__Embeddings` class exposes embeddings from __ModuleName__.
+
+```python
+from __module_name__ import __ModuleName__Embeddings
+
+embeddings = __ModuleName__Embeddings()
+embeddings.embed_query("What is the meaning of life?")
+```
+
+## LLMs
+`__ModuleName__LLM` class exposes LLMs from __ModuleName__.
+
+```python
+from __module_name__ import __ModuleName__LLM
+
+llm = __ModuleName__LLM()
+llm.invoke("The meaning of life is")
+```
diff --git a/libs/partners/google-vertexai/README.md b/libs/partners/google-vertexai/README.md
index 6a4839254f..0637bd7cc9 100644
--- a/libs/partners/google-vertexai/README.md
+++ b/libs/partners/google-vertexai/README.md
@@ -10,7 +10,7 @@ pip install -U langchain-google-vertexai
 
 ## Chat Models
 
-`ChatVertexAI` class exposes models .
+`ChatVertexAI` class exposes models such as `gemini-pro` and `chat-bison`.
 
 To use, you should have Google Cloud project with APIs enabled, and configured credentials. Initialize the model as:
 
@@ -63,7 +63,7 @@ The value of `image_url` can be any of the following:
 
 You can use Google Cloud's embeddings models as:
 
-```
+```python
 from langchain_google_vertexai import VertexAIEmbeddings
 
 embeddings = VertexAIEmbeddings()
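The first `google-vertexai` hunk ends at "Initialize the model as:" without showing the snippet that follows, since that code sits outside the hunk's context window. For reference, a minimal initialization sketch, assuming the `langchain_google_vertexai` package and the `gemini-pro` model name mentioned in the changed line (the exact snippet in the README may differ), could look like:

```python
# Not part of the patch above: a minimal sketch of initializing the chat model,
# assuming langchain-google-vertexai is installed and Google Cloud credentials
# (e.g. via `gcloud auth application-default login`) are configured.
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")  # model name taken from the changed line
llm.invoke("Sing a ballad of LangChain.")
```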