
langchain-google-genai

This package contains the LangChain integrations for Gemini through Google's generative-ai SDK.

Installation

pip install -U langchain-google-genai

Image utilities

To use image utility methods, such as loading images from GCS URLs, install with the 'images' extras group:

pip install -U "langchain-google-genai[images]"

Chat Models

This package contains the ChatGoogleGenerativeAI class, which is the recommended way to interface with the Google Gemini series of models.

To use the chat model, install the package and set your Google API key in the environment:

export GOOGLE_API_KEY=your-api-key
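
Alternatively, the key can be set from within Python before the model is created; a minimal sketch using only the standard library:

import getpass
import os

# Prompt for the key if it is not already present in the environment.
if "GOOGLE_API_KEY" not in os.environ:
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Google API key: ")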

Then initialize the model:

from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")
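
Like other LangChain chat models, ChatGoogleGenerativeAI implements the standard Runnable interface, so streaming and batching work the usual way; a minimal sketch, reusing the llm from above:

# Stream the response chunk by chunk instead of waiting for the full reply.
for chunk in llm.stream("Sing a ballad of LangChain."):
    print(chunk.content, end="", flush=True)

# Run several prompts in a single batch call.
results = llm.batch(["What is Gemini?", "What is LangChain?"])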

Multimodal inputs

The Gemini vision model supports image inputs when they are provided in a single chat message. Example:

from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(model="gemini-pro-vision")
# example
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts
        {"type": "image_url", "image_url": "https://picsum.photos/seed/picsum/200/300"},
    ]
)
llm.invoke([message])

The value of image_url can be any of the following:

  • A public image URL
  • An accessible Google Cloud Storage (GCS) file (e.g., "gcs://path/to/file.png")
  • A local file path
  • A base64 encoded image (e.g., data:image/png;base64,abcd124)
  • A PIL image
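
For instance, a local image can be passed by reading it and building a base64 data URL; a minimal sketch, assuming a local file named photo.jpg and reusing the llm from above:

import base64

from langchain_core.messages import HumanMessage

# Hypothetical local file; replace with an image on your machine.
with open("photo.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": f"data:image/jpeg;base64,{encoded}"},
    ]
)
llm.invoke([message])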

Embeddings

This package also adds support for Google's embedding models.

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
embeddings.embed_query("hello, world!")
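
Embedding several documents at once is also supported through the standard Embeddings interface; a minimal sketch, reusing the embeddings object from above:

# Embed a batch of documents in one call; returns one vector per input text.
vectors = embeddings.embed_documents(
    [
        "LangChain is a framework for building LLM applications.",
        "Gemini is a family of Google models.",
    ]
)
print(len(vectors), len(vectors[0]))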