# GPT4All
This page covers how to use the `GPT4All` wrapper within LangChain. It is divided into two parts: installation and setup, followed by usage with an example.
## Installation and Setup
- Install the Python package with `pip install pyllamacpp`
- Download a [GPT4All model](https://github.com/nomic-ai/pyllamacpp#supported-model) and place it in your desired directory
## Usage
### GPT4All
To use the GPT4All wrapper, you need to provide the path to the pre-trained model file and the model's configuration.
```python
from langchain.llms import GPT4All
# Instantiate the model. Callbacks support token-wise streaming
model = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
# Generate text
response = model("Once upon a time, ")
```
You can also customize the generation parameters, such as `n_predict`, `temp`, `top_p`, `top_k`, and others.
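
For example, these can be set when constructing the wrapper. The values below are a minimal sketch for illustration only; parameter names and defaults may vary between versions, so check the wrapper's signature for your release.

```python
from langchain.llms import GPT4All

# Illustrative values only -- tune for your model and hardware
model = GPT4All(
    model="./models/gpt4all-model.bin",
    n_ctx=512,
    n_threads=8,
    n_predict=256,  # maximum number of new tokens to generate
    temp=0.7,       # sampling temperature
    top_p=0.95,     # nucleus sampling threshold
    top_k=40,       # top-k sampling cutoff
)

response = model("Once upon a time, ")
```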
To stream the model's predictions, pass in a list of callback handlers.
```python
from langchain.llms import GPT4All
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
# There are many CallbackHandlers supported, such as
# from langchain.callbacks.streamlit import StreamlitCallbackHandler
callbacks = [StreamingStdOutCallbackHandler()]
model = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
# Generate text. Tokens are streamed to the callback handlers.
model("Once upon a time, ", callbacks=callbacks)
```
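
Because the wrapper implements LangChain's standard LLM interface, it can also be used with other LangChain components such as chains. Below is a minimal sketch, assuming the `PromptTemplate` and `LLMChain` classes from the same version of LangChain; the prompt and question are placeholders.

```python
from langchain.chains import LLMChain
from langchain.llms import GPT4All
from langchain.prompts import PromptTemplate

# A simple question-answering prompt (illustrative)
template = """Question: {question}

Answer:"""
prompt = PromptTemplate(template=template, input_variables=["question"])

llm = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
chain = LLMChain(prompt=prompt, llm=llm)

print(chain.run("What is the capital of France?"))
```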
## Model File
You can find links to model file downloads in the [pyllamacpp](https://github.com/nomic-ai/pyllamacpp) repository.
For a more detailed walkthrough, see [this notebook](/docs/integrations/llms/gpt4all.html).