Update `gpt4all.mdx` doc (#15392)

The [pyllamacpp](https://github.com/nomic-ai/pyllamacpp) repository has
been archived and the model name and usage need to be changed.

@@ -4,8 +4,15 @@ This page covers how to use the `GPT4All` wrapper within LangChain. The tutorial
## Installation and Setup
- Install the Python package with `pip install pyllamacpp`
- Download a [GPT4All model](https://github.com/nomic-ai/pyllamacpp#supported-model) and place it in your desired directory
- Install the Python package with `pip install gpt4all`
- Download a [GPT4All model](https://gpt4all.io/index.html) and place it in your desired directory
In this example, we are using `mistral-7b-openorca.Q4_0.gguf` (best overall fast chat model):
```bash
mkdir models
wget https://gpt4all.io/models/gguf/mistral-7b-openorca.Q4_0.gguf -O models/mistral-7b-openorca.Q4_0.gguf
```
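If you prefer to fetch the model from Python instead of `wget`, the `gpt4all` package can download it for you. This is a minimal sketch, assuming the package's `GPT4All` constructor accepts a `model_name` plus a `model_path` and downloads missing files by default:

```python
import os

from gpt4all import GPT4All

# Make sure the target directory exists, then let the gpt4all package fetch the
# file if it is not already present (assumes its default allow_download behaviour).
os.makedirs("models", exist_ok=True)
GPT4All("mistral-7b-openorca.Q4_0.gguf", model_path="./models")
```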
## Usage
@@ -17,7 +24,7 @@ To use the GPT4All wrapper, you need to provide the path to the pre-trained mode
from langchain.llms import GPT4All
# Instantiate the model. Callbacks support token-wise streaming
model = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
model = GPT4All(model="./models/mistral-7b-openorca.Q4_0.gguf", n_threads=8)
# Generate text
response = model("Once upon a time, ")
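Because the wrapper behaves like any other LangChain LLM, it can also be dropped into prompts and chains. A minimal sketch, assuming the same local model path as above and the classic `LLMChain` interface:

```python
from langchain.chains import LLMChain
from langchain.llms import GPT4All
from langchain.prompts import PromptTemplate

prompt = PromptTemplate.from_template("Write a one-sentence summary of {topic}.")
model = GPT4All(model="./models/mistral-7b-openorca.Q4_0.gguf", n_threads=8)

# Compose the prompt template with the local model and run the chain.
chain = LLMChain(llm=model, prompt=prompt)
print(chain.run(topic="local LLM inference"))
```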
@@ -35,7 +42,7 @@ from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
# from langchain.callbacks.streamlit import StreamlitCallbackHandler
callbacks = [StreamingStdOutCallbackHandler()]
model = GPT4All(model="./models/gpt4all-model.bin", n_ctx=512, n_threads=8)
model = GPT4All(model="./models/mistral-7b-openorca.Q4_0.gguf", n_threads=8)
# Generate text. Tokens are streamed through the callback manager.
model("Once upon a time, ", callbacks=callbacks)
@@ -43,6 +50,6 @@ model("Once upon a time, ", callbacks=callbacks)
## Model File
You can find links to model file downloads in the [pyllamacpp](https://github.com/nomic-ai/pyllamacpp) repository.
You can find links to model file downloads at [https://gpt4all.io/](https://gpt4all.io/index.html).
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/gpt4all).
