<h1 align="center">GPT4All</h1>

<p align="center">Open-source large language models that run locally on your CPU and nearly any GPU</p>

<p align="center">
<a href="https://gpt4all.io">GPT4All Website and Models</a>
</p>

<p align="center">
<a href="https://docs.gpt4all.io">GPT4All Documentation</a>
</p>

<p align="center">
<a href="https://discord.gg/mGZE39AS3e">Discord</a>
</p>

<p align="center">
<a href="https://python.langchain.com/en/latest/modules/models/llms/integrations/gpt4all.html">🦜️🔗 Official Langchain Backend</a>
</p>

<p align="center">
GPT4All is made possible by our compute partner <a href="https://www.paperspace.com/">Paperspace</a>.
</p>

<p align="center">
<a href="https://www.phorm.ai/query?projectId=755eecd3-24ad-49cc-abf4-0ab84caacf63"><img src="https://img.shields.io/badge/Phorm-Ask_AI-%23F2777A.svg?&logo=data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNSIgaGVpZ2h0PSI0IiBmaWxsPSJub25lIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPgogIDxwYXRoIGQ9Ik00LjQzIDEuODgyYTEuNDQgMS40NCAwIDAgMS0uMDk4LjQyNmMtLjA1LjEyMy0uMTE1LjIzLS4xOTIuMzIyLS4wNzUuMDktLjE2LjE2NS0uMjU1LjIyNmExLjM1MyAxLjM1MyAwIDAgMS0uNTk1LjIxMmMtLjA5OS4wMTItLjE5Mi4wMTQtLjI3OS4wMDZsLTEuNTkzLS4xNHYtLjQwNmgxLjY1OGMuMDkuMDAxLjE3LS4xNjkuMjQ2LS4xOTFhLjYwMy42MDMgMCAwIDAgLjItLjEwNi41MjkuNTI5IDAgMCAwIC4xMzgtLjE3LjY1NC42NTQgMCAwIDAgLjA2NS0uMjRsLjAyOC0uMzJhLjkzLjkzIDAgMCAwLS4wMzYtLjI0OS41NjcuNTY3IDAgMCAwLS4xMDMtLjIuNTAyLjUwMiAwIDAgMC0uMTY4LS4xMzguNjA4LjYwOCAwIDAgMC0uMjQtLjA2N0wyLjQzNy43MjkgMS42MjUuNjcxYS4zMjIuMzIyIDAgMCAwLS4yMzIuMDU4LjM3NS4zNzUgMCAwIDAtLjExNi4yMzJsLS4xMTYgMS40NS0uMDU4LjY5Ny0uMDU4Ljc1NEwuNzA1IDRsLS4zNTctLjA3OUwuNjAyLjkwNkMuNjE3LjcyNi42NjMuNTc0LjczOS40NTRhLjk1OC45NTggMCAwIDEgLjI3NC0uMjg1Ljk3MS45NzEgMCAwIDEgLjMzNy0uMTRjLjExOS0uMDI2LjIyNy0uMDM0LjMyNS0uMDI2TDMuMjMyLjE2Yy4xNTkuMDE0LjMzNi4wMy40NTkuMDgyYTEuMTczIDEuMTczIDAgMCAxIC41NDUuNDQ3Yy4wNi4wOTQuMTA5LjE5Mi4xNDQuMjkzYTEuMzkyIDEuMzkyIDAgMCAxIC4wNzguNThsLS4wMjkuMzJaIiBmaWxsPSIjRjI3NzdBIi8+CiAgPHBhdGggZD0iTTQuMDgyIDIuMDA3YTEuNDU1IDEuNDU1IDAgMCAxLS4wOTguNDI3Yy0uMDUuMTI0LS4xMTQuMjMyLS4xOTIuMzI0YTEuMTMgMS4xMyAwIDAgMS0uMjU0LjIyNyAxLjM1MyAxLjM1MyAwIDAgMS0uNTk1LjIxNGMtLjEuMDEyLS4xOTMuMDE0LS4yOC4wMDZsLTEuNTYtLjEwOC4wMzQtLjQwNi4wMy0uMzQ4IDEuNTU5LjE1NGMuMDkgMCAuMTczLS4wMS4yNDgtLjAzM2EuNjAzLjYwMyAwIDAgMCAuMi0uMTA2LjUzMi41MzIgMCAwIDAgLjEzOS0uMTcyLjY2LjY2IDAgMCAwIC4wNjQtLjI0MWwuMDI5LS4zMjFhLjk0Ljk0IDAgMCAwLS4wMzYtLjI1LjU3LjU3IDAgMCAwLS4xMDMtLjIwMi41MDIuNTAyIDAgMCAwLS4xNjgtLjEzOC42MDUuNjA1IDAgMCAwLS4yNC0uMDY3TDEuMjczLjgyN2MtLjA5NC0uMDA4LS4xNjguMDEtLjIyMS4wNTUtLjA1My4wNDUtLjA4NC4xMTQtLjA5Mi4yMDZMLjcwNSA0IDAgMy45MzhsLjI1NS0yLjkxMUExLjAxIDEuMDEgMCAwIDEgLjM5My41NzIuOTYyLjk2MiAwIDAgMSAuNjY2LjI4NmEuOTcuOTcgMCAwIDEgLjMzOC0uMTRDMS4xMjIuMTIgMS4yMy4xMSAxLjMyOC4xMTlsMS41OTMuMTRjLjE2LjAxNC4zLjA0Ny40MjMuMWExLjE3IDEuMTcgMCAwIDEgLjU0NS40NDhjLjA2MS4wOTUuMTA5LjE5My4xNDQuMjk1YTEuNDA2IDEuNDA2IDAgMCAxIC4wNzcuNTgzbC0uMDI4LjMyMloiIGZpbGw9IndoaXRlIi8+CiAgPHBhdGggZD0iTTQuMDgyIDIuMDA3YTEuNDU1IDEuNDU1IDAgMCAxLS4wOTguNDI3Yy0uMDUuMTI0LS4xMTQuMjMyLS4xOTIuMzI0YTEuMTMgMS4xMyAwIDAgMS0uMjU0LjIyNyAxLjM1MyAxLjM1MyAwIDAgMS0uNTk1LjIxNGMtLjEuMDEyLS4xOTMuMDE0LS4yOC4wMDZsLTEuNTYtLjEwOC4wMzQtLjQwNi4wMy0uMzQ4IDEuNTU5LjE1NGMuMDkgMCAuMTczLS4wMS4yNDgtLjAzM2EuNjAzLjYwMyAwIDAgMCAuMi0uMTA2LjUzMi41MzIgMCAwIDAgLjEzOS0uMTcyLjY2LjY2IDAgMCAwIC4wNjQtLjI0MWwuMDI5LS4zMjFhLjk0Ljk0IDAgMCAwLS4wMzYtLjI1LjU3LjU3IDAgMCAwLS4xMDMtLjIwMi41MDIuNTAyIDAgMCAwLS4xNjgtLjEzOC42MDUuNjA1IDAgMCAwLS4yNC0uMDY3TDEuMjczLjgyN2MtLjA5NC0uMDA4LS4xNjguMDEtLjIyMS4wNTUtLjA1My4wNDUtLjA4NC4xMTQtLjA5Mi4yMDZMLjcwNSA0IDAgMy45MzhsLjI1NS0yLjkxMUExLjAxIDEuMDEgMCAwIDEgLjM5My41NzIuOTYyLjk2MiAwIDAgMSAuNjY2LjI4NmEuOTcuOTcgMCAwIDEgLjMzOC0uMTRDMS4xMjIuMTIgMS4yMy4xMSAxLjMyOC4xMTlsMS41OTMuMTRjLjE2LjAxNC4zLjA0Ny40MjMuMWExLjE3IDEuMTcgMCAwIDEgLjU0NS40NDhjLjA2MS4wOTUuMTA5LjE5My4xNDQuMjk1YTEuNDA2IDEuNDA2IDAgMCAxIC4wNzcuNTgzbC0uMDI4LjMyMloiIGZpbGw9IndoaXRlIi8+Cjwvc3ZnPgo=" alt="phorm.ai"></a>
</p>

<p align="center">
<img width="600" height="365" src="https://user-images.githubusercontent.com/13879686/231876409-e3de1934-93bb-4b4b-9013-b491a969ebbc.gif">
</p>
<p align="center">
Run on an M1 macOS Device (not sped up!)
</p>

## GPT4All: An ecosystem of open-source on-edge large language models

> [!IMPORTANT]
> GPT4All v2.5.0 and newer only supports models in GGUF format (.gguf). Models used with a previous version of GPT4All (.bin extension) will no longer work.

GPT4All is an ecosystem for running **powerful** and **customized** large language models locally on consumer-grade CPUs and any GPU. Note that your CPU needs to support [AVX or AVX2 instructions](https://en.wikipedia.org/wiki/Advanced_Vector_Extensions).

Learn more in the [documentation](https://docs.gpt4all.io).

A GPT4All model is a 3GB - 8GB file that you can download and plug into the GPT4All open-source ecosystem software. **Nomic AI** supports and maintains this software ecosystem to enforce quality and security alongside spearheading the effort to allow any person or enterprise to easily train and deploy their own on-edge large language models.
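
For example, with the official Python bindings (listed under Bindings below), downloading and running a model can look roughly like the following sketch; the model filename is just an illustrative example from the model gallery:

```python
# Minimal sketch using the official Python bindings (pip install gpt4all).
# The model filename below is an example; if the file is not already present
# locally, the bindings download it on first use.
from gpt4all import GPT4All

model = GPT4All("mistral-7b-openorca.Q4_0.gguf")
print(model.generate("Explain what an on-edge language model is.", max_tokens=200))
```
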
### What's New ([Issue Tracker](https://github.com/orgs/nomic-ai/projects/2))

- **October 19th, 2023**: GGUF support launches with:
  - Mistral 7B base model, an updated model gallery on [gpt4all.io](https://gpt4all.io), and several new local code models, including Rift Coder v1.5
  - [Nomic Vulkan](https://blog.nomic.ai/posts/gpt4all-gpu-inference-with-vulkan) support for Q4_0 and Q6 quantizations in GGUF
  - Offline build support for running old versions of the GPT4All Local LLM Chat Client
- **September 18th, 2023**: [Nomic Vulkan](https://blog.nomic.ai/posts/gpt4all-gpu-inference-with-vulkan) launches, supporting local LLM inference on AMD, Intel, Samsung, Qualcomm, and NVIDIA GPUs.
- **August 15th, 2023**: The GPT4All API launches, allowing inference of local LLMs from Docker containers.
- **July 2023**: Stable support for LocalDocs, a GPT4All plugin that lets you privately and locally chat with your data.

### Chat Client
Run any GPT4All model natively on your home desktop with the auto-updating desktop chat client. See the <a href="https://gpt4all.io">GPT4All Website</a> for a full list of open-source models you can run with this powerful desktop application.

Direct Installer Links:

* [macOS](https://gpt4all.io/installers/gpt4all-installer-darwin.dmg)
* [Windows](https://gpt4all.io/installers/gpt4all-installer-win64.exe)
* [Ubuntu](https://gpt4all.io/installers/gpt4all-installer-linux.run)

Find the most up-to-date information on the [GPT4All Website](https://gpt4all.io/).

### Chat Client building and running
* Follow the visual instructions on the chat client [build_and_run](gpt4all-chat/build_and_run.md) page

### Bindings

* <a href="https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/python/README.md">:snake: Official Python Bindings</a> [![Downloads](https://static.pepy.tech/badge/gpt4all/week)](https://pepy.tech/project/gpt4all) (usage sketch after this list)
* <a href="https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/typescript">:computer: Official TypeScript Bindings</a>
* <a href="https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/golang">:computer: Official GoLang Bindings</a>
* <a href="https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/csharp">:computer: Official C# Bindings</a>
* <a href="https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/java">:computer: Official Java Bindings</a>
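
As a quick illustration of the Python bindings, here is a rough sketch of a multi-turn chat session; the model filename and the `device="gpu"` (Vulkan) argument are illustrative and may need adjusting for your hardware:

```python
# Rough sketch of the Python bindings' chat-session usage (pip install gpt4all).
# The model filename is an example; device="gpu" requests Vulkan acceleration
# and may need to be removed on machines without a supported GPU.
from gpt4all import GPT4All

model = GPT4All("mistral-7b-openorca.Q4_0.gguf", device="gpu")
with model.chat_session():  # keeps conversation history between calls
    print(model.generate("Summarize what GPT4All is in one sentence.", max_tokens=128))
    print(model.generate("Which operating systems does it support?", max_tokens=128))
```

Outside a `chat_session()` block, each `generate()` call is stateless.
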
### Integrations
* 🗃️ [Weaviate Vector Database](https://github.com/weaviate/weaviate) - [module docs](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-gpt4all) (sketch below)
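
As a loose illustration of the Weaviate integration, the sketch below defines a class vectorized by the `text2vec-gpt4all` module using the v3 `weaviate-client` API; the class name, property, and endpoint are hypothetical examples, and the module docs linked above are authoritative.

```python
# Loose sketch: a Weaviate class vectorized locally with text2vec-gpt4all.
# Assumes a local Weaviate instance with the module enabled; the class name,
# property, and endpoint are hypothetical and not part of GPT4All itself.
import weaviate

client = weaviate.Client("http://localhost:8080")

client.schema.create_class({
    "class": "Document",               # hypothetical class name
    "vectorizer": "text2vec-gpt4all",  # embeddings computed locally on CPU
    "properties": [{"name": "content", "dataType": ["text"]}],
})

client.data_object.create({"content": "GPT4All runs LLMs locally."}, "Document")
```
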
## Contributing
GPT4All welcomes contributions, involvement, and discussion from the open-source community!
Please see CONTRIBUTING.md and follow the issue, bug report, and PR markdown templates.

Check the project Discord, with the project owners, or through existing issues/PRs to avoid duplicate work.
Please make sure to tag all of the above with relevant project identifiers, or your contribution could potentially get lost.
Example tags: `backend`, `bindings`, `python-bindings`, `documentation`, etc.

## Technical Reports
<p align="center">
<a href="https://gpt4all.io/reports/GPT4All_Technical_Report_3.pdf">:green_book: Technical Report 3: GPT4All Snoozy and Groovy</a>
</p>
<p align="center">
<a href="https://static.nomic.ai/gpt4all/2023_GPT4All-J_Technical_Report_2.pdf">:green_book: Technical Report 2: GPT4All-J</a>
</p>
<p align="center">
<a href="https://s3.amazonaws.com/static.nomic.ai/gpt4all/2023_GPT4All_Technical_Report.pdf">:green_book: Technical Report 1: GPT4All</a>
</p>

## Citation

If you utilize this repository, models, or data in a downstream project, please consider citing it with:

```
@misc{gpt4all,
  author = {Yuvanesh Anand and Zach Nussbaum and Brandon Duderstadt and Benjamin Schmidt and Andriy Mulyar},
  title = {GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/nomic-ai/gpt4all}},
}
```