gpt4all-chat

Cross-platform Qt-based GUI for GPT4All versions with GPT-J as the base model. NOTE: The model seen in the screenshot is actually a preview of a new training run for GPT4All based on GPT-J. The GPT4All project is busy at work training a new version with GPT-J as the base, but it isn't ready just yet. In the meantime, you can try this UI out with the original GPT-J model. Full instructions for doing so are included below.

[screenshot of the chat UI]

Features

  • Cross-platform (Linux, Windows, macOS, iOS, Android, Embedded Linux, QNX)
  • Fast CPU-based inference using ggml for GPT-J based models
  • A UI that looks and feels like the chat interfaces you've come to expect from ChatGPT-style assistants
  • Easy to install: the plan is to ship precompiled binaries for major platforms, with an installer that includes the model
  • WORK IN PROGRESS!!

Building and running

git clone --recurse-submodules https://github.com/manyoso/gpt4all-chat.git
cd gpt4all-chat
mkdir build
cd build
cmake ..
cmake --build . --parallel
# Convert a local Hugging Face copy of GPT-J-6B to the ggml f32 format (the trailing 0 selects f32 output)
python3 ../ggml/examples/gpt-j/convert-h5-to-ggml.py /path/to/your/local/copy/of/EleutherAI/gpt-j-6B 0
# Quantize the f32 model to 4 bits (the trailing 2 selects the q4_0 type) so it fits in memory for CPU inference
./bin/gpt-j-quantize /path/to/your/local/copy/of/EleutherAI/gpt-j-6B/ggml-model-f32.bin ./ggml-model-q4_0.bin 2
./chat
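The quantize step above is what makes fast CPU inference practical: it replaces each 32-bit float weight with a 4-bit integer plus a shared per-block scale. The following is a simplified C++ sketch of that idea, not ggml's actual q4_0 on-disk layout (which packs two 4-bit values per byte and differs in detail); the struct and function names here are illustrative only.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>

// Illustrative 4-bit block quantization, loosely in the spirit of ggml's
// q4_0. Each block of 32 floats is stored as one float scale plus 32
// small integers in [-8, 7]. (Real ggml packs two nibbles per byte.)
struct BlockQ4 {
    float scale;               // per-block scale factor
    std::array<int8_t, 32> q;  // quantized values in [-8, 7]
};

BlockQ4 quantize_block(const float* x) {
    float max_abs = 0.0f;
    for (int i = 0; i < 32; ++i)
        max_abs = std::max(max_abs, std::fabs(x[i]));

    BlockQ4 b;
    b.scale = max_abs / 7.0f;  // map the largest magnitude to +/-7
    for (int i = 0; i < 32; ++i) {
        int v = b.scale > 0.0f ? (int)std::round(x[i] / b.scale) : 0;
        b.q[i] = (int8_t)std::max(-8, std::min(7, v));  // clamp to 4-bit range
    }
    return b;
}

void dequantize_block(const BlockQ4& b, float* out) {
    for (int i = 0; i < 32; ++i)
        out[i] = b.q[i] * b.scale;  // reconstruct approximate weights
}
```

Storing one scale per 32-value block keeps the rounding error bounded by half a quantization step within each block, which is why a 6B-parameter model can drop to roughly a quarter of its f32 size with modest quality loss.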

Contributing

  • Pull requests welcome :)