Adam Treat
9c85a2ceb2
Add multi-line prompt support.
2023-04-20 08:31:33 -04:00
Adam Treat
963ef4b617
Pin the llama.cpp to a slightly older version.
2023-04-20 07:34:15 -04:00
Adam Treat
795715fb59
Don't crash starting with no model.
2023-04-20 07:17:07 -04:00
Adam Treat
338b5ca703
Don't use versions for model downloader.
2023-04-20 06:48:13 -04:00
eachadea
b36e235112
Don't build a universal binary unless -DBUILD_UNIVERSAL=ON
2023-04-20 06:37:54 -04:00
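The commit above gates universal builds behind a CMake option. A minimal sketch of how such an invocation might look; only the option name comes from the commit message, the build-tree layout is an assumption:

```shell
# Default after this commit: build only for the host architecture.
cmake -B build

# Opt in to a macOS universal (x86_64 + arm64) binary via the flag
# named in the commit message.
cmake -B build -DBUILD_UNIVERSAL=ON
```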
Adam Treat
71b308e914
Add llama.cpp support for loading llama based models in the gui. We now support loading both gptj derived models and llama derived models.
2023-04-20 06:19:09 -04:00
Aaron Miller
00cb5fe2a5
Add thread count setting
2023-04-19 08:33:13 -04:00
Adam Treat
169afbdc80
Add a new model download feature.
2023-04-18 21:10:06 -04:00
Adam Treat
2b1cae5a7e
Allow unloading/loading/changing of models.
2023-04-18 11:42:38 -04:00
Aaron Miller
8a4f7897f4
remove fill color for prompt template box
2023-04-18 08:47:37 -04:00
Adam Treat
708bc1bbe6
Fix link color to have consistency across platforms.
2023-04-18 08:45:21 -04:00
Adam Treat
e540edcb1e
Make the gui accessible.
2023-04-18 08:40:04 -04:00
Pavol Rusnak
da0912d531
readme: GPL -> MIT license
2023-04-17 16:45:29 -04:00
Adam Treat
99d4a7f573
Changing to MIT license.
2023-04-17 16:37:50 -04:00
Adam Treat
74757d4d1b
Don't add version number to the installer or the install location.
2023-04-17 15:59:14 -04:00
Adam Treat
cf27c3b1a7
Bump the version for the context bug fix.
2023-04-17 15:37:24 -04:00
Adam Treat
f73fbf28a4
Fix the context.
2023-04-17 14:11:41 -04:00
Adam Treat
b0ce635338
Set a new default temp that is more conservative.
2023-04-17 09:49:59 -04:00
AT
0ad1e21f78
Update README.md
2023-04-17 09:02:26 -04:00
Adam Treat
90e1eb0c37
Update submodule.
2023-04-17 08:04:40 -04:00
Adam Treat
29990fd27b
Bump version to 2.1 as this has been referred to far and wide as GPT4All v2 so doing this to decrease confusion. Also, making the version number visible in the title bar.
2023-04-17 07:50:39 -04:00
Adam Treat
c924acffa7
Update the bundled model name.
2023-04-16 22:10:26 -04:00
Adam Treat
cbf44d61b0
New version.
2023-04-16 19:20:43 -04:00
Adam Treat
a7c2d65824
Don't allow empty prompts; ensure context past is always greater than or equal to zero.
2023-04-16 14:57:58 -04:00
Adam Treat
4bf4b2a080
Trim trailing whitespace at the end of generation.
2023-04-16 14:19:59 -04:00
Adam Treat
9381a69b2b
Remove newlines too.
2023-04-16 14:04:25 -04:00
Adam Treat
b39acea516
More conservative default params and trim leading whitespace from response.
2023-04-16 13:56:56 -04:00
TheBloke
f63a2df715
Change the example CLI prompt to something more appropriate, as this is not a Llama model! :)
2023-04-16 12:52:23 -04:00
TheBloke
a03bac57bf
Fix repo name
2023-04-16 12:52:23 -04:00
TheBloke
032bc11d36
Update README to include instructions for building CLI only
Users may want to play around with gpt4all-j from the command line, but they may not have Qt, might not want to get it, or may find it very hard to do so, e.g. when using a Google Colab or similar hosted service.
It's easy to build the CLI tools just by building the `ggml` sub-folder, so this commit adds instructions on doing that, including an example invocation of the `gpt-j` binary.
2023-04-16 12:52:23 -04:00
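A rough sketch of the CLI-only build the commit describes; the repository URL comes from the log, while the directory layout, build commands, model filename, and `gpt-j` flags are assumptions for illustration:

```shell
# Clone the main repo (URL per the log) together with its submodules.
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all-chat
cd gpt4all-chat/ggml   # build only the ggml sub-folder, skipping Qt

cmake -B build && cmake --build build --parallel

# Hypothetical example invocation of the resulting gpt-j binary;
# the binary path, model path, and flags are placeholders.
./build/bin/gpt-j -m ./ggml-gpt4all-j.bin -p "What is a language model?"
```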
TheBloke
e85f5ee9b8
Update .gitignore to ignore a local build directory.
2023-04-16 12:52:23 -04:00
TheBloke
7c3f8c2168
Update git clone command in README to point to main nomic repo
I'm not sure whether it was intentional that the build instructions tell the user to clone `manyoso/gpt4all-chat.git`, but I would think this should clone the main repo at `nomic-ai/gpt4all-chat` instead.
Otherwise users following this command might get changes not yet merged into the main repo, which could be confusing.
2023-04-16 12:52:23 -04:00
AT
33e9f350e5
Update README.md
2023-04-16 11:53:02 -04:00
Adam Treat
ef789b354b
Rearrange the buttons and provide a message what the copy button does.
2023-04-16 11:44:55 -04:00
Adam Treat
f19abd6a18
Check for ###Prompt: or ###Response: and stop generating; also modify the default template slightly.
2023-04-16 11:25:48 -04:00
Aaron Miller
7ec47c659b
add tooltips to settings dialog
2023-04-16 11:16:30 -04:00
Aaron Miller
8204e7d047
add "restore defaults" button
2023-04-16 11:16:30 -04:00
Aaron Miller
5bfb3f8229
use the settings dialog settings when generating
2023-04-16 11:16:30 -04:00
Aaron Miller
098ab803a0
add settings dialog
2023-04-16 11:16:30 -04:00
Aaron Miller
f89a1f6ef5
add settings icon
2023-04-16 11:16:30 -04:00
Adam Treat
a77946e745
Provide an instruct/chat template.
2023-04-15 16:33:37 -04:00
Aaron Miller
391904efae
Use completeBaseName to display model name
this cuts the filename at the *final* dot instead of the first, allowing
model names with version numbers to be displayed correctly.
2023-04-15 13:29:51 -04:00
Adam Treat
078b755ab8
Erase the correct amount of logits when regenerating, which is not the same as the number of tokens.
2023-04-15 09:19:54 -04:00
Adam Treat
b1bb9866ab
Fix crash with recent change to erase context.
2023-04-15 09:10:34 -04:00
Adam Treat
1c5dd6710d
When regenerating erase the previous response and prompt from the context.
2023-04-15 09:10:27 -04:00
AT
509f377f5c
Merge pull request #28 from TheBloke/macOS_Universal
Add support for building a Universal binary on macOS
2023-04-14 14:06:47 -04:00
TheBloke
65c3315bda
Remove Qt dir
2023-04-14 17:33:54 +01:00
TheBloke
35ed43b2c2
Remove test debug lines
2023-04-14 17:28:44 +01:00
TheBloke
9fbaeaaeb6
Add support for building a Universal binary on macOS
2023-04-14 17:19:03 +01:00
Zach Nussbaum
1a372a11c2
Update README.md
2023-04-14 06:17:02 -07:00