cebtenzzre
ac33bafb91
docs: improve build_and_run.md (#1524)
2023-10-18 11:37:28 -04:00
Aaron Miller
10f9b49313
update mini-orca 3b to gguf2, license
Signed-off-by: Aaron Miller <apage43@ninjawhale.com>
2023-10-12 14:57:07 -04:00
niansa/tuxifan
a35f1ab784
Updated chat wishlist (#1351)
2023-10-12 14:01:44 -04:00
Adam Treat
908aec27fe
Always save chats to disk, but save them as text by default. This also changes
the UI behavior to always open a 'New Chat' and set it as current instead
of setting a restored chat as current. This improves usability by not requiring
the user to wait if they want to immediately start chatting.
2023-10-12 07:52:11 -04:00
cebtenzzre
04499d1c7d
chatllm: do not write uninitialized data to stream (#1486)
2023-10-11 11:31:34 -04:00
Adam Treat
f0742c22f4
Restore state from text if necessary.
2023-10-11 09:16:02 -04:00
Adam Treat
35f9cdb70a
Do not delete saved chats if we fail to serialize properly.
2023-10-11 09:16:02 -04:00
cebtenzzre
9fb135e020
cmake: install the GPT-J plugin (#1487)
2023-10-10 15:50:03 -04:00
Aaron Miller
3c25d81759
make codespell happy
2023-10-10 12:00:06 -04:00
Jan Philipp Harries
4f0cee9330
added EM German Mistral Model
2023-10-10 11:44:43 -04:00
Adam Treat
56c0d2898d
Update the language here to avoid misunderstanding.
2023-10-06 14:38:42 -04:00
Adam Treat
b2cd3bdb3f
Fix crasher with an empty string for prompt template.
2023-10-06 12:44:53 -04:00
Cebtenzzre
5fe685427a
chat: clearer CPU fallback messages
2023-10-06 11:35:14 -04:00
Aaron Miller
9325075f80
fix stray comma in models2.json
Signed-off-by: Aaron Miller <apage43@ninjawhale.com>
2023-10-05 18:32:23 -04:00
Adam Treat
f028f67c68
Add starcoder, rift and sbert to our models2.json.
2023-10-05 18:16:19 -04:00
Adam Treat
4528f73479
Reorder and refresh our models2.json.
2023-10-05 18:16:19 -04:00
Cebtenzzre
1534df3e9f
backend: do not use Vulkan with non-LLaMA models
2023-10-05 18:16:19 -04:00
Cebtenzzre
672cb850f9
differentiate between init failure and unsupported models
2023-10-05 18:16:19 -04:00
Cebtenzzre
a5b93cf095
more accurate fallback descriptions
2023-10-05 18:16:19 -04:00
Cebtenzzre
75deee9adb
chat: make sure to clear fallback reason on success
2023-10-05 18:16:19 -04:00
Cebtenzzre
2eb83b9f2a
chat: report reason for fallback to CPU
2023-10-05 18:16:19 -04:00
Adam Treat
ea66669cef
Switch to new models2.json for new gguf release and bump our version to 2.5.0.
2023-10-05 18:16:19 -04:00
Adam Treat
12f943e966
Fix regenerate button to be deterministic and bump the llama version to the latest we have for gguf.
2023-10-05 18:16:19 -04:00
Cebtenzzre
a49a1dcdf4
chatllm: grammar fix
2023-10-05 18:16:19 -04:00
Cebtenzzre
31b20f093a
modellist: fix the system prompt
2023-10-05 18:16:19 -04:00
Cebtenzzre
8f3abb37ca
fix references to removed model types
2023-10-05 18:16:19 -04:00
Adam Treat
d90d003a1d
Latest rebase on llama.cpp with gguf support.
2023-10-05 18:16:19 -04:00
Akarshan Biswas
5f3d739205
appdata: update software description
2023-10-05 10:12:43 -04:00
Akarshan Biswas
b4cf12e1bd
Update to 2.4.19
2023-10-05 10:12:43 -04:00
Akarshan Biswas
21a5709b07
Remove unnecessary stuff from manifest
2023-10-05 10:12:43 -04:00
Akarshan Biswas
4426640f44
Add flatpak manifest
2023-10-05 10:12:43 -04:00
Aaron Miller
6711bddc4c
launch browser instead of maintenancetool from offline builds
2023-09-27 11:24:21 -07:00
Aaron Miller
7f979c8258
Build offline installers in CircleCI
2023-09-27 11:24:21 -07:00
Adam Treat
dc80d1e578
Fix up the offline installer.
2023-09-18 16:21:50 -04:00
Adam Treat
f47e698193
Release notes for v2.4.19 and bump the version.
2023-09-16 12:35:08 -04:00
Adam Treat
ecf014f03b
Release notes for v2.4.18 and bump the version.
2023-09-16 10:21:50 -04:00
Adam Treat
e6e724d2dc
Actually bump the version.
2023-09-16 10:07:20 -04:00
Adam Treat
06a833e652
Send actual and requested device info for those who have opted in.
2023-09-16 09:42:22 -04:00
Adam Treat
045f6e6cdc
Link against ggml in bin so we can get the available devices without loading a model.
2023-09-15 14:45:25 -04:00
Adam Treat
655372dbfa
Release notes for v2.4.17 and bump the version.
2023-09-14 17:11:04 -04:00
Adam Treat
aa33419c6e
Fall back to CPU more robustly.
2023-09-14 16:53:11 -04:00
Adam Treat
79843c269e
Release notes for v2.4.16 and bump the version.
2023-09-14 11:24:25 -04:00
Adam Treat
3076e0bf26
Only show GPU when we're actually using it.
2023-09-14 09:59:19 -04:00
Adam Treat
1fa67a585c
Report the actual device we're using.
2023-09-14 08:25:37 -04:00
Adam Treat
21a3244645
Fix a bug where we're not properly falling back to CPU.
2023-09-13 19:30:27 -04:00
Adam Treat
0458c9b4e6
Add version 2.4.15 and bump the version number.
2023-09-13 17:55:50 -04:00
Aaron Miller
6f038c136b
init at most one vulkan device, submodule update
fixes issues w/ multiple of the same gpu
2023-09-13 12:49:53 -07:00
Adam Treat
86e862df7e
Fix up the name and formatting.
2023-09-13 15:48:55 -04:00
Adam Treat
358ff2a477
Show the device we're currently using.
2023-09-13 15:24:33 -04:00
Adam Treat
891ddafc33
When device is Auto (the default) we will only consider discrete GPUs, otherwise fall back to CPU.
2023-09-13 11:59:36 -04:00