Adam Treat
27981c0d21
Fix broken download/remove/install.
2023-07-05 20:12:37 -04:00
Adam Treat
eab92a9d73
Fix typo and add new show references setting to localdocs.
2023-07-05 19:41:23 -04:00
Adam Treat
0638b45b47
Per model prompts / templates.
2023-07-05 16:30:41 -04:00
Adam Treat
1491c9fe49
Fix build on Windows.
2023-07-05 15:51:42 -04:00
Adam Treat
6d9cdf228c
Huge change that completely revamps the settings dialog and implements
...
per model settings as well as the ability to clone a model into a "character."
This also implements system prompts as well as quite a few bugfixes; for
instance, this fixes chatgpt.
2023-07-05 15:51:42 -04:00
Adam Treat
2a6c673c25
Begin redesign of settings dialog.
2023-07-05 15:51:42 -04:00
Adam Treat
dedb0025be
Refactor the settings dialog so that it uses a set of components/abstractions
...
for all of the tabs and stacks.
2023-07-05 15:51:42 -04:00
Lakshay Kansal
b3c29e4179
implemented support for bash and go highlighting rules (#1138)
...
* implemented support for bash and go
* add more commands to bash
* gave precedence to variables over strings in bash
2023-07-05 11:04:13 -04:00
matthew-gill
fd4081aed8
Update codeblock font
2023-07-05 09:44:25 -04:00
Lakshay Kansal
70cbff70cc
Created highlighting rules for Java using regex for the gpt4all chat interface.
2023-06-29 13:11:37 -03:00
Adam Treat
1cd734efdc
Provide an abstraction to break up the settings dialog into manageable pieces.
2023-06-29 09:59:54 -04:00
Adam Treat
7f252b4970
This completes the work of consolidating all settings that can be changed by the user on the new settings object.
2023-06-29 00:44:48 -03:00
Adam Treat
285aa50b60
Consolidate generation and application settings on the new settings object.
2023-06-28 20:36:43 -03:00
Adam Treat
7f66c28649
Use the new settings for response generation.
2023-06-28 20:11:24 -03:00
Adam Treat
a8baa4da52
The sync for save should be after.
2023-06-28 20:11:24 -03:00
Adam Treat
705b480d72
Start moving toward a single authoritative class for all settings. This
...
is necessary to get rid of technical debt before we drastically increase
the complexity of settings by adding per model settings and mirostat and
other fun things. Right now the settings are divided between QML and C++,
with some convenience methods for settings sync and so on living in other
singletons. This change consolidates all of the settings logic into a single
class with a single API for both C++ and QML (see the sketch below).
2023-06-28 20:11:24 -03:00
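A minimal sketch of what a single authoritative settings class like this could look like in Qt: a QObject singleton backed by QSettings and exposed to QML, so both C++ and QML bindings go through the same property API. The class and property names (MySettings, temperature) and the storage keys are illustrative assumptions, not the project's actual API.

```cpp
#include <QObject>
#include <QSettings>
#include <QtQml/qqml.h>

class MySettings : public QObject {
    Q_OBJECT
    // QML binds to the same property the C++ code reads and writes.
    Q_PROPERTY(double temperature READ temperature WRITE setTemperature NOTIFY temperatureChanged)
public:
    static MySettings *instance() { static MySettings s; return &s; }

    double temperature() const {
        return m_settings.value("generation/temperature", 0.7).toDouble();
    }
    void setTemperature(double t) {
        if (qFuzzyCompare(temperature(), t))
            return;
        m_settings.setValue("generation/temperature", t);
        m_settings.sync();          // persist immediately
        emit temperatureChanged();  // notifies QML bindings and C++ listeners alike
    }

signals:
    void temperatureChanged();

private:
    MySettings() : m_settings("nomic.ai", "GPT4All") {}
    QSettings m_settings;           // the single place settings are stored
};

// At startup, hand the very same instance to the QML engine:
//   qmlRegisterSingletonInstance("mysettings", 1, 0, "MySettings", MySettings::instance());
```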
Adam Treat
e70899a26c
Make the retrieval/parsing of models.json synchronous on startup. We were jumping through too many hoops to mitigate the async behavior.
2023-06-28 12:32:22 -03:00
Adam Treat
9560336490
Match on the filename too for server mode.
2023-06-28 09:20:05 -04:00
Adam Treat
58cd346686
Bump release again and new release notes.
2023-06-27 18:01:23 -04:00
Adam Treat
0f8f364d76
Fix mac again for falcon.
2023-06-27 17:20:40 -04:00
Adam Treat
8aae4e52b3
Fix for falcon on mac.
2023-06-27 17:13:13 -04:00
Adam Treat
9375c71aa7
New release notes for 2.4.9 and bump version.
2023-06-27 17:01:49 -04:00
Adam Treat
71449bbc4b
Fix this correctly?
2023-06-27 16:01:11 -04:00
Adam Treat
07a5405618
Make it clear this is our finetune.
2023-06-27 15:33:38 -04:00
Adam Treat
189ac82277
Fix server mode.
2023-06-27 15:01:16 -04:00
Adam Treat
b56cc61ca2
Don't allow setting an invalid prompt template.
2023-06-27 14:52:44 -04:00
Adam Treat
0780393d00
Don't use local.
2023-06-27 14:13:42 -04:00
Adam Treat
924efd9e25
Add falcon to our models.json
2023-06-27 13:56:16 -04:00
Adam Treat
d3b8234106
Fix spelling.
2023-06-27 14:23:56 -03:00
Adam Treat
42c0a6673a
Don't persist the force metal setting.
2023-06-27 14:23:56 -03:00
Adam Treat
267601d670
Enable the force metal setting.
2023-06-27 14:23:56 -03:00
Aaron Miller
e22dd164d8
add falcon to chatllm::serialize
2023-06-27 14:06:39 -03:00
Aaron Miller
198b5e4832
add Falcon 7B model
...
Tested with https://huggingface.co/TheBloke/falcon-7b-instruct-GGML/blob/main/falcon7b-instruct.ggmlv3.q4_0.bin
2023-06-27 14:06:39 -03:00
Adam Treat
985d3bbfa4
Add Orca models to list.
2023-06-27 09:38:43 -04:00
Adam Treat
8558fb4297
Fix models.json for strings spanning multiple lines.
2023-06-26 21:35:56 -04:00
Adam Treat
c24ad02a6a
Wait just a bit to set the model name so that we can display the proper name instead of filename.
2023-06-26 21:00:09 -04:00
Adam Treat
57fa8644d6
Make spelling check happy.
2023-06-26 17:56:56 -04:00
Adam Treat
d0a3e82ffc
Restore feature I accidentally erased in modellist update.
2023-06-26 17:50:45 -04:00
Aaron Miller
b19a3e5b2c
add requiredMem method to llmodel impls
...
Most of these can just shortcut out of the model loading logic. llama is a bit worse to deal with because we submodule it, so I have to at least parse the hparams; then I just use the size on disk as an estimate for the memory size (which seems reasonable since we mmap() the llama files anyway). A rough sketch of the estimate follows below.
2023-06-26 18:27:58 -03:00
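For illustration, a hedged sketch of a file-size-based memory estimate of the kind described above, in plain C++17; the headroom constant is an assumption for the sketch, not a value taken from the project, and the real llmodel implementations differ per backend.

```cpp
#include <cstdint>
#include <filesystem>
#include <string>
#include <system_error>

// Estimate required memory from the model file's size on disk, which is a
// reasonable proxy when the file is mmap()'d, as the commit above notes.
int64_t estimateRequiredMem(const std::string &modelPath) {
    std::error_code ec;
    const auto fileSize = std::filesystem::file_size(modelPath, ec);
    if (ec)
        return 0;  // size unknown; caller can fall back to attempting the load

    // Assumed headroom for KV cache and scratch buffers (512 MiB).
    constexpr int64_t kHeadroom = 512ll * 1024 * 1024;
    return static_cast<int64_t>(fileSize) + kHeadroom;
}
```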
Adam Treat
dead954134
Fix save chats setting.
2023-06-26 16:43:37 -04:00
Adam Treat
26c9193227
Sigh. Windows.
2023-06-26 16:34:35 -04:00
Adam Treat
5deec2afe1
Change this back now that it is ready.
2023-06-26 16:21:09 -04:00
Adam Treat
676248fe8f
Update the language.
2023-06-26 14:14:49 -04:00
Adam Treat
ef92492d8c
Add better warnings and links.
2023-06-26 14:14:49 -04:00
Adam Treat
71c972f8fa
Provide a starker warning for localdocs and make the dialogs larger.
2023-06-26 14:14:49 -04:00
Adam Treat
1b5aa4617f
Enable the add button always, but show an error in placeholder text.
2023-06-26 14:14:49 -04:00
Adam Treat
a0f80453e5
Use sysinfo in backend.
2023-06-26 14:14:49 -04:00
Adam Treat
5e520bb775
Fix so that models are searched in subdirectories.
2023-06-26 14:14:49 -04:00
Adam Treat
64e98b8ea9
Fix bug with model loading on initial load.
2023-06-26 14:14:49 -04:00
Adam Treat
3ca9e8692c
Don't try to load incomplete files.
2023-06-26 14:14:49 -04:00
Adam Treat
27f25d5878
Get rid of recursive mutex.
2023-06-26 14:14:49 -04:00
Adam Treat
7f01b153b3
Modellist temp
2023-06-26 14:14:46 -04:00
Adam Treat
c1794597a7
Revert "Enable Wayland in build"
...
This reverts commit d686a583f9.
2023-06-26 14:10:27 -04:00
Akarshan Biswas
d686a583f9
Enable Wayland in build
...
The patch includes support for running natively on a Linux Wayland display server/compositor, which is the successor to the old Xorg.
CMakeLists was missing WaylandClient, so it was added back.
Will fix #1047.
Signed-off-by: Akarshan Biswas <akarshan.biswas@gmail.com>
2023-06-26 14:58:23 -03:00
AMOGUS
3417a37c54
Change "web server" to "API server" for less confusion ( #1039 )
...
* Change "Web server" to "API server"
* Changed "API server" to "OpenAPI server"
* Reverted back to "API server" and updated tooltip
2023-06-23 16:28:52 -04:00
cosmic-snow
a423075403
Allow Cross-Origin Resource Sharing (CORS) (#1008)
2023-06-22 09:19:49 -07:00
Martin Mauch
af28173a25
Parse Org Mode files (#1038)
2023-06-22 09:09:39 -07:00
niansa/tuxifan
01acb8d250
Update download speed less often
...
So as not to show every little network spike to the user
Signed-off-by: niansa/tuxifan <tuxifan@posteo.de>
2023-06-22 09:29:15 +02:00
Adam Treat
09ae04cee9
This needs to work even when localdocs and codeblocks are detected.
2023-06-20 19:07:02 -04:00
Adam Treat
ce7333029f
Make the copy button a little more tolerant.
2023-06-20 18:59:08 -04:00
Adam Treat
508993de75
Exit early when no chats are saved.
2023-06-20 18:30:17 -04:00
Adam Treat
85bc861835
Fix the alignment.
2023-06-20 17:40:02 -04:00
Adam Treat
eebfe642c4
Add an error message to download dialog if models.json can't be retrieved.
2023-06-20 17:31:36 -04:00
Adam Treat
968868415e
Move saving chats to a thread and display what we're doing to the user.
2023-06-20 17:18:33 -04:00
Adam Treat
c8a590bc6f
Get rid of last blocking operations and make the chat/llm thread safe.
2023-06-20 18:18:10 -03:00
Adam Treat
84ec4311e9
Remove duplicated state tracking for chatgpt.
2023-06-20 18:18:10 -03:00
Adam Treat
7d2ce06029
Start working on more thread safety and model load error handling.
2023-06-20 14:39:22 -03:00
Adam Treat
d5f56d3308
Forgot to add a signal handler.
2023-06-20 14:39:22 -03:00
Adam Treat
aa2c824258
Initialize these.
2023-06-19 15:38:01 -07:00
Adam Treat
d018b4c821
Make this atomic.
2023-06-19 15:38:01 -07:00
Adam Treat
a3a6a20146
Don't store db results in ChatLLM.
2023-06-19 15:38:01 -07:00
Adam Treat
0cfe225506
Remove this as unnecessary.
2023-06-19 15:38:01 -07:00
Adam Treat
7c28e79644
Fix regenerate response with references.
2023-06-19 17:52:14 -04:00
AT
f76df0deac
TypeScript (#1022)
...
* Show token generation speed in gui.
* Add typescript/javascript to list of highlighted languages.
2023-06-19 16:12:37 -04:00
AT
2b6cc99a31
Show token generation speed in gui. (#1020)
2023-06-19 14:34:53 -04:00
cosmic-snow
fd419caa55
Minor models.json description corrections. (#1013)
...
Signed-off-by: cosmic-snow <134004613+cosmic-snow@users.noreply.github.com>
2023-06-18 14:10:29 -04:00
Adam Treat
42e8049564
Bump version and new release notes for metal bugfix edition.
2023-06-16 17:43:10 -04:00
Adam Treat
e2c807d4df
Always install metal on apple.
2023-06-16 17:24:20 -04:00
Adam Treat
d5179ac0c0
Fix cmake build.
2023-06-16 17:18:17 -04:00
Adam Treat
d4283c0053
Fix metal and replit.
2023-06-16 17:13:49 -04:00
Adam Treat
0a0d4a714e
New release and bump the version.
2023-06-16 15:20:23 -04:00
Adam Treat
782e1e77a4
Fix up model names that don't begin with 'ggml-'
2023-06-16 14:43:14 -04:00
Adam Treat
b39a7d4fd9
Fix json.
2023-06-16 14:21:20 -04:00
Adam Treat
6690b49a9f
Converts the following to Q4_0
...
* Snoozy
* Nous Hermes
* Wizard 13b uncensored
Uses the filenames from the actual download for these three.
2023-06-16 14:12:56 -04:00
AT
a576220b18
Support loading files if 'ggml' is found anywhere in the name not just at (#1001)
...
the beginning, and add a deprecated flag to models.json so older versions will
show a model but later versions don't. This will allow us to transition away
from models < ggmlv2 and still allow older installs of gpt4all to work. (A
sketch of the relaxed name check follows below.)
2023-06-16 11:09:33 -04:00
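A hedged sketch of the relaxed filename check described above, using Qt's QString; whether a ".bin" suffix is also required is an assumption made here for illustration.

```cpp
#include <QString>

// Previously only names beginning with "ggml-" were accepted; now "ggml"
// anywhere in the name qualifies. The ".bin" suffix check is an assumption.
bool looksLikeGgmlModel(const QString &fileName) {
    return fileName.contains(QLatin1String("ggml"), Qt::CaseInsensitive)
        && fileName.endsWith(QLatin1String(".bin"), Qt::CaseInsensitive);
}
```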
Adam Treat
8953b7f6a6
Fix regression in checked state of db and network.
2023-06-13 20:08:46 -04:00
Aaron Miller
88616fde7f
llmodel: change tokenToString to not use string_view (#968)
...
Fixes a definite use-after-free and likely avoids some other potential ones.
A std::string will convert to a std::string_view automatically, but as soon
as the std::string in question goes out of scope it is already freed and the
string_view is pointing at freed memory. This is *mostly* fine if it's
returning a reference to the tokenizer's internal vocab table, but it's, imo,
too easy to return a reference to a dynamically constructed string this way,
as replit is doing (and unfortunately needs to do to convert the internal
whitespace replacement symbol back to a space). A minimal illustration of the
pitfall follows below.
2023-06-13 07:14:02 -04:00
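To make the pitfall concrete, a minimal self-contained illustration of the dangling string_view described above; decodeToken is a hypothetical stand-in for replit's whitespace-symbol replacement, not the project's actual code.

```cpp
#include <string>
#include <string_view>

// Hypothetical stand-in that dynamically builds a new string for a token.
static std::string decodeToken(int id) {
    return "token " + std::to_string(id);
}

// BAD: std::string converts to std::string_view implicitly, but the local
// string is destroyed on return, so the caller receives a dangling view.
std::string_view tokenToStringView(int id) {
    std::string s = decodeToken(id);
    return s;  // use-after-free waiting to happen
}

// GOOD (the direction of the fix): return std::string by value so the
// data outlives the call.
std::string tokenToString(int id) {
    return decodeToken(id);
}
```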
Adam Treat
68ff7001ad
Bugfixes for prompt syntax highlighting.
2023-06-12 05:55:14 -07:00
Adam Treat
60d95cdd9b
Fix some bugs with bash syntax and add some C23 keywords.
2023-06-12 05:08:18 -07:00
Adam Treat
e986f18904
Add C/C++ highlighting support.
2023-06-12 05:08:18 -07:00
Adam Treat
ae46234261
Spelling error.
2023-06-11 14:20:05 -07:00
Adam Treat
318c51c141
Add code blocks and python syntax highlighting.
2023-06-11 14:20:05 -07:00
Adam Treat
b67cba19f0
Don't interfere with selection.
2023-06-11 14:20:05 -07:00
Adam Treat
50c5b82e57
Clean up the context links a bit.
2023-06-11 14:20:05 -07:00
AT
a9c2f47303
Add new solution for context links that does not force regular markdown (#938)
...
in responses, which is disruptive to code completions.
2023-06-10 10:15:38 -04:00
Aaron Miller
d3ba1295a7
Metal+LLama take two (#929)
...
Support latest llama with Metal
---------
Co-authored-by: Adam Treat <adam@nomic.ai>
Co-authored-by: niansa/tuxifan <tuxifan@posteo.de>
2023-06-09 16:48:46 -04:00
Adam Treat
b162b5c64e
Revert "llama on Metal ( #885 )"
...
This reverts commit c55f81b860.
2023-06-09 15:08:46 -04:00
Aaron Miller
c55f81b860
llama on Metal (#885)
...
Support latest llama with Metal
---------
Co-authored-by: Adam Treat <adam@nomic.ai>
Co-authored-by: niansa/tuxifan <tuxifan@posteo.de>
2023-06-09 14:58:12 -04:00
pingpongching
0d0fae0ca8
Change the default values for generation in the GUI.
2023-06-09 08:51:09 -04:00
Adam Treat
8fb73c2114
Forgot to bump.
2023-06-09 08:45:31 -04:00