Commit Graph

1029 Commits

Author SHA1 Message Date
Adam Treat
85bc861835 Fix the alignment. 2023-06-20 17:40:02 -04:00
Adam Treat
eebfe642c4 Add an error message to download dialog if models.json can't be retrieved. 2023-06-20 17:31:36 -04:00
Adam Treat
968868415e Move saving chats to a thread and display what we're doing to the user. 2023-06-20 17:18:33 -04:00
Adam Treat
c8a590bc6f Get rid of last blocking operations and make the chat/llm thread safe. 2023-06-20 18:18:10 -03:00
Adam Treat
84ec4311e9 Remove duplicated state tracking for chatgpt. 2023-06-20 18:18:10 -03:00
Adam Treat
7d2ce06029 Start working on more thread safety and model load error handling. 2023-06-20 14:39:22 -03:00
Adam Treat
d5f56d3308 Forgot to add a signal handler. 2023-06-20 14:39:22 -03:00
Richard Guo
a39a897e34 0.3.5 bump 2023-06-20 10:21:51 -04:00
Richard Guo
25ce8c6a1e revert version 2023-06-20 10:21:51 -04:00
Richard Guo
282a3b5498 setup.py update 2023-06-20 10:21:51 -04:00
Adam Treat
aa2c824258 Initialize these. 2023-06-19 15:38:01 -07:00
Adam Treat
d018b4c821 Make this atomic. 2023-06-19 15:38:01 -07:00
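The commits above ("Make this atomic.", "make the chat/llm thread safe") concern state shared between the chat and LLM threads. A minimal sketch of the std::atomic pattern such fixes typically use — the struct and field names here are hypothetical, not the actual gpt4all code:

```cpp
#include <atomic>
#include <thread>

// Hypothetical stand-in for state shared between the chat and LLM threads.
struct ChatState {
    std::atomic<bool> modelLoaded{false};  // safe to read/write from any thread
};

// The worker thread flips the flag; the reader observes it without a data race.
bool loadAndCheck() {
    ChatState state;
    std::thread llm([&] { state.modelLoaded.store(true); });
    llm.join();
    return state.modelLoaded.load();
}
```

A plain `bool` written by one thread and read by another is a data race; `std::atomic<bool>` makes the handoff well-defined.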
Adam Treat
a3a6a20146 Don't store db results in ChatLLM. 2023-06-19 15:38:01 -07:00
Adam Treat
0cfe225506 Remove this as unnecessary. 2023-06-19 15:38:01 -07:00
Adam Treat
7c28e79644 Fix regenerate response with references. 2023-06-19 17:52:14 -04:00
AT
f76df0deac
Typescript (#1022)
* Show token generation speed in gui.

* Add typescript/javascript to list of highlighted languages.
2023-06-19 16:12:37 -04:00
AT
2b6cc99a31
Show token generation speed in gui. (#1020) 2023-06-19 14:34:53 -04:00
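The tokens-per-second figure the GUI shows can be computed from a token count and an elapsed time. A minimal sketch under that assumption — this is illustrative, not the actual gpt4all implementation:

```cpp
#include <cstdint>

// Illustrative: tokens generated divided by elapsed wall-clock time,
// scaled from milliseconds to seconds.
double tokensPerSecond(std::int64_t tokens, std::int64_t elapsedMs) {
    if (elapsedMs <= 0)
        return 0.0;  // avoid division by zero before any time has elapsed
    return static_cast<double>(tokens) * 1000.0 / static_cast<double>(elapsedMs);
}
```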
cosmic-snow
fd419caa55
Minor models.json description corrections. (#1013)
Signed-off-by: cosmic-snow <134004613+cosmic-snow@users.noreply.github.com>
2023-06-18 14:10:29 -04:00
cosmic-snow
b00ac632e3
Update python/README.md with troubleshooting info (#1012)
- Add some notes about common Windows problems when trying to make a local build (MinGW and MSVC).

Signed-off-by: cosmic-snow <134004613+cosmic-snow@users.noreply.github.com>
2023-06-18 14:08:43 -04:00
standby24x7
cdea838671
Fix spelling typo in gpt4all.py (#1007)
Signed-off-by: Masanari Iida <standby24x7@gmail.com>
2023-06-18 14:07:46 -04:00
Adam Treat
42e8049564 Bump version and new release notes for metal bugfix edition. 2023-06-16 17:43:10 -04:00
Adam Treat
e2c807d4df Always install metal on apple. 2023-06-16 17:24:20 -04:00
Adam Treat
d5179ac0c0 Fix cmake build. 2023-06-16 17:18:17 -04:00
Adam Treat
d4283c0053 Fix metal and replit. 2023-06-16 17:13:49 -04:00
cosmic-snow
b66d0b4fff
Fix CLI app.py (#910)
- the bindings API changed in 057b9, but the CLI was not updated
- change 'std_passthrough' param to the renamed 'streaming'
- remove '_cli_override_response_callback' as it breaks and is no longer needed
- bump version to 0.3.4
2023-06-16 16:06:22 -04:00
niansa/tuxifan
68f9786ed9
Use operator ""_MiB (#991) 2023-06-16 15:56:22 -04:00
Adam Treat
0a0d4a714e New release and bump the version. 2023-06-16 15:20:23 -04:00
Adam Treat
782e1e77a4 Fix up model names that don't begin with 'ggml-' 2023-06-16 14:43:14 -04:00
Adam Treat
b39a7d4fd9 Fix json. 2023-06-16 14:21:20 -04:00
Adam Treat
6690b49a9f Converts the following to Q4_0
* Snoozy
* Nous Hermes
* Wizard 13b uncensored

Uses the filenames from actual download for these three.
2023-06-16 14:12:56 -04:00
AT
a576220b18
Support loading files if 'ggml' is found anywhere in the name, not just at the beginning (#1001)
Also adds a deprecated flag to models.json so older versions will still show a model, but later versions won't. This will allow us to transition away from models < ggmlv2 and still allow older installs of gpt4all to work.
2023-06-16 11:09:33 -04:00
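The commit above describes two checks: accept a file if "ggml" appears anywhere in its name, and let builds that understand the new flag hide entries that models.json marks as deprecated. A hedged sketch of that logic with illustrative names (the real functions in gpt4all are not reproduced here):

```cpp
#include <string>

// Match "ggml" anywhere in the filename, no longer anchored to the start.
bool looksLikeModelFile(const std::string &filename) {
    return filename.find("ggml") != std::string::npos;
}

// Older clients ignore the deprecated flag and still show the model;
// newer clients that understand it hide deprecated entries.
bool shouldShowModel(bool deprecated, bool clientUnderstandsDeprecation) {
    return !(deprecated && clientUnderstandsDeprecation);
}
```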
Aaron Miller
abc081e48d
fix llama.cpp k-quants (#988)
* enable k-quants on *all* mainline builds
2023-06-15 14:06:14 -07:00
Ettore Di Giacinto
b004c53a7b
Allow setting a SetLibrarySearchPath in the golang bindings (#981)
This is used to identify the path where all the various implementations are located.
2023-06-14 16:27:19 +02:00
Adam Treat
8953b7f6a6 Fix regression in checking of db and network. 2023-06-13 20:08:46 -04:00
Aaron Miller
c4319d2c8e
dlhandle: prevent libs from using each other's symbols (#977)
use RTLD_LOCAL so that symbols are *only* exposed via dlsym

without this all symbols exported by the libs are available for symbol
resolution, resulting in different lib versions potentially resolving
*each other's* symbols, causing incredibly cursed behavior such as
https://gist.github.com/apage43/085c1ff69f6dd05387793ebc301840f6
2023-06-13 14:52:11 -04:00
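The dlhandle fix above hinges on dlopen's visibility flags: with RTLD_LOCAL a library's symbols are not entered into the global namespace, so two versions of the same library can coexist and entry points are resolved explicitly with dlsym on the handle. A POSIX-only sketch of that pattern (class and names are illustrative, not the actual dlhandle code):

```cpp
#include <dlfcn.h>
#include <string>

// Illustrative wrapper: open with RTLD_LOCAL so symbols are *only*
// reachable via dlsym on this handle, never via global resolution.
class Dlhandle {
    void *chandle = nullptr;
public:
    explicit Dlhandle(const std::string &path) {
        chandle = dlopen(path.c_str(), RTLD_LAZY | RTLD_LOCAL);
    }
    ~Dlhandle() { if (chandle) dlclose(chandle); }
    bool valid() const { return chandle != nullptr; }
    void *sym(const char *name) const {
        return chandle ? dlsym(chandle, name) : nullptr;
    }
};
```

With RTLD_GLOBAL (or default global visibility), identically named symbols from different lib versions can resolve against each other, which is exactly the cursed behavior the commit describes.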
Aaron Miller
f71d8efc71
metal replit (#931)
metal+replit

makes replit work with Metal and removes its use of `mem_per_token`
in favor of fixed size scratch buffers (closer to llama.cpp)
2023-06-13 07:29:14 -07:00
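The change above replaces per-token memory estimation (`mem_per_token` times the token count, recomputed per eval) with scratch buffers of a fixed size allocated once at load, the pattern llama.cpp uses. A minimal sketch of that idea — sizes and names here are made up for illustration:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative: allocate fixed-size scratch buffers once at model load
// and reuse them for every eval, instead of sizing memory per token.
struct EvalBuffers {
    std::vector<std::uint8_t> scratch0, scratch1;
    explicit EvalBuffers(std::size_t bytes)
        : scratch0(bytes), scratch1(bytes) {}  // never resized after construction
};
```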
Richard Guo
a9b33c3d10 update setup.py 2023-06-13 09:07:08 -04:00
Richard Guo
a99cc34efb fix prompt context so it's preserved in class 2023-06-13 09:07:08 -04:00
Aaron Miller
85964a7635
bump llama.cpp mainline to latest (#964) 2023-06-13 08:40:38 -04:00
Tim Miller
797891c995
Initial Library Loader for .NET Bindings / Update bindings to support newest changes (#763)
* Initial Library Loader

* Load library as part of Model factory

* Dynamically search and find the dlls

* Update tests to use locally built runtimes

* Fix dylib loading, add macos runtime support for sample/tests

* Bypass automatic loading by default.

* Only set CMAKE_OSX_ARCHITECTURES if not already set, allow cross-compile

* Switch Loading again

* Update build scripts for mac/linux

* Update bindings to support newest breaking changes

* Fix build

* Use llmodel for Windows

* Actually, it does need to be libllmodel

* Name

* Remove TFMs, bypass loading by default

* Fix script

* Delete mac script

---------

Co-authored-by: Tim Miller <innerlogic4321@ghmail.com>
2023-06-13 14:05:34 +02:00
Aaron Miller
88616fde7f
llmodel: change tokenToString to not use string_view (#968)
fixes a definite use-after-free and likely avoids some other
potential ones. A std::string will convert to a std::string_view
automatically, but as soon as the std::string in question goes out of
scope it is freed and the string_view is left pointing at freed
memory. This is *mostly* fine if it's returning a reference to the
tokenizer's internal vocab table, but it's, imo, too easy to return a
reference to a dynamically constructed string with this, as replit is
doing (and unfortunately needs to do, to convert the internal whitespace
replacement symbol back to a space)
2023-06-13 07:14:02 -04:00
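The safe shape the commit above moves toward is returning std::string by value rather than std::string_view, since a view into a string built on the fly dangles once that local string is destroyed. A sketch with illustrative names (not the actual llmodel API); the underscore-to-space replacement stands in for replit's whitespace-marker conversion:

```cpp
#include <string>
#include <unordered_map>

// Illustrative: returning by value copies the result out, so it stays
// valid even when the string was constructed inside the function.
// A std::string_view return here would point at a destroyed local.
std::string tokenToString(int id, const std::unordered_map<int, std::string> &vocab) {
    auto it = vocab.find(id);
    if (it == vocab.end())
        return "";
    std::string s = it->second;
    for (char &c : s)       // stand-in for the whitespace-marker replacement
        if (c == '_') c = ' ';
    return s;  // copy: no dangling view into `s`
}
```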
Felix Zaslavskiy
726dcbd43d
Typo, ignore list (#967)
Fix typo in javadoc,
Add word to ignore list for codespellrc

---------

Co-authored-by: felix <felix@zaslavskiy.net>
2023-06-13 00:53:27 -07:00
Richard Guo
a9bea27537 bring back when condition 2023-06-12 23:11:54 -04:00
Richard Guo
28d1e37d15 let me deploy pls 2023-06-12 23:11:54 -04:00
Richard Guo
5a0b348219 second hold for pypi deploy 2023-06-12 23:11:54 -04:00
Richard Guo
e5cb9f6b7b no need for main 2023-06-12 23:11:54 -04:00
Richard Guo
014205a916 dummy python change 2023-06-12 23:11:54 -04:00
Richard Guo
dcd2bbae9d python workflows in circleci 2023-06-12 23:11:54 -04:00
Richard Guo
e9449190cd version bump 2023-06-12 17:32:56 -04:00
Adam Treat
84deebd223 Fix compile for windows and linux again. PLEASE DON'T REVERT THIS! 2023-06-12 17:08:55 -04:00