Commit Graph

267 Commits

Adam Treat
1f3d4e487f Bump the version. 2023-04-29 08:56:53 -04:00
Adam Treat
f70a549975 New version of icns made on a mac. 2023-04-29 08:40:54 -04:00
Adam Treat
6d0bae5362 Add 1024 resolution to icns. 2023-04-29 04:39:55 -04:00
Adam Treat
2e21fb9c81 Fixup icns 2023-04-29 04:38:36 -04:00
Adam Treat
7d0970cbe2 Rework the icon a bit to more closely match macOS style guidelines. 2023-04-29 04:31:06 -04:00
Adam Treat
264408f5bd Always hardcode. 2023-04-29 04:06:26 -04:00
Adam Treat
edbd48fe22 Require a direct choice for opt-in 2023-04-29 03:55:06 -04:00
Adam Treat
6bdd866b6d Always hardcode. 2023-04-28 22:46:01 -04:00
Adam Treat
90821b6581 Fixup. 2023-04-28 22:37:59 -04:00
Adam Treat
c9c8c60db7 Try to fix uninstall of symlink. 2023-04-28 22:28:11 -04:00
Adam Treat
a018a3176e Set the folder when the browse dialog opens 2023-04-28 22:24:59 -04:00
Adam Treat
42cf4e9899 Force ini format for all platforms. 2023-04-28 22:21:23 -04:00
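Presumably this refers to QSettings' storage backend: by default Qt writes settings to the Windows registry and to macOS property lists, so forcing INI gives one on-disk format everywhere. A minimal sketch of that call, assuming QSettings is the settings layer in question:

```cpp
#include <QSettings>

int main(int argc, char *argv[]) {
    // Sketch: make every default-constructed QSettings instance use an INI
    // file on disk instead of the Windows registry or macOS plists, so
    // settings behave identically on all platforms.
    QSettings::setDefaultFormat(QSettings::IniFormat);
    // ... application startup continues as usual ...
    return 0;
}
```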
Adam Treat
f2b1000fcc No need to install so many icons. 2023-04-28 22:10:41 -04:00
Adam Treat
49d55dd295 Don't delete symlink unless we're uninstalling. 2023-04-28 22:07:37 -04:00
Adam Treat
8bdbcfeaa6 Remove symlink when uninstalling. 2023-04-28 21:51:39 -04:00
Adam Treat
f3663cc55e Fix the icons more. 2023-04-28 21:48:10 -04:00
Adam Treat
7c9b936408 Fix icons. 2023-04-28 21:40:45 -04:00
Adam Treat
6df4f8783f Correct the macOS symlink. 2023-04-28 21:26:38 -04:00
Adam Treat
312f1dc354 Fix icons and try to make the macOS experience happier. 2023-04-28 21:19:12 -04:00
Aaron Miller
c274a03fe7 use C locale for DoubleValidator
Closes https://github.com/nomic-ai/gpt4all-chat/issues/126
2023-04-28 20:45:40 -04:00
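The underlying issue: QDoubleValidator defaults to the system locale, so on locales that use "," as the decimal separator it rejects inputs like "0.7". A minimal sketch of the C-locale fix, shown here with Qt Widgets (the field, range, and precision are illustrative, not the app's actual code):

```cpp
#include <QDoubleValidator>
#include <QLineEdit>
#include <QLocale>

// Sketch: attach a validator that always accepts "." as the decimal
// separator, regardless of the user's system locale.
void useCLocaleValidator(QLineEdit *field) {
    // Range and precision are illustrative, not the app's real limits.
    auto *validator = new QDoubleValidator(0.0, 1.0, 2, field);
    validator->setLocale(QLocale(QLocale::C)); // "C" locale: "." decimal point
    field->setValidator(validator);
}
```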
Aaron Miller
3db5337ed5 put chat.exe in 'bin' folder of build tree
because llama.cpp's CMakeLists also sets this output directory:
https://github.com/ggerganov/llama.cpp/blob/master/CMakeLists.txt#L11
That is where libllama.dll winds up, so attempts to run the chat UI from
Qt Creator on Windows fail because libllama.dll isn't found. I've been
working around this by copying libllama.dll *out* of bin/, but I've been
bitten a few times by forgetting to keep doing that and letting the build
get out of sync.
2023-04-28 20:45:02 -04:00
Adam Treat
16633b17b8 Convert new ico and icns logos. 2023-04-28 20:40:35 -04:00
Adam Treat
464ba49ce6 Add a requires field for the models.json for future proofing. 2023-04-28 20:30:52 -04:00
Adam Treat
2a5b34b193 Load models from filepath only. 2023-04-28 20:15:10 -04:00
Adam Treat
187e358789 Update ignore. 2023-04-28 14:11:56 -04:00
Adam Treat
5aecb3c0e2 Fix bug with startup order and new logos. 2023-04-28 14:11:18 -04:00
Adam Treat
70ab18f644 Update to latest llama.cpp 2023-04-28 11:03:16 -04:00
Adam Treat
812431f78d New startup dialog features. 2023-04-28 11:03:16 -04:00
Adam Treat
ac2aba313a Fix settings dialog to use onClosed handler. 2023-04-28 11:03:16 -04:00
Aaron Miller
d224d2a2f7 make download path in settings match dl dialog 2023-04-27 17:41:38 -04:00
Adam Treat
83f08b6c29 Small fix. 2023-04-27 16:45:24 -04:00
Adam Treat
a6679b18bd Allow changing the download path from the download dialog, plus other fixes. 2023-04-27 16:27:53 -04:00
Adam Treat
6f94c7e84b Provide a description and make the downloader cleaner and prettier. 2023-04-27 14:52:40 -04:00
Adam Treat
c6c5e0bb4f Always try and load default model first. Groovy is the default default. 2023-04-27 13:52:29 -04:00
Adam Treat
83fb05345e Make the input area wrap automatically. 2023-04-27 11:54:53 -04:00
Adam Treat
80bcbcd137 Silence warning. 2023-04-27 11:44:41 -04:00
Adam Treat
a3253c4ab1 Move the saving of the tokens to the impl and not the callback's responsibility. 2023-04-27 11:16:51 -04:00
Adam Treat
9a65f73392 Move the promptCallback to own function. 2023-04-27 11:08:15 -04:00
Adam Treat
ebf660d2bd Provide an initial impl. of the C interface. NOTE: has not been tested. 2023-04-27 09:43:24 -04:00
Adam Treat
5c3c1317f8 Track check for updates. 2023-04-27 07:41:23 -04:00
Adam Treat
368cd8e119 Add this and unbreak the build. 2023-04-26 22:45:10 -04:00
Aaron Miller
d1c69e8234 download: don't read whole file into ram to md5 it
we go to the trouble of using a tempfile, so there's no reason to
reintroduce reading the whole file into RAM just to checksum it
2023-04-26 22:14:21 -04:00
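Incremental hashing is the natural fix here; a minimal sketch with Qt's QCryptographicHash (the function name and chunk size are illustrative):

```cpp
#include <QCryptographicHash>
#include <QFile>

// Sketch: hash a file in fixed-size chunks so memory use stays flat no
// matter how large the downloaded model is.
QByteArray fileMd5(const QString &path) {
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return {};
    QCryptographicHash hash(QCryptographicHash::Md5);
    while (!file.atEnd())
        hash.addData(file.read(64 * 1024)); // feed 64 KiB at a time
    return hash.result().toHex();
}
```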
Aaron Miller
1f371be395 download: atomically move tempfile when possible
should avoid needless time and I/O, and eliminates the possibility of the
file being left improperly truncated, when the temp file is on the same
filesystem as the destination path
2023-04-26 22:14:21 -04:00
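The pattern, sketched with Qt (names are illustrative; the temp file is assumed to have been created in the destination's directory so the rename stays on one filesystem):

```cpp
#include <QFile>
#include <QTemporaryFile>

// Sketch: move a finished download into place. A same-filesystem rename is
// atomic, so no reader ever sees a partially written file; if the rename
// must cross filesystems, QFile::rename falls back to copy-and-delete.
bool commitDownload(QTemporaryFile &temp, const QString &destination) {
    temp.setAutoRemove(false);       // keep the file past the object's lifetime
    QFile::remove(destination);      // rename() fails if the target exists
    return temp.rename(destination); // atomic when source and target share a filesystem
}
```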
Adam Treat
eafb98b3a9 Initial support for opt-in telemetry. 2023-04-26 22:05:56 -04:00
Adam Treat
70e6b45123 Don't crash when prompt is too large. 2023-04-26 19:08:37 -04:00
Adam Treat
562906da11 Unnecessary after all. 2023-04-26 18:35:53 -04:00
Adam Treat
c409dbfa7a Put this before. 2023-04-26 13:54:25 -04:00
Adam Treat
65f15acc41 Signing ident. 2023-04-26 13:33:33 -04:00
Adam Treat
b04ab8fb5c Update llama.cpp submodule to latest. 2023-04-26 11:50:05 -04:00
Adam Treat
76a32c2fcd Add back option. 2023-04-26 11:02:05 -04:00