Commit Graph

980 Commits

Adam Treat
5b71d39024 Don't crash if state has not been set. 2023-05-05 10:00:17 -04:00
Richard Guo
7ab7d948b5 Update monorepo_plan.md 2023-05-05 09:32:45 -04:00
Aaron Miller
019f6d0103 include <cstdint> in llmodel.h 2023-05-04 20:36:19 -04:00
Adam Treat
f291853e51 First attempt at providing a persistent chat list experience.
Limitations:

1) Context is not restored for gpt-j models
2) When you switch between different model types in an existing chat,
   the context and all of the conversation are lost
3) The settings are not chat or conversation specific
4) The persisted chat files are very large due to how much data the
   llama.cpp backend tries to persist. Need to investigate how we can
   shrink this.
2023-05-04 15:31:41 -04:00
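
For a sense of where the size in limitation 4 comes from, here is a minimal sketch of per-chat persistence assuming a QDataStream container; the ChatSketch type and its members are illustrative placeholders, not the app's actual format.

```cpp
// ChatSketch and its fields are illustrative, not the app's actual chat class.
#include <QByteArray>
#include <QDataStream>
#include <QFile>
#include <QString>

struct ChatSketch {
    QString id;
    QString name;
    QByteArray modelState;   // raw llama.cpp backend state; this blob is what
                             // makes the persisted chat files so large (item 4)

    bool save(const QString &path) const {
        QFile file(path);
        if (!file.open(QIODevice::WriteOnly))
            return false;
        QDataStream out(&file);
        out << id << name << modelState;
        return out.status() == QDataStream::Ok;
    }

    bool load(const QString &path) {
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly))
            return false;
        QDataStream in(&file);
        in >> id >> name >> modelState;
        return in.status() == QDataStream::Ok;
    }
};
```

Shrinking the files would most likely mean persisting less of that backend state, or compressing it, rather than changing the container.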
Adam Treat
081d32bd97 Restore the model when switching chats. 2023-05-03 12:45:14 -04:00
Adam Treat
0bb52fc5fe Experiment with a much shorter default prompt template. 2023-05-03 12:19:14 -04:00
Adam Treat
82c1d08b33 Add reverse prompts for llama models. 2023-05-03 11:58:26 -04:00
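
A minimal sketch of what a reverse prompt amounts to: a stop string that ends generation once the tail of the accumulated response matches it. The function name and the stop strings below are illustrative assumptions, not the exact set this commit adds.

```cpp
// Free function named for illustration; the real check sits inside the
// backend's token-generation loop.
#include <string>
#include <vector>

bool hitsReversePrompt(const std::string &response,
                       const std::vector<std::string> &stops)
{
    for (const std::string &stop : stops) {
        if (response.size() >= stop.size() &&
            response.compare(response.size() - stop.size(),
                             stop.size(), stop) == 0)
            return true;   // the response now ends with a stop string
    }
    return false;
}

// Usage sketch: break out of the generation loop once the accumulated text
// ends with one of the stop strings (the list here is illustrative).
// std::vector<std::string> stops = { "### Instruction:", "### Prompt" };
// if (hitsReversePrompt(accumulated, stops)) break;
```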
Adam Treat
01accf9e33 Don't exceed the window size for dialogs. 2023-05-03 08:37:45 -04:00
Adam Treat
0f70289ba4 Changes the datalake feature so all conversations are captured when opted-in. 2023-05-03 07:54:45 -04:00
Aaron Miller
edad3baa99 download: make model downloads resumable
* save files as `incomplete-{filename}` in the dest folder
* rename into place after hash is confirmed or delete if hash is bad
* resume downloads using http `range`
* if the download is resumed from a different app session, rewind a bit -
  this is to deal with the case where the file size changes before
  the content is fully flushed out
* flush dest file at end of readyRead, this mitigates the above
  and provides backpressure on the download if the destination disk
  is slower than the network connection
2023-05-02 20:36:25 -04:00
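
A hedged sketch of the resume path described above, assuming Qt networking (the stack this client uses); the resumeDownload() helper, its parameters, and the one-megabyte rewind amount are illustrative stand-ins rather than the actual implementation.

```cpp
// Hypothetical helper; the real logic lives in the app's download handler.
#include <QFile>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QObject>
#include <QUrl>

void resumeDownload(QNetworkAccessManager *manager, const QUrl &url,
                    const QString &destDir, const QString &filename)
{
    // Partial data is staged as "incomplete-{filename}" and only renamed
    // into place once the hash of the finished file checks out.
    auto *file = new QFile(destDir + "/incomplete-" + filename);
    if (!file->open(QIODevice::ReadWrite)) {
        delete file;
        return;
    }

    // Rewind a bit (1 MiB here is an arbitrary illustrative amount) in case
    // the tail of the file was never flushed by a previous session, then ask
    // the server for only the remaining bytes.
    qint64 offset = qMax<qint64>(0, file->size() - 1024 * 1024);
    file->seek(offset);

    QNetworkRequest request(url);
    request.setRawHeader("Range", "bytes=" + QByteArray::number(offset) + "-");
    QNetworkReply *reply = manager->get(request);

    // Flushing on every readyRead keeps the on-disk size honest and applies
    // backpressure when the destination disk is slower than the network.
    QObject::connect(reply, &QNetworkReply::readyRead, [reply, file]() {
        file->write(reply->readAll());
        file->flush();
    });
    QObject::connect(reply, &QNetworkReply::finished, [reply, file]() {
        // The real code verifies the hash here, then renames the file into
        // place or deletes it if the hash is bad.
        file->close();
        delete file;
        reply->deleteLater();
    });
}
```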
Adam Treat
4a09f0f0ec More extensive usage stats to help diagnose errors and problems in the UI. 2023-05-02 20:31:17 -04:00
Adam Treat
cb085a6418 Some more startup info to help determine what hardware we need to support. 2023-05-02 16:24:06 -04:00
Adam Treat
21dc522200 Don't block the GUI when reloading via combobox. 2023-05-02 15:02:25 -04:00
Adam Treat
48837a62fa Provide a confirm button for deletion of chats. 2023-05-02 12:36:21 -04:00
Adam Treat
bb3e08e3dd Use different border colors if we're current or being edited. 2023-05-02 11:34:39 -04:00
Adam Treat
f4f27fc38f Update the right index when removing. 2023-05-02 11:26:21 -04:00
Adam Treat
f13f4f4700 Generate names via llm. 2023-05-02 11:19:17 -04:00
Zach Nussbaum
2c8e1096c5 Merge pull request #472 from berkantay/main
Update README.md
2023-05-02 10:15:40 -04:00
Adam Treat
a62fafc308 Always have a chat. 2023-05-02 09:07:28 -04:00
Adam Treat
86132cfc8b Don't add new chats willy-nilly. 2023-05-02 07:53:09 -04:00
Adam Treat
118e0bdc44 Allow removing chats. 2023-05-01 20:56:53 -04:00
Adam Treat
412cad99f2 Hot swapping of conversations. Destroys context for now. 2023-05-01 20:27:07 -04:00
Adam Treat
a48226613c Turn the chat list into a model. 2023-05-01 17:13:20 -04:00
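
A hedged sketch of what "turn the chat list into a model" typically looks like in Qt: a QAbstractListModel that QML views can bind to. The Chat struct, the role names, and addChat() here are assumptions for illustration, not the project's actual chat list model.

```cpp
// Illustrative model shape only; names are assumptions.
#include <QAbstractListModel>
#include <QByteArray>
#include <QHash>
#include <QList>
#include <QString>
#include <QVariant>

struct Chat { QString id; QString name; };

class ChatListModel : public QAbstractListModel {
    Q_OBJECT
public:
    enum Roles { IdRole = Qt::UserRole + 1, NameRole };

    int rowCount(const QModelIndex &parent = QModelIndex()) const override {
        return parent.isValid() ? 0 : int(m_chats.count());
    }

    QVariant data(const QModelIndex &index, int role) const override {
        if (!index.isValid() || index.row() >= m_chats.count())
            return QVariant();
        const Chat &chat = m_chats.at(index.row());
        switch (role) {
        case IdRole:   return chat.id;
        case NameRole: return chat.name;
        }
        return QVariant();
    }

    QHash<int, QByteArray> roleNames() const override {
        return { { IdRole, "id" }, { NameRole, "name" } };
    }

    void addChat(const Chat &chat) {
        // begin/endInsertRows keeps any attached QML view (e.g. the chat
        // drawer) in sync without manual refreshes.
        beginInsertRows(QModelIndex(), int(m_chats.count()), int(m_chats.count()));
        m_chats.append(chat);
        endInsertRows();
    }

private:
    QList<Chat> m_chats;
};
```

A QML ListView would then bind to an instance of this model and read `name` in its delegate; removals and renames follow the same begin/end notification pattern.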
Richard Guo
b5df9c7cb1 rough draft of monorepo plan 2023-05-01 15:45:39 -04:00
Richard Guo
02d1bdb0be monorepo structure 2023-05-01 15:45:23 -04:00
Adam Treat
679b61ee07 Provide convenience methods for adding/removing/changing chat. 2023-05-01 14:24:16 -04:00
Adam Treat
8f80f8e3a2 Break out the drawer into own component. 2023-05-01 13:51:46 -04:00
Adam Treat
6e6b96375d Handle the forwarding of important signals from the LLM object so QML doesn't have to deal with which chat is current. 2023-05-01 12:41:03 -04:00
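
A small sketch of the forwarding idea, assuming Qt signal-to-signal connections; both classes and their signal sets are illustrative, not the project's actual LLM and Chat interfaces.

```cpp
// QML connects to one stable object (LLM) and never tracks which chat is
// current; the LLM object re-emits the current chat's signals.
#include <QObject>

class Chat : public QObject {
    Q_OBJECT
signals:
    void responseChanged();
    void modelLoadingChanged();
};

class LLM : public QObject {
    Q_OBJECT
public:
    void setCurrentChat(Chat *chat) {
        if (m_currentChat)
            m_currentChat->disconnect(this);   // drop forwards from the old chat
        m_currentChat = chat;
        if (!chat)
            return;
        // Signal-to-signal connections re-emit the chat's signals from here.
        connect(chat, &Chat::responseChanged, this, &LLM::responseChanged);
        connect(chat, &Chat::modelLoadingChanged, this, &LLM::modelLoadingChanged);
    }

signals:
    void responseChanged();
    void modelLoadingChanged();

private:
    Chat *m_currentChat = nullptr;
};
```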
Adam Treat
c0d4a9d426 Continue to shrink the API space for QML and the backend. 2023-05-01 12:30:54 -04:00
Adam Treat
ed59190e48 Consolidate these into a single API from QML to the backend. 2023-05-01 12:24:51 -04:00
Adam Treat
4d87c46948 Major refactor in prep for multiple conversations. 2023-05-01 09:10:05 -04:00
Adam Treat
e005ab8c0a Move the reset and id into the chat object. 2023-04-30 21:05:54 -04:00
Adam Treat
d1e3198b65 Add new C++ version of the chat model. Getting ready for chat history. 2023-04-30 20:28:43 -04:00
AT
65d4b8a886 Update README.md 2023-04-30 16:07:59 -04:00
AT
d3d8229b04 Update README.md 2023-04-30 09:05:26 -04:00
AT
8e696bb4e4 Update README.md 2023-04-30 08:54:45 -04:00
Adam Treat
9f323759ce Remove these, as they are mitigated by the repeat penalty and models really should train this out. 2023-04-30 08:02:39 -04:00
AT
ef2e1bd4fe Update README.md 2023-04-30 07:07:22 -04:00
AT
f37e9f9039 Update README.md 2023-04-30 07:02:01 -04:00
Adam Treat
13401fc52f Bump the version. 2023-04-29 21:04:47 -04:00
Adam Treat
a6ca45c9dd Use the universal sep. 2023-04-29 21:03:10 -04:00
Berkant
aefea2e713 Update README.md
README.md typo fix.
2023-04-30 01:07:14 +03:00
AT
573e4e1f73 Update README.md 2023-04-29 17:49:18 -04:00
AT
84ffd801ec Update README.md 2023-04-29 17:48:00 -04:00
Adam Treat
727a74de6c Make an offline installer option. 2023-04-29 12:13:11 -04:00
Adam Treat
c4d312ae11 Don't attempt to send shutdown which won't work anyway. 2023-04-29 11:07:14 -04:00
Adam Treat
8aed93daa5 Send optout for real and only once. 2023-04-29 11:05:44 -04:00
Adam Treat
8fe60c29fb Don't set the app version in the llmodel. 2023-04-29 10:31:12 -04:00
Adam Treat
9ebf2537fa Bump the version. 2023-04-29 08:56:53 -04:00
Adam Treat
3cf8f0da13 New version of icns made on a Mac. 2023-04-29 08:40:54 -04:00