Commit Graph

963 Commits

Author SHA1 Message Date
Richard Guo
113d04dce3 some cleanup and for job specific names for circleci 2023-05-10 16:40:24 -04:00
Richard Guo
3668cf00cf clean up and jank windows wheel build 2023-05-10 15:58:27 -04:00
Richard Guo
38f5c28b73 why is there no way of stopping pipelines on branches 2023-05-10 14:11:13 -04:00
Richard Guo
d59ae64fa7 fixed paths for c lib 2023-05-10 14:07:56 -04:00
Richard Guo
65292d8721 filter jobs on main branch only 2023-05-10 14:03:13 -04:00
Richard Guo
239a5c14ef refactor circle ci config 2023-05-10 13:57:54 -04:00
Richard Guo
6ee9659905 updated README with new paths 2023-05-10 13:48:36 -04:00
Richard Guo
4cec72fe75 updated path 2023-05-10 13:41:19 -04:00
Richard Guo
8c84c24ee9 transfer python bindings code 2023-05-10 13:38:32 -04:00
AT
f8fdcccc5d Update README.md 2023-05-10 12:18:45 -04:00
AT
09e161e20f Update README.md 2023-05-10 12:17:57 -04:00
AT
507a96b2f3 Update README.md 2023-05-10 12:10:33 -04:00
Andriy Mulyar
282204de7c Create old-README.md 2023-05-10 12:06:43 -04:00
Andriy Mulyar
e97f21000e Update README.md 2023-05-10 12:05:42 -04:00
Adam Treat
d918b02c29 Move the llmodel C API to new top-level directory and version it. 2023-05-10 11:46:40 -04:00
Andriy Mulyar
2e89a1847a Merge pull request #520 from nomic-ai/monorepo
Make Monorepos Cool Again 2023
2023-05-10 11:16:36 -04:00
Adam Treat
b00684bb91 Fix ignore for build dirs. 2023-05-10 10:51:47 -04:00
Adam Treat
a971831ed2 Merge commit gpt4all-chat into monorepo 2023-05-10 10:28:36 -04:00
Adam Treat
6015154bef Moving everything to subdir for monorepo merge. 2023-05-10 10:26:55 -04:00
AT
bd0250a6f0 Update README.md
Signed-off-by: AT <manyoso@users.noreply.github.com>
2023-05-10 09:09:29 -04:00
AT
c3062425be Update README.md
Signed-off-by: AT <manyoso@users.noreply.github.com>
2023-05-10 09:09:04 -04:00
Adam Treat
0f1d4eaa90 Bump the version to 2.4.2 2023-05-10 09:05:39 -04:00
AT
88a0ee3509 Update issue templates 2023-05-09 23:57:06 -04:00
Adam Treat
14412996e3 Fix some usage events. 2023-05-09 23:43:16 -04:00
Adam Treat
45d7967438 Default to true for compat hardware. 2023-05-09 23:17:36 -04:00
AT
8afae10808 Update README.md 2023-05-09 23:10:53 -04:00
AT
b9669a3f10 Update README.md 2023-05-09 23:10:06 -04:00
AT
dd3e0f8679 Update README.md 2023-05-09 23:04:54 -04:00
Adam Treat
64aff8a35b Rename to build_and_run.md 2023-05-09 23:02:41 -04:00
AT
b185b9da0d Update dev_setup.md 2023-05-09 23:00:50 -04:00
AT
cf61b8259f Update dev_setup.md 2023-05-09 22:36:02 -04:00
AT
daa0801555 Update dev_setup.md 2023-05-09 22:00:42 -04:00
AT
1a40be68fd Update dev_setup.md 2023-05-09 21:59:11 -04:00
Adam Treat
999ed1b560 Add a page to fill in for setting up a dev environment. 2023-05-09 21:38:24 -04:00
Adam Treat
80bd55590f Shorten text. 2023-05-09 20:54:16 -04:00
Adam Treat
42926a484f Couple of bugfixes. 2023-05-09 19:15:18 -04:00
Adam Treat
2206fa7f8c Provide a user default model setting and honor it. 2023-05-09 17:10:47 -04:00
Adam Treat
069c243f1a Add MPT info to the download list and fix it so that isDefault will work even if the required version isn't there. 2023-05-09 12:09:49 -04:00
Adam Treat
a13dcfb13b Move this script and rename. 2023-05-09 11:48:32 -04:00
Adam Treat
9c008fb677 Simplify. 2023-05-09 11:46:33 -04:00
Adam Treat
53a39b9ecf Don't keep this in memory when it is not needed. 2023-05-08 21:05:50 -04:00
Adam Treat
5f372bd881 Gracefully handle when we have a previous chat where the model that it used has gone away. 2023-05-08 20:51:03 -04:00
Adam Treat
8b80345c98 Copy pasta. 2023-05-08 19:10:22 -04:00
Adam Treat
af4a67c109 Fix for special im_end token in mpt-7b-chat model. 2023-05-08 18:57:40 -04:00
Adam Treat
d3ec333314 Allow these to load for gptj too. 2023-05-08 18:31:20 -04:00
Aaron Miller
5002614b20 mpt: allow q4_2 quantized models to load 2023-05-08 18:23:36 -04:00
Aaron Miller
832720dd27 mpt tokenizer: better special token handling
closer to the behavior of huggingface `tokenizers`,
do not attempt to handle additional tokens as if they were part
of the original vocabulary as this cannot prevent them from being
split into smaller chunks - handle added tokens *before*
the regular tokenizing pass

note this is still necessary even with a "proper" tokenizer implementation
2023-05-08 18:23:36 -04:00
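The tokenizer commit above (832720dd27) boils down to: recognize added/special tokens such as `<|im_end|>` in a pre-pass, emit their ids directly, and run the regular vocabulary tokenizer only on the text between them, so a special token can never be split into sub-word chunks. A minimal Python sketch of that idea, assuming a hypothetical `added_tokens` map and a caller-supplied `regular_tokenize` function (illustrative only, not the actual mpt.cpp implementation):

```python
import re
from typing import Callable, Dict, List

def tokenize_with_added_tokens(text: str,
                               added_tokens: Dict[str, int],
                               regular_tokenize: Callable[[str], List[int]]) -> List[int]:
    """Handle added/special tokens *before* the regular tokenizing pass."""
    if not added_tokens:
        return regular_tokenize(text)
    # Split the input on any added token; the capturing group keeps the
    # separators, and matching longer specials first prevents a shorter
    # special that is a prefix of a longer one from winning.
    pattern = "(" + "|".join(re.escape(t) for t in
                             sorted(added_tokens, key=len, reverse=True)) + ")"
    ids: List[int] = []
    for piece in re.split(pattern, text):
        if piece in added_tokens:
            ids.append(added_tokens[piece])      # special token: direct id lookup
        elif piece:                              # skip empty segments from the split
            ids.extend(regular_tokenize(piece))  # regular pass never sees the special
    return ids
```

With a mapping like `{"<|im_end|>": some_id}` (the id here is a placeholder, not MPT's real value), the marker is emitted as a single id and only the surrounding text goes through the normal vocabulary pass.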
Adam Treat
8c4b8f215f Fix gptj to have lower memory requirements for kv cache and add versioning to the internal state to smoothly handle such a fix in the future. 2023-05-08 17:23:02 -04:00
Adam Treat
ccbd16cf18 Fix the version. 2023-05-08 16:50:21 -04:00
Adam Treat
a549871220 Remove as upstream has removed. 2023-05-08 15:09:23 -04:00