Commit Graph

5 Commits

Richard Guo
7a472bea88 Replit Model (#713)
* porting over replit code model to gpt4all

* replaced memory with kv_self struct

* continuing debug

* welp, it built, but a lot of sus things remain

* working model loading and somewhat working generate... need to format the response?

* revert to a semi-working version

* finally got rid of weird formatting

* figured out the problem is with the python bindings - this is good to go for testing

* addressing PR feedback

* output refactor

* fixed prompt response collection

* cleanup

* addressing PR comments

* building the replit backend with the new ggml version code

* add replit support to chatllm and clean up the python files

* cleanup

* updated replit to match new llmodel api

* match llmodel api and change size_t to Token

* resolve PR comments

* replit model commit comment
2023-06-06 17:09:00 -04:00
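
Two of the bullets in the commit above ("replaced memory with kv_self struct" and "change size_t to Token") touch how ggml-style backends in this repo keep per-model state: attention keys and values live in a single cache struct, and vocabulary indices use a dedicated Token type rather than size_t. The sketch below only illustrates that pattern with hypothetical names and toy dimensions; it is not the Replit backend's actual code.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative only: a ggml-style self-attention key/value cache ("kv_self")
// plus a narrow Token type for vocabulary indices, the kind of type the
// commit swaps in for size_t. All names here are hypothetical.
using Token = std::int32_t;

struct KVCacheSketch {
    std::vector<float> k;   // cached attention keys across all layers
    std::vector<float> v;   // cached attention values across all layers
    int n_past = 0;         // tokens already evaluated, so generation can resume

    KVCacheSketch(int n_layer, int n_ctx, int n_embd)
        : k(static_cast<std::size_t>(n_layer) * n_ctx * n_embd),
          v(static_cast<std::size_t>(n_layer) * n_ctx * n_embd) {}
};

int main() {
    // Toy dimensions; real models are far larger.
    KVCacheSketch cache(/*n_layer=*/2, /*n_ctx=*/8, /*n_embd=*/4);
    return cache.n_past;  // 0: nothing evaluated yet
}
```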
Adam Treat
4a317eeb33 Revert "New tokenizer implementation for MPT and GPT-J"
This reverts commit ee3469ba6c.
2023-05-30 12:59:00 -04:00
Aaron Miller
ee3469ba6c New tokenizer implementation for MPT and GPT-J
Improves output quality by making these tokenizers more closely
match the behavior of the huggingface `tokenizers`-based BPE
tokenizers these models were trained with.

Featuring:
 * Fixed Unicode handling (via ICU)
 * Fixed BPE token merge handling
 * Complete added vocabulary handling
2023-05-30 12:05:57 -04:00
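
The "Fixed BPE token merge handling" item above refers to the core loop of a BPE tokenizer: split a word into symbols, then repeatedly merge the adjacent pair with the lowest learned merge rank until no ranked pair is left, which is how the huggingface `tokenizers` BPE behaves. Below is a minimal sketch of that loop with a toy merge table; it is not the code this commit added (and the commit itself was reverted above).

```cpp
#include <iostream>
#include <limits>
#include <map>
#include <string>
#include <vector>

// Greedy BPE merge loop (illustrative only): given per-character symbols and a
// table mapping adjacent symbol pairs to merge ranks, repeatedly merge the
// lowest-ranked pair until no ranked pair remains.
std::vector<std::string> bpeMerge(
        std::vector<std::string> symbols,
        const std::map<std::pair<std::string, std::string>, int> &ranks) {
    while (symbols.size() > 1) {
        int bestRank = std::numeric_limits<int>::max();
        std::size_t bestPos = 0;
        for (std::size_t i = 0; i + 1 < symbols.size(); ++i) {
            auto it = ranks.find({symbols[i], symbols[i + 1]});
            if (it != ranks.end() && it->second < bestRank) {
                bestRank = it->second;
                bestPos = i;
            }
        }
        if (bestRank == std::numeric_limits<int>::max())
            break;  // no mergeable pair left
        symbols[bestPos] += symbols[bestPos + 1];
        symbols.erase(symbols.begin() + bestPos + 1);
    }
    return symbols;
}

int main() {
    // Toy merge table: merge "l"+"o" first, then "lo"+"w".
    std::map<std::pair<std::string, std::string>, int> ranks = {
        {{"l", "o"}, 0}, {{"lo", "w"}, 1}};
    for (const auto &s : bpeMerge({"l", "o", "w"}, ranks))
        std::cout << s << '\n';  // prints "low"
}
```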
Zach Nussbaum
53730c5f7f fix: use right conversion script 2023-05-11 11:20:43 -04:00
Adam Treat
8e7b96bd92 Move the llmodel C API to new top-level directory and version it. 2023-05-10 11:46:40 -04:00
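
Versioning a C API, as the last commit does for llmodel, typically means exposing a version constant in the public header plus a runtime query with C linkage so language bindings can detect a mismatch between the header they compiled against and the library they loaded. The sketch below assumes that pattern and uses hypothetical names; the repository's actual llmodel C header is the authoritative interface.

```cpp
#include <cstdio>

// Hypothetical sketch of a versioned C-compatible API: a compile-time version
// macro for consumers of the header, and a runtime query exported with C
// linkage so bindings can check that header and library agree.
#define LLMODEL_SKETCH_API_VERSION 1

extern "C" int llmodel_sketch_api_version(void) {
    return LLMODEL_SKETCH_API_VERSION;
}

int main() {
    // A binding built against this header would compare its macro with the
    // value reported by the loaded library.
    if (llmodel_sketch_api_version() != LLMODEL_SKETCH_API_VERSION)
        std::fprintf(stderr, "llmodel API version mismatch\n");
    return 0;
}
```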