Author      Commit      Date                        Message
Adam Treat  34407f1563  2023-04-29 10:31:12 -04:00  Don't set the app version in the llmodel.
Adam Treat  2a5b34b193  2023-04-28 20:15:10 -04:00  Load models from filepath only.
Adam Treat  70ab18f644  2023-04-28 11:03:16 -04:00  Update to latest llama.cpp
Adam Treat  a3253c4ab1  2023-04-27 11:16:51 -04:00  Move the saving of the tokens to the impl and not the callbacks responsibility.
Adam Treat  9a65f73392  2023-04-27 11:08:15 -04:00  Move the promptCallback to own function.
Adam Treat  ebf660d2bd  2023-04-27 09:43:24 -04:00  Provide an initial impl. of the C interface. NOTE: has not been tested.
Adam Treat  368cd8e119  2023-04-26 22:45:10 -04:00  Add this and unbreak the build.
Adam Treat  eafb98b3a9  2023-04-26 22:05:56 -04:00  Initial support for opt-in telemetry.
Adam Treat  70e6b45123  2023-04-26 19:08:37 -04:00  Don't crash when prompt is too large.
Adam Treat  b04ab8fb5c  2023-04-26 11:50:05 -04:00  Update llama.cpp submodule to latest.
Adam Treat  ebc51b3e8d  2023-04-26 08:22:38 -04:00  Clean up the docs a bit more still.
Adam Treat  ae7ca04408  2023-04-26 08:22:38 -04:00  Clean up the docs a bit more.
Adam Treat  4e5c4927fc  2023-04-26 08:22:38 -04:00  Clean up the docs a bit.
Adam Treat  04190e6107  2023-04-26 08:22:38 -04:00  Only need one opaque pointer.
Adam Treat  d86b441c5d  2023-04-26 08:22:38 -04:00  Fixup the api a bit.
Adam Treat  4b47478626  2023-04-26 08:22:38 -04:00  Move the backend code into own subdirectory and make it a shared library. Begin fleshing out the C api wrapper that bindings can use.