sigoden
179a5f5749
refactor: update vertexai/gemini/ernie clients ( #309 )
2024-02-16 18:32:33 +08:00
sigoden
5e4210980d
feat: support vertexai ( #308 )
2024-02-15 08:10:30 +08:00
sigoden
3bf0c371e4
feat: update openai/qianwen/gemini models ( #306 )
2024-02-13 13:07:23 +08:00
Joseph Goulden
25b27c148a
fix: do not attempt to deserialize zero byte chunks in ollama stream ( #303 )
2024-02-05 21:24:37 +08:00
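As a hedged illustration of the guard such a fix implies (not the project's actual code), a zero-byte chunk can simply be skipped before any deserialization is attempted:

```rust
use serde_json::Value;

/// Illustrative only: skip empty chunks before attempting to parse them,
/// so a zero-byte read does not turn into a spurious deserialization error.
fn parse_chunk(chunk: &[u8]) -> Option<Value> {
    if chunk.is_empty() {
        return None; // nothing to deserialize in a zero-byte chunk
    }
    serde_json::from_slice(chunk).ok()
}

fn main() {
    assert_eq!(parse_chunk(b""), None);
    assert!(parse_chunk(br#"{"response":"hi","done":false}"#).is_some());
}
```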
Kelvie Wong
176ff6f83e
feat: add extra_fields to models of localai/ollama clients ( #298 )
...
* Add an "extra_fields" config to localai models
Because there are so many local AIs out there with a bunch of custom
parameters you can set, this allows users to send in extra parameters to
a local LLM runner, such as, e.g. `instruction_template: Alpaca`, so
that Mixtral can take a system prompt.
* support ollama
---------
Co-authored-by: sigoden <sigoden@gmail.com>
2024-01-30 19:43:55 +08:00
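A minimal sketch of how such per-model extra fields could be merged into the JSON request body; the function and the serde_json plumbing here are assumptions for illustration, not the client's actual code:

```rust
use serde_json::{json, Map, Value};

/// Illustrative only: merge user-supplied extra fields (e.g. from a model's
/// `extra_fields` config entry) into the JSON body sent to the local runner.
fn apply_extra_fields(body: &mut Value, extra: &Map<String, Value>) {
    if let Value::Object(obj) = body {
        for (key, value) in extra {
            obj.insert(key.clone(), value.clone());
        }
    }
}

fn main() {
    let mut body = json!({ "model": "mixtral", "prompt": "hello" });
    // e.g. `instruction_template: Alpaca`, as in the commit message
    let mut extra = Map::new();
    extra.insert("instruction_template".into(), Value::String("Alpaca".into()));
    apply_extra_fields(&mut body, &extra);
    assert_eq!(body["instruction_template"], "Alpaca");
}
```

In the real setup these fields would come from the model's `extra_fields` entry in the configuration, as the commit message describes.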
sigoden
a30c3cc4c1
feat: add openai.api_base config ( #302 )
2024-01-30 18:28:12 +08:00
sigoden
fe35cfd941
feat: support model capabilities ( #297 )
...
1. automatically switch to the model that has the necessary capabilities.
2. throw an error if the client does not have a model with the necessary capabilities (see the selection sketch after this entry).
2024-01-13 19:52:07 +08:00
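A hypothetical sketch of that selection rule; the capability flags and model type here are invented for illustration and are not the project's actual types:

```rust
/// Illustrative capability flags (assumed, not the project's real set).
#[derive(Clone, Copy, Debug)]
struct Capabilities {
    text: bool,
    vision: bool,
}

#[derive(Debug)]
struct Model {
    name: &'static str,
    caps: Capabilities,
}

/// Pick the first model that covers every required capability,
/// or report an error if the client has none (rules 1 and 2 above).
fn select_model<'a>(models: &'a [Model], need: Capabilities) -> Result<&'a Model, String> {
    models
        .iter()
        .find(|m| (!need.text || m.caps.text) && (!need.vision || m.caps.vision))
        .ok_or_else(|| "no model with the necessary capabilities".to_string())
}

fn main() {
    let models = [
        Model { name: "gpt-3.5-turbo", caps: Capabilities { text: true, vision: false } },
        Model { name: "gpt-4-vision-preview", caps: Capabilities { text: true, vision: true } },
    ];
    match select_model(&models, Capabilities { text: true, vision: true }) {
        Ok(m) => println!("switched to {}", m.name),
        Err(e) => eprintln!("{e}"),
    }
}
```

This mirrors the two rules in the commit message: switch to a capable model automatically, otherwise fail with an error.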
Nicola Coretti
4e99df4c1b
fix: deprecation warning of .read command ( #296 )
2024-01-13 08:18:04 +08:00
sigoden
665f7ff4b9
fix: copy on linux wayland ( #288 )
2023-12-28 21:52:45 +08:00
sigoden
e198e35a28
chore: update readme to remove pacman installation
2023-12-28 02:13:46 +00:00
sigoden
cb542bef08
chore: update readme
2023-12-27 01:27:59 +00:00
sigoden
860f098082
chore: update readme
2023-12-26 01:10:25 +00:00
sigoden
0bda6af4dc
chore: release v0.12.0
2023-12-26 00:28:19 +00:00
sigoden
b4ab5238ae
chore: update deps and readme
2023-12-25 00:26:42 +00:00
sigoden
0e5c8567d4
refactor: ollama api_base configuration ( #285 )
2023-12-25 08:05:06 +08:00
sigoden
1c9ca1b002
feat: custom REPL prompt ( #283 )
2023-12-24 16:04:18 +08:00
sigoden
89fefb4d1a
refactor: remove path existence indicator from info ( #282 )
2023-12-23 06:35:23 +08:00
sigoden
5b001af50d
chore: update readme
2023-12-21 00:03:37 +00:00
sigoden
1bb28665f7
chore: update readme and deps
2023-12-20 23:21:17 +00:00
sigoden
6280f5ab4b
feat: qianwen vision models support embedded images ( #277 )
2023-12-20 11:29:52 +08:00
sigoden
64c4edf7c8
feat: support ollama ( #276 )
2023-12-20 07:49:05 +08:00
sigoden
6c9d7a679e
feat: support qianwen:qwen-vl-plus ( #275 )
2023-12-19 23:10:35 +08:00
sigoden
34d58b2369
refactor: adjust order of ernie models
2023-12-19 12:55:03 +00:00
sigoden
6fb13359f4
feat: abandon PaLM2 ( #274 )
2023-12-19 20:47:15 +08:00
sigoden
6286251d32
feat: support gemini ( #273 )
2023-12-19 20:37:53 +08:00
sigoden
3adfeb1ae7
chore: change order of openai models
2023-12-16 03:15:43 +00:00
sigoden
65ab8c1d6f
chore: update issue template
2023-12-13 22:52:40 +00:00
sigoden
27cae9582f
fix: cannot read image with uppercase ext ( #270 )
2023-12-14 06:15:01 +08:00
sigoden
10e3b6d234
chore: update the change-session error message ( #267 )
2023-12-13 08:11:10 +08:00
sigoden
e4d301f3d7
fix: pipe failed on macos ( #264 )
2023-12-08 07:42:47 +08:00
sigoden
e7272398dd
feat: change REPL indicators ( #263 )
2023-12-08 07:03:42 +08:00
sigoden
075631d794
chore: release v0.11.0
2023-11-29 11:00:00 +08:00
sigoden
30e2fd62dc
chore: update deps
2023-11-29 10:38:16 +08:00
sigoden
eb8a150539
chore: improve readme
2023-11-28 11:59:02 +08:00
sigoden
ee5fbe629c
feat: allow shift-tab to select prev in completion menu ( #254 )
2023-11-27 19:08:33 +08:00
sigoden
0afef991f9
chore: update cli description and readme
2023-11-27 18:02:52 +08:00
sigoden
c58912ba64
refactor: sort some completion types ( #253 )
2023-11-27 17:22:16 +08:00
sigoden
18f16c6511
feat: add ernie:ernie-bot-8k qianwen:qwen-max ( #252 )
2023-11-27 16:06:35 +08:00
sigoden
2508d56598
feat: state-aware completer ( #251 )
2023-11-27 15:39:55 +08:00
sigoden
25e545474f
chore: upgrade deps ( #250 )
2023-11-27 14:20:02 +08:00
sigoden
35c75506e2
feat: support vision ( #249 )
...
* feat: support vision
* clippy
* implement vision
* resolve data url to local file
* add model openai:gpt-4-vision-preview (a payload sketch follows this entry)
* use newline to concatenate embedded text files
* set max_tokens for gpt-4-vision-preview
2023-11-27 14:04:50 +08:00
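For context on the vision payload, a minimal sketch of a message that embeds an image inline; the content-array shape follows the public OpenAI chat-completions vision format, and the base64 payload is a placeholder, not real data:

```rust
use serde_json::json;

fn main() {
    // Placeholder base64 payload; a real client would read and encode the local file.
    let data_url = "data:image/png;base64,iVBORw0KGgo...";

    // OpenAI-style vision message: text and image parts share one content array.
    let message = json!({
        "role": "user",
        "content": [
            { "type": "text", "text": "Describe this image." },
            { "type": "image_url", "image_url": { "url": data_url } }
        ]
    });

    println!("{}", serde_json::to_string_pretty(&message).unwrap());
}
```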
sigoden
5bfe95d311
refactor: trim trailing spaces from the role prompt ( #246 )
2023-11-14 11:58:56 +08:00
sigoden
8f1d8dec5f
refactor: palm client system message ( #245 )
2023-11-14 10:34:42 +08:00
sigoden
e118e864d5
refactor: ernie client system message ( #244 )
2023-11-14 10:03:01 +08:00
sigoden
9b0fbe3506
fix: the last reply's tokens were not highlighted ( #243 )
2023-11-14 08:53:21 +08:00
sigoden
542dcfe6f6
feat: extend .read to support files and messages ( #242 )
2023-11-13 19:53:24 +08:00
sigoden
757c192829
refactor: qianwen client use incremental_output ( #240 )
2023-11-09 08:55:52 +08:00
sigoden
01cf8acb81
refactor: improve code quality ( #238 )
2023-11-08 22:20:46 +08:00
sigoden
b40659613d
feat: add a spinner to indicate waiting for response ( #236 )
2023-11-08 20:36:30 +08:00
sigoden
eb30d90391
refactor: improve render ( #235 )
...
* refactor: redesign render
- if stdout is not a terminal, just write the reply text to stdout (see the dispatch sketch after this entry)
- rename repl_render_stream to markdown_stream
- deprecate cmd_render_stream
- use raw_stream to just print streaming reply text
* optimize rendering error
* optimize render_stream
2023-11-08 18:26:38 +08:00
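A minimal sketch of the dispatch described above, using std's terminal detection; `markdown_stream` and `raw_stream` here are stand-in signatures, not the project's actual functions:

```rust
use std::io::{stdout, IsTerminal, Write};

/// Stand-in for the markdown renderer used when stdout is a terminal.
fn markdown_stream(text: &str) {
    // A real implementation would highlight markdown while streaming.
    println!("{text}");
}

/// Stand-in for the plain writer used when stdout is piped or redirected.
fn raw_stream(text: &str) {
    let mut out = stdout();
    let _ = out.write_all(text.as_bytes());
    let _ = out.flush();
}

fn main() {
    let reply = "**hello** from the model";
    if stdout().is_terminal() {
        markdown_stream(reply); // interactive: render as markdown
    } else {
        raw_stream(reply); // piped: just write the reply text
    }
}
```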