sigoden
7762cd6bed
refactor: model pass_max_tokens ( #493 )
2024-05-08 13:46:26 +08:00
sigoden
956a960390
feat: support zhipuai client ( #491 )
2024-05-07 16:40:18 +08:00
sigoden
0071d84aa5
feat: support deepseek client ( #490 )
2024-05-07 16:16:18 +08:00
sigoden
9b283024b4
feat: extract vertexai-claude client ( #485 )
2024-05-06 08:19:42 +08:00
sigoden
82a9ef024d
refactor: aware of lowercase https_proxy/all_proxy ( #484 )
2024-05-05 07:04:53 +08:00
sigoden
5eae392dbd
refactor: add models for openai-compatible platforms ( #471 )
2024-05-01 06:01:10 +08:00
sigoden
83ca74bf8a
chore: minor refinement
2024-04-30 05:16:45 +00:00
sigoden
8dba46becf
feat: openai-compatible platforms share the same client ( #469 )
2024-04-30 12:52:58 +08:00
sigoden
50eac8b594
feat: support replicate client ( #466 )
2024-04-30 07:07:09 +08:00
sigoden
ffb0af8236
refactor: add some openai-compatible platforms to config.example.yaml ( #464 )
2024-04-29 20:08:59 +08:00
sigoden
4d1c53384b
refactor: prompts for generating config file ( #463 )
2024-04-29 19:28:54 +08:00
sigoden
3a00fb283e
refactor: user config models replace client builtin models
2024-04-29 05:24:10 +00:00
sigoden
34041a976c
feat: support cloudflare client ( #459 )
2024-04-29 09:27:11 +08:00
sigoden
72d1e5958d
chore: add openai-compatible platforms to config.example.yaml
2024-04-28 07:34:31 +00:00
sigoden
212ff1674e
refactor: minor improvements
- config.example.yaml comments
- `--serve` description
- No model error
2024-04-28 06:43:17 +00:00
sigoden
7bda1eace2
refactor: rename ollama config field api_key => api_auth ( #453 )
2024-04-28 13:54:51 +08:00
sigoden
338b0438dc
feat: run without config file by setting AICHAT_CLIENT_TYPE ( #452 )
2024-04-28 13:28:24 +08:00
sigoden
1f2b626703
feat: support bedrock client ( #450 )
2024-04-28 11:27:06 +08:00
sigoden
615bab215b
feat: support vertexai claude ( #439 )
2024-04-28 10:55:41 +08:00
sigoden
a21e1213cc
feat: support perplexity ( #444 )
2024-04-26 06:57:23 +08:00
sigoden
503ec98bd1
feat: support moonshot again ( #442 )
2024-04-25 21:19:04 +08:00
sigoden
29a4ffd514
feat: support groq ( #441 )
2024-04-25 20:59:56 +08:00
sigoden
b62e8e1625
refactor: update ollama chat_endpoint in example config file
2024-04-24 13:00:05 +00:00
sigoden
a17f349daa
feat: support customizing top_p parameter ( #434 )
2024-04-24 16:12:38 +08:00
sigoden
d1aafa1115
feat: customize model's max_output_tokens ( #428 )
2024-04-23 16:46:48 +08:00
sigoden
8ab57ed74e
refactor: update description for cli options and config fields ( #423 )
2024-04-20 21:07:30 +08:00
sigoden
8b806db857
feat: ctrlc won't exit repl and remove ctrlc_exit config ( #419 )
2024-04-20 07:53:31 +08:00
sigoden
5d7c786b7f
feat: remove moonshot client ( #418 )
2024-04-19 10:55:08 +08:00
sigoden
b9bde15c1f
feat: add config buffer_editor ( #417 )
2024-04-18 18:22:15 +08:00
sigoden
4a88a3da04
refactor: ctrlc_exit config defaults to false ( #403 )
2024-04-11 08:14:08 +08:00
sigoden
5915bc2f3a
feat: support cohere ( #397 )
2024-04-10 18:52:21 +08:00
sigoden
2738988fc4
feat: add ctrlc_exit option to control REPL exit by ctrl+c ( #391 )
2024-04-09 07:17:02 +08:00
sigoden
bd9a6a8725
feat: save_session config item can be null ( #380 )
2024-03-27 10:02:09 +08:00
sigoden
8da9fa5f4c
feat: add separate save_session config item to session ( #377 )
2024-03-27 07:33:21 +08:00
Patrick Jackson
582f56e915
feat: add save_session config item and --save-session cli option ( #370 )
* fix: sessions should save when exiting
* feat: improve save sessions
* feat: do not allow saving the temp session name
* feat: allow creating a session without interactive use
* feat: add `save_session` config and `--save-session` option
---------
Co-authored-by: sigoden <sigoden@gmail.com>
2024-03-27 07:00:28 +08:00
sigoden
7f05dc1a4a
feat: support customizing gemini safetySettings ( #375 )
2024-03-25 21:06:35 +08:00
sigoden
0ebc7955da
refactor: improve creating config for openai-compatible client ( #374 )
2024-03-25 11:13:54 +08:00
sigoden
eec041c111
feat: rename client localai to openai-compatible ( #373 )
BREAKING CHANGE: rename client localai to openai-compatible
2024-03-25 10:52:05 +08:00
sigoden
774d991144
feat: support moonshot ( #369 )
2024-03-25 08:23:54 +08:00
sigoden
527da63d18
refactor: do not automatically convert temperature value ( #368 )
2024-03-22 19:24:22 +08:00
sigoden
d5b10ea0c1
chore: fix typos
2024-03-07 12:31:06 +00:00
sigoden
8e5d4e55b1
refactor: rename model's max_tokens to max_input_tokens ( #339 )
BREAKING CHANGE: rename model's `max_tokens` to `max_input_tokens`
2024-03-06 08:35:40 +08:00
sigoden
3f693ea060
feat: compress session automatically ( #333 )
* feat: compress session automatically
* non-block
* update field description
* set compress_threshold
* update session::clear_messages
* able to override session compress_threshold
* enable compress_threshold by default
* make session compress_threshold optional
2024-03-04 11:08:59 +08:00
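The session-compression options listed above might look like this in config (a sketch only; `compress_threshold` appears in the commit messages, but the value shown and the exact placement in config.yaml are assumptions):

```yaml
# config.yaml fragment (hypothetical values)
# once a session's token count exceeds the threshold, it is compressed
compress_threshold: 4000
```

Per the commit notes, the threshold is enabled by default and can also be overridden per session.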
sigoden
75fe0b9205
feat: support mistral ( #324 )
2024-02-28 08:22:15 +08:00
sigoden
c538533014
feat: support claude ( #278 )
2024-02-27 08:38:19 +08:00
sigoden
179a5f5749
refactor: update vertexai/gemini/ernie clients ( #309 )
2024-02-16 18:32:33 +08:00
sigoden
5e4210980d
feat: support vertexai ( #308 )
2024-02-15 08:10:30 +08:00
Kelvie Wong
176ff6f83e
feat: add extra_fields to models of localai/ollama clients ( #298 )
* Add an "extra_fields" config to localai models
Because there are so many local AIs out there with custom parameters,
this allows users to send extra parameters to a local LLM runner, such
as `instruction_template: Alpaca`, so that Mixtral can take a system
prompt.
* support ollama
---------
Co-authored-by: sigoden <sigoden@gmail.com>
2024-01-30 19:43:55 +08:00
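The `extra_fields` pass-through described in the commit above can be sketched as a config fragment (hypothetical; the `clients`/`models` key layout is an assumption, while `extra_fields` and `instruction_template: Alpaca` come from the commit body):

```yaml
# config.yaml fragment (hypothetical layout)
clients:
  - type: ollama
    models:
      - name: mixtral
        extra_fields:
          # forwarded verbatim to the local LLM runner
          instruction_template: Alpaca
```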
sigoden
a30c3cc4c1
feat: add openai.api_base config ( #302 )
2024-01-30 18:28:12 +08:00
sigoden
fe35cfd941
feat: supports model capabilities ( #297 )
1. automatically switch to the model that has the necessary capabilities.
2. throw an error if the client does not have a model with the necessary capabilities.
2024-01-13 19:52:07 +08:00