AIChat: All-in-one AI CLI Tool

AIChat is an all-in-one AI CLI tool featuring Chat-REPL, Shell Assistant, RAG, Tool Use, AI Agent, and More.

Install

Package Managers

  • Rust Developers: cargo install aichat
  • Homebrew/Linuxbrew Users: brew install aichat
  • Pacman Users: yay -S aichat
  • Windows Scoop Users: scoop install aichat
  • Android Termux Users: pkg install aichat

Pre-built Binaries

Download pre-built binaries for macOS, Linux, and Windows from GitHub Releases, extract them, and add the aichat binary to your $PATH.

Get Started

Upon its first launch after installation, AIChat will guide you through the initialization of the configuration file.

aichat-init-config

You can tailor AIChat to your preferences by editing the configuration file.

The config.example.yaml file provides a comprehensive list of all configuration options with detailed explanations.
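
For reference, a minimal configuration might look like the sketch below. The key names follow config.example.yaml; the path, model, and API key are placeholders to replace with your own values.

# ~/.config/aichat/config.yaml (typical Linux location; paths differ on macOS/Windows)
model: openai:gpt-4o-mini          # default model; illustrative choice
clients:
  - type: openai
    api_key: sk-xxxx               # placeholder API key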

Features

20+ Platforms

AIChat offers users a wide and diverse selection of Large Language Models (LLMs):

  • OpenAI: GPT-4/GPT-3.5 (paid, vision, embedding, function-calling)
  • Gemini: Gemini-1.5/Gemini-1.0 (free, paid, vision, embedding, function-calling)
  • Claude: Claude-3.5/Claude-3 (paid, vision, function-calling)
  • Ollama: (free, local, embedding)
  • Groq: Llama-3/Mixtral/Gemma (free, function-calling)
  • Azure-OpenAI: GPT-4/GPT-3.5 (paid, vision, embedding, function-calling)
  • VertexAI: Gemini-1.5/Gemini-1.0 (paid, vision, embedding, function-calling)
  • VertexAI-Claude: Claude-3.5/Claude-3 (paid, vision)
  • Bedrock: Llama-3/Claude-3.5/Claude-3/Mistral (paid, vision)
  • Mistral (paid, embedding, function-calling)
  • Cohere: Command-R/Command-R+ (paid, embedding, reranker, function-calling)
  • Perplexity: Llama-3/Mixtral (paid)
  • Cloudflare: (free, vision, embedding)
  • OpenRouter: (paid, vision, function-calling)
  • Replicate: (paid)
  • Ernie: (paid, embedding, reranker, function-calling)
  • Qianwen: Qwen (paid, vision, embedding, function-calling)
  • Moonshot: (paid, function-calling)
  • Deepseek: (paid)
  • ZhipuAI: GLM-4 (paid, vision, function-calling)
  • LingYiWanWu: Yi-* (paid, vision)
  • OpenAI-Compatible Platforms
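
To switch between platforms, pass the -m/--model option in platform:model form. The model names below are illustrative; run aichat --list-models to see what your configuration actually provides.

aichat -m openai:gpt-4o "hello"
aichat -m claude:claude-3-5-sonnet-20240620 "hello"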

Shell Assistant

Simply describe what you want to do in natural language; aichat will suggest the shell command that achieves your intent and, after your confirmation, run it.

aichat-execute

AIChat detects the OS and shell you are using and tailors the suggested command to your specific system.
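
For example (a sketch using the -e/--execute flag; the request itself is arbitrary):

aichat -e "list the 10 largest files under the current directory"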

Role

Customizable roles allow users to tailor the behavior of LLMs, enhancing productivity and ensuring the tool aligns with specific needs and workflows.

aichat-role
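
A quick sketch using the --role option, assuming a role named code-reviewer has been defined in your roles configuration (the role name is hypothetical):

aichat --role code-reviewer "fn add(a: i32, b: i32) -> i32 { a + b }"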

Session

By default, AIChat behaves in a one-off request/response manner. With sessions, AIChat conducts context-aware conversations.

aichat-session
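
A sketch of a multi-turn exchange using the -s/--session option (the session name is arbitrary); the second request is answered with the context of the first:

aichat -s travel "I am planning a 3-day trip to Tokyo"
aichat -s travel "What should I pack for it?"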

Retrieval-Augmented Generation (RAG)

Seamlessly integrates document interactions into your chat experience.

aichat-rag
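
A hedged sketch of querying your documents from the REPL; the RAG name is a placeholder, and the exact REPL commands may vary between versions:

aichat
> .rag mydocs
> How do I enable retries according to these docs?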

Function Calling

Function calling supercharges LLMs by connecting them to external tools and data sources. This unlocks a world of possibilities, enabling LLMs to go beyond their core capabilities and tackle a wider range of tasks.

We have created a new repository https://github.com/sigoden/llm-functions to help you make the most of this feature.

Tool Use

Here's a glimpse of how to use the tools.

aichat-tool-use
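
A hedged example, assuming a weather tool from llm-functions has been installed and function calling is enabled; the LLM can then call the tool and answer with its output:

aichat "What is the weather in Paris right now?"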

AI Agent

Agent = Prompt (Role) + Tools (Function Calling) + Knowledge (RAG). This is similar to OpenAI's GPTs.

Here's a glimpse of how to use the agents.

aichat-agent
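
A sketch of invoking an agent from the command line, assuming an agent named todo has been built with llm-functions and that your aichat build provides the --agent option (both the agent name and the task are placeholders):

aichat --agent todo "add 'buy milk' to my list"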

Local Server

AIChat comes with a built-in, lightweight HTTP server.

$ aichat --serve
Chat Completions API: http://127.0.0.1:8000/v1/chat/completions
Embeddings API:       http://127.0.0.1:8000/v1/embeddings
LLM Playground:       http://127.0.0.1:8000/playground
LLM Arena:            http://127.0.0.1:8000/arena?num=2

Proxy LLM APIs

AIChat offers the ability to function as a proxy server for all LLMs. This allows you to interact with different LLMs using the familiar OpenAI API format, simplifying the process of accessing and utilizing these LLMs.

Test with curl:

curl -X POST -H "Content-Type: application/json" -d '{
  "model":"claude:claude-3-opus-20240229",
  "messages":[{"role":"user","content":"hello"}], 
  "stream":true
}' http://127.0.0.1:8000/v1/chat/completions
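
The embeddings endpoint follows the same OpenAI-compatible shape. A hedged example; the model name is illustrative and must resolve to an embedding-capable platform in your config:

curl -X POST -H "Content-Type: application/json" -d '{
  "model":"openai:text-embedding-3-small",
  "input":"hello"
}' http://127.0.0.1:8000/v1/embeddings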

LLM Playground

The LLM Playground is a webapp that allows you to interact with any LLM supported by AIChat directly in your browser.

aichat-llm-playground

LLM Arena

The LLM Arena is a web-based platform where you can compare different LLMs side-by-side.

aichat-llm-arena

Custom Themes

AIChat supports custom dark and light themes, which highlight response text and code blocks.

aichat-themes

Wiki

License

Copyright (c) 2023-2024 aichat-developers.

AIChat is made available under the terms of either the MIT License or the Apache License 2.0, at your option.

See the LICENSE-APACHE and LICENSE-MIT files for license details.