talk-codebase: A Tool for Chatting with Your Codebase
Description
Talk-codebase is a tool that lets you converse with your codebase: it uses large language models (LLMs) to answer questions about your code.
You can use GPT4All to process your code entirely offline, without sharing it with third parties, or use OpenAI if sending code to an external API is not a concern for you. Switching between the two backends is quick and easy.
Installation
pip install talk-codebase
Usage
Talk-codebase works only with files of popular programming languages and .txt files. All other files will be ignored.
# Start chatting with your codebase
talk-codebase chat <directory>
# Create or edit the configuration (~/.config.yaml)
talk-codebase configure
# Help
talk-codebase --help
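The settings live in ~/.config.yaml, which `talk-codebase configure` creates for you. The snippet below is only a sketch of the kind of entries involved; the key names shown (model_type, api_key, model_path) are illustrative assumptions, not the tool's documented schema.
# Hypothetical ~/.config.yaml (key names are assumptions for illustration)
model_type: openai                                    # or gpt4all for fully offline use
api_key: sk-...                                       # only needed for the openai backend
model_path: /path/to/ggml-gpt4all-j-v1.3-groovy.bin   # only needed for the gpt4all backend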
Requirements
- Python 3.9
- OpenAI API key (only needed when using the OpenAI backend)
- If you want to use GPT4All, download the model ggml-gpt4all-j-v1.3-groovy.bin and specify its path in the configuration (see the setup sketch after this list).
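As a rough, illustrative sketch of the GPT4All setup (the models directory below is just an example, and the exact configuration prompts are assumed rather than quoted from the tool):
# Keep the downloaded model somewhere stable, e.g. a local models directory
mkdir -p ~/models
mv ~/Downloads/ggml-gpt4all-j-v1.3-groovy.bin ~/models/
# Re-run configuration and enter ~/models/ggml-gpt4all-j-v1.3-groovy.bin when asked for the model path
talk-codebase configure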