## talk-codebase: Tool for chatting with your codebase and docs using OpenAI, LlamaCpp, and GPT-4-All [![Node.js Package](https://github.com/rsaryev/talk-codebase/actions/workflows/python-publish.yml/badge.svg)](https://github.com/rsaryev/talk-codebase/actions/workflows/python-publish.yml)
## Description

Talk-codebase is a tool that lets you chat with your codebase, using LLMs to answer your questions. It supports offline code processing via [GPT4All](https://github.com/nomic-ai/gpt4all) without sharing your code with third parties, or you can use OpenAI if privacy is not a concern for you. It is recommended for educational purposes only, not for production use.

## Installation

To use `talk-codebase`, you need Python 3.9 and an [OpenAI API key](https://platform.openai.com/account/api-keys). Additionally, if you want to use the GPT4All model, download [ggml-gpt4all-j-v1.3-groovy.bin](https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin). If you prefer a different model, you can download it from [GPT4All](https://gpt4all.io) and specify its path in the configuration. If you want some files to be ignored, add them to `.gitignore`.

To install `talk-codebase`, run the following command in your terminal:

```bash
pip install talk-codebase
```

Once `talk-codebase` is installed, you can chat with your codebase by running:

```bash
talk-codebase chat
```