# DocsGPT 🦖

**Open-Source Documentation Assistant**
DocsGPT is a cutting-edge open-source solution that streamlines finding information in project documentation. By integrating powerful GPT models, it lets developers ask questions about a project and receive accurate answers.

Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
### 📣🎅❄️ Special Seasonal 🎄🌟 Reward 🎁☃️ for Contributions

Starting from 25th December and for the next 3 weeks, contribute a meaningful PR and earn a special holopin! This is an amazing opportunity not only to hone your skills but also to make memorable contributions.
### Production Support / Help for Companies

We're eager to provide personalized assistance when deploying your DocsGPT to a live environment.
## Roadmap
You can find our roadmap here. Please don't hesitate to contribute or create issues, it helps us improve DocsGPT!
### Our Open-Source Models Optimized for DocsGPT

| Name | Base Model | Requirements (or similar) |
|---|---|---|
| Docsgpt-7b-falcon | Falcon-7b | 1x A10G GPU |
| Docsgpt-14b | llama-2-14b | 2x A10 GPUs |
| Docsgpt-40b-falcon | falcon-40b | 8x A10G GPUs |
If you don't have enough resources to run them, you can use bitsandbytes to quantize.
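For intuition, here is a toy sketch of what weight quantization does numerically. This is not the bitsandbytes API, just an illustration of the idea it automates: mapping float weights to int8 and back, trading a little precision for much less memory.

```python
# Toy illustration of symmetric int8 quantization (the idea
# behind libraries like bitsandbytes; not their actual API).

def quantize_int8(weights):
    """Scale float weights into the int8 range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize_int8(quantized, scale):
    """Map int8 values back to approximate float weights."""
    return [q * scale for q in quantized]

weights = [0.12, -0.5, 0.33, 1.0]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)
# Each restored weight is close to the original, but the
# stored values fit in one byte each instead of four.
```

Real quantization libraries apply this per-tensor or per-block and handle outliers more carefully; this sketch only shows the core round-trip.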
## Features

## Useful Links

- 🔍 🔥 Live preview
- 💬 🎉 Join our Discord
- 📚 😎 Guides
- 🏠 🔐 How to host it locally (so all data will stay on-premises)
## Project Structure

- Application - Flask app (main application).
- Extensions - Chrome extension.
- Scripts - Script that creates similarity search index for other libraries.
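The similarity-search idea behind those indexing scripts can be sketched in a few lines. The documents and vectors below are made up for illustration; in the real pipeline the vectors come from an embedding model.

```python
# Minimal sketch of similarity search over a tiny in-memory
# "index" (illustrative only; real indexes use embeddings).
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical document-name -> vector mapping.
index = {
    "install docs": [0.9, 0.1, 0.0],
    "api reference": [0.1, 0.8, 0.3],
}

def search(query_vec, index):
    """Return the document whose vector is most similar to the query."""
    return max(index, key=lambda name: cosine(query_vec, index[name]))
```

A query vector close to a document's vector retrieves that document; the real scripts do the same at scale with vectors produced by the embedding model described in the backend setup.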
## QuickStart

> **Note:** Make sure you have Docker installed.

On Mac OS or Linux, run:

```commandline
./setup.sh
```

It will install all the dependencies and allow you to download the local model, use OpenAI, or use our LLM API.

Otherwise, refer to this Guide:
1. Download and open this repository with `git clone https://github.com/arc53/DocsGPT.git`.

2. Create a `.env` file in your root directory and set the env variables, including `VITE_API_STREAMING` (set it to `true` or `false`, depending on whether you want streaming answers). It should look like this inside:

   ```commandline
   LLM_NAME=[docsgpt or openai or others]
   VITE_API_STREAMING=true
   API_KEY=[if LLM_NAME is openai]
   ```

   See optional environment variables in the /.env-template and /application/.env_sample files.

3. Run `./run-with-docker-compose.sh`.

4. Navigate to http://localhost:5173/.

To stop, just run `Ctrl + C`.
## Development Environments

### Spin up Mongo and Redis

For development, only two containers are used from docker-compose.yaml (by deleting all services except for Redis and Mongo). See file docker-compose-dev.yaml.

Run:

```commandline
docker compose -f docker-compose-dev.yaml build
docker compose -f docker-compose-dev.yaml up -d
```
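For reference, a compose file stripped down to those two services would look something like the sketch below; the image tags are assumptions for illustration, not the project's pinned versions.

```yaml
# Hypothetical minimal sketch: only the two datastores the backend needs.
services:
  redis:
    image: redis:6-alpine
    ports:
      - 6379:6379
  mongo:
    image: mongo:6
    ports:
      - 27017:27017
```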
### Run the Backend

> **Note:** Make sure you have Python 3.10 or 3.11 installed.

1. Export required environment variables or prepare a `.env` file in the `/application` folder: copy `.env_sample` and create `.env`. (Check out `application/core/settings.py` if you want to see more config options.)

2. (optional) Create a Python virtual environment: you can follow the Python official documentation for virtual environments.

   a) On Mac OS and Linux

   ```commandline
   python -m venv venv
   . venv/bin/activate
   ```

   b) On Windows

   ```commandline
   python -m venv venv
   venv/Scripts/activate
   ```

3. Download the embedding model and save it in the `model/` folder: you can use the script below, or download it manually from here, unzip it, and save it in the `model/` folder.

   ```commandline
   wget https://d3dg1063dc54p9.cloudfront.net/models/embeddings/mpnet-base-v2.zip
   unzip mpnet-base-v2.zip -d model
   rm mpnet-base-v2.zip
   ```

4. Change to the `application/` subdirectory with `cd application/` and install dependencies for the backend:

   ```commandline
   pip install -r requirements.txt
   ```

5. Run the app using `flask --app application/app.py run --host=0.0.0.0 --port=7091`.

6. Start the worker with `celery -A application.app.celery worker -l INFO`.
### Start Frontend

> **Note:** Make sure you have Node version 16 or higher.

1. Navigate to the `/frontend` folder.

2. Install the required packages `husky` and `vite` (ignore if already installed):

   ```commandline
   npm install husky -g
   npm install vite -g
   ```

3. Install dependencies by running `npm install --include=dev`.

4. Run the app using `npm run dev`.
## Contributing
Please refer to the CONTRIBUTING.md file for information about how to get involved. We welcome issues, questions, and pull requests.
## Code Of Conduct
We as members, contributors, and leaders, pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation. Please refer to the CODE_OF_CONDUCT.md file for more information about contributing.
## Many Thanks To Our Contributors ⚡
## License
The source code license is MIT, as described in the LICENSE file.
Built with 🐦 🔗 LangChain