# DocsGPT 🦖

### Open-Source Documentation Assistant
DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in project documentation. With its integration of the powerful GPT models, developers can easily ask questions about a project and receive accurate answers.
Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
## Enterprise Solutions

When deploying DocsGPT to a live environment, we're eager to provide personalized assistance. Reach out to us via email to discuss your project, and our team will connect with you shortly.
🎉 Join the Hacktoberfest with DocsGPT and Earn a Free T-shirt! 🎉
## Roadmap

You can find our roadmap here. Please don't hesitate to contribute or create issues; it helps us make DocsGPT better!
## Our open-source models optimised for DocsGPT

| Name | Base Model | Requirements (or similar) |
|------|------------|---------------------------|
| Docsgpt-7b-falcon | Falcon-7b | 1x A10G GPU |
| Docsgpt-14b | llama-2-14b | 2x A10 GPUs |
| Docsgpt-40b-falcon | falcon-40b | 8x A10G GPUs |
If you don't have enough resources to run them, you can use bitsandbytes to quantize the models.
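As a sketch of what quantized loading can look like with the Hugging Face `transformers` integration of bitsandbytes (the model id and settings below are illustrative assumptions, not a tested configuration from this project):

```python
# Illustrative sketch only: load a model with 4-bit bitsandbytes quantization.
# Assumes `transformers`, `bitsandbytes`, and `torch` are installed and a GPU
# with enough memory for the quantized weights is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # store weights in 4-bit precision
    bnb_4bit_compute_dtype=torch.float16,  # run matmuls in fp16
)

model_id = "Arc53/docsgpt-7b-falcon"  # hypothetical hub id; check the model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # let accelerate place layers on available devices
)
```

Quantizing to 4 bits roughly quarters the weight memory compared with fp16, which is what makes the 7B model feasible on a single consumer-class GPU.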
## Features

## Useful links

- How to use any other documentation
- How to host it locally (so all data will stay on-premises)
Project structure
-
Application - Flask app (main application)
-
Extensions - Chrome extension
-
Scripts - Script that creates similarity search index and store for other libraries.
-
Frontend - Frontend uses Vite and React
## QuickStart

> Note: Make sure you have Docker installed.

1. Download and open this repository with

   ```
   git clone https://github.com/arc53/DocsGPT.git
   ```
2. Create a `.env` file in your root directory, set the env variable `OPENAI_API_KEY` to your OpenAI API key, and set `VITE_API_STREAMING` to `true` or `false` depending on whether you want streaming answers. It should look like this inside:

   ```
   OPENAI_API_KEY=Yourkey
   VITE_API_STREAMING=true
   SELF_HOSTED_MODEL=false
   ```

   See optional environment variables in the `/.env-template` and `/application/.env_sample` files.
3. Run

   ```
   ./run-with-docker-compose.sh
   ```

4. Navigate to http://localhost:5173/

To stop, just press Ctrl + C.
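As an aside, the `.env` file above is just plain `KEY=value` lines. A minimal stdlib sketch of how such a file parses (illustrative only; Docker Compose and the application use their own loaders, and `parse_env_file` is a hypothetical helper name):

```python
import os
import tempfile

def parse_env_file(path):
    """Parse simple KEY=value lines, skipping blanks and # comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Write the QuickStart example to a temp file and parse it back.
path = os.path.join(tempfile.mkdtemp(), ".env")
with open(path, "w") as f:
    f.write("OPENAI_API_KEY=Yourkey\nVITE_API_STREAMING=true\nSELF_HOSTED_MODEL=false\n")

config = parse_env_file(path)
```

Real loaders also handle quoting and variable interpolation; this sketch only covers the simple form shown in the QuickStart.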
## Development environments

### Spin up Mongo and Redis

For development, only two containers from `docker-compose.yaml` are used (all services are removed except Redis and Mongo). See the file `docker-compose-dev.yaml`.

Run

```
docker compose -f docker-compose-dev.yaml build
docker compose -f docker-compose-dev.yaml up -d
```
### Run the backend

Make sure you have Python 3.10 or 3.11 installed.

1. Export required environment variables:

   ```
   export CELERY_BROKER_URL=redis://localhost:6379/0
   export CELERY_RESULT_BACKEND=redis://localhost:6379/1
   export MONGO_URI=mongodb://localhost:27017/docsgpt
   export FLASK_APP=application/app.py
   export FLASK_DEBUG=true
   ```
2. Prepare a `.env` file: copy `.env_sample` to `.env` and add your OpenAI API token.

3. (Optional) Create a Python virtual environment:

   ```
   python -m venv venv
   . venv/bin/activate
   ```

4. Install dependencies for the backend:

   ```
   pip install -r application/requirements.txt
   ```

5. Run the app:

   ```
   flask run --host=0.0.0.0 --port=7091
   ```

6. Start the worker with:

   ```
   celery -A application.app.celery worker -l INFO
   ```
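Note that `CELERY_BROKER_URL` and `CELERY_RESULT_BACKEND` above point at the same Redis server but different logical databases (`/0` and `/1`), which keeps Celery's task messages separate from its stored results. A quick stdlib sketch of how such a URL decomposes:

```python
from urllib.parse import urlparse

# The two Redis URLs exported for the backend above.
broker = urlparse("redis://localhost:6379/0")
backend = urlparse("redis://localhost:6379/1")

# Same host and port; only the Redis database index in the path differs.
print(broker.hostname, broker.port, broker.path)  # localhost 6379 /0
print(backend.path)                               # /1
```

If you point `MONGO_URI` or the Redis URLs at non-local services, keep the same scheme and path structure.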
### Start the frontend

Make sure you have Node version 16 or higher.

1. Navigate to the `/frontend` folder
2. Install dependencies:

   ```
   npm install
   ```

3. Run the app:

   ```
   npm run dev
   ```
Built with 🦜️🔗 LangChain