Written by @xtekky & maintained by @hlohaus
By using this repository or any code related to it, you agree to the legal notice. The author is not responsible for the usage of this repository, nor does the author endorse it, nor is the author responsible for any copies, forks, re-uploads made by other users, or anything else related to GPT4Free. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license that this repository uses.
Warning
"gpt4free" serves as a PoC (proof of concept), demonstrating the development of an API package with multi-provider requests, with features like timeouts, load balance and flow control.
Note: install or update to the latest version:
pip install -U g4f
docker pull hlohaus789/g4f
🆕 What's New
- Join our Telegram Channel: t.me/g4f_channel
- Join our Discord Group: discord.gg/XfybzPXPH5
- g4f now supports 100% local inference: local-docs
🔻 Site Takedown
Is your site on this repository and do you want to take it down? Send an email to takedown@g4f.ai with proof that it is yours and it will be removed as quickly as possible. To prevent reproduction, please secure your API. ;)
🚀 Feedback and Todo
You can always leave some feedback here: https://forms.gle/FeWV9RLEedfdkmFN6
Based on the survey, here is a list of improvements to come:
- Update the repository to include the new openai library syntax (e.g. the Openai() class) | completed, use g4f.client.Client
- Golang implementation
- 🚧 Improve Documentation (in /docs & Guides, Howtos, & Do video tutorials)
- Improve the provider status list & updates
- Tutorials on how to reverse sites to write your own wrapper (PoC only ofc)
- Improve the Bing wrapper. (might write a new wrapper in golang as it is very fast)
- Write a standard provider performance test to improve the stability
- Potential support and development of local models
- 🚧 Improve compatibility and error handling
📚 Table of Contents
- 🆕 What's New
- 📚 Table of Contents
- 🛠️ Getting Started
- 💡 Usage
- 🚀 Providers and Models
- 🔗 Related GPT4Free Projects
- 🤝 Contribute
- 🙌 Contributors
- ©️ Copyright
- ⭐ Star History
- 📄 License
🛠️ Getting Started
Docker container
Quick start:
- Download and install Docker
- Pull the latest image and run the container:
docker pull hlohaus789/g4f
docker run -p 8080:8080 -p 1337:1337 -p 7900:7900 --shm-size="2g" hlohaus789/g4f:latest
- Open the included client at: http://localhost:8080/chat/ or set the API base in your client to: http://localhost:1337/v1 (see the example after these steps)
- (Optional) If you need to log in to a provider, you can view the desktop from the container here: http://localhost:7900/?autoconnect=1&resize=scale&password=secret.
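For example, once the container is running, any OpenAI-compatible client can talk to the local endpoint. A minimal sketch using the openai Python package (version 1.x assumed; the dummy API key is only there because the client requires one):
from openai import OpenAI

# Point the client at the local g4f container instead of api.openai.com
client = OpenAI(api_key="not-needed", base_url="http://localhost:1337/v1")
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)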
Use your smartphone: you can run the Web UI on your smartphone as well.
Use Python
Prerequisites:
- Download and install Python (Version 3.10+ is recommended).
- Install Google Chrome for providers that use a webdriver
Install using PyPI package:
pip install -U g4f[all]
How do I install only parts or disable certain parts? Use partial requirements: /docs/requirements
Install from source:
How do I load the project using git and install the project requirements? Read this tutorial and follow it step by step: /docs/git
Install using Docker:
How do I build and run the compose image from source? Use docker-compose: /docs/docker
💡 Usage
Text Generation
from g4f.client import Client
client = Client()
response = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[{"role": "user", "content": "Hello"}],
# ... (add any other parameters here)
)
print(response.choices[0].message.content)
Hello! How can I assist you today?
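Streaming works through the same client interface; a minimal sketch, assuming stream=True yields OpenAI-style chunks as described in /docs/client:
from g4f.client import Client

client = Client()
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a short poem about the sea"}],
    stream=True,  # return partial chunks instead of one full response
)
for chunk in stream:
    # each chunk carries an incremental piece of the answer (assumed OpenAI-style delta)
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")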
Image Generation
from g4f.client import Client
client = Client()
response = client.images.generate(
model="gemini",
prompt="a white siamese cat",
# ... (add any other parameters here)
)
image_url = response.data[0].url
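The response only contains a URL, so saving the picture is up to you; a small sketch using requests (assumed to be installed), continuing from the snippet above:
import requests

# image_url comes from the image generation example above
image_bytes = requests.get(image_url, timeout=60).content
with open("siamese_cat.jpg", "wb") as file:
    file.write(image_bytes)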
Full Documentation for Python API
- New Client API like the OpenAI Python library: /docs/client
- Legacy API with python modules: /docs/legacy
Web UI
To start the web interface, run the following code in Python:
from g4f.gui import run_gui
run_gui()
or execute the following command:
python -m g4f.cli gui -port 8080 -debug
Interference API
You can use the Interference API to serve other OpenAI integrations with G4F.
See: /docs/interference
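Because the Interference API follows the OpenAI REST format, a plain HTTP request is enough to test it. A sketch with requests, assuming the API is already running locally on port 1337 (for example via the Docker container above) and exposes the standard /v1/chat/completions route:
import requests

response = requests.post(
    "http://localhost:1337/v1/chat/completions",
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])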
Configuration
Cookies / Access Token
For generating images with Bing and for the OpenAI Chat you need cookies or a token from your browser session: from Bing you need the "_U" cookie, and from OpenAI you need the "access_token". You can pass the cookies / the access token in the create function, or use the set_cookies setter before you run G4F:
from g4f.cookies import set_cookies
set_cookies(".bing.com", {
"_U": "cookie value"
})
set_cookies("chat.openai.com", {
"access_token": "token value"
})
set_cookies(".google.com", {
"__Secure-1PSID": "cookie value"
})
...
Alternatively, G4F reads the cookies from your browser with browser_cookie3, or it starts a browser instance with the selenium webdriver for logging in.
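Once the cookies are set, you can pin the provider that needs them. A minimal sketch, assuming the Client constructor accepts a provider argument as documented in /docs/client:
from g4f.client import Client
from g4f.Provider import OpenaiChat
from g4f.cookies import set_cookies

# the access token captured from your chat.openai.com browser session
set_cookies("chat.openai.com", {"access_token": "token value"})

client = Client(provider=OpenaiChat)  # provider selection assumed per /docs/client
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)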
Using Proxy
If you want to hide or change your IP address for the providers, you can set a proxy globally via an environment variable:
- On macOS and Linux:
export G4F_PROXY="http://host:port"
- On Windows:
set G4F_PROXY=http://host:port
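If you prefer to configure the proxy from Python rather than the shell, setting the same variable in the process environment before g4f makes any requests should have the same effect (a sketch under that assumption):
import os

# equivalent of `export G4F_PROXY=...`, set before g4f performs any requests
os.environ["G4F_PROXY"] = "http://host:port"

from g4f.client import Client

client = Client()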
🚀 Providers and Models
GPT-4
Website | Provider | GPT-3.5 | GPT-4 | Stream | Status | Auth |
---|---|---|---|---|---|---|
bing.com | g4f.Provider.Bing | ❌ | ✔️ | ✔️ | | ❌ |
chatgpt.ai | g4f.Provider.ChatgptAi | ❌ | ✔️ | ✔️ | | ❌ |
liaobots.site | g4f.Provider.Liaobots | ✔️ | ✔️ | ✔️ | | ❌ |
chat.openai.com | g4f.Provider.OpenaiChat | ✔️ | ✔️ | ✔️ | | ✔️ |
raycast.com | g4f.Provider.Raycast | ✔️ | ✔️ | ✔️ | | ✔️ |
beta.theb.ai | g4f.Provider.Theb | ✔️ | ✔️ | ✔️ | | ❌ |
you.com | g4f.Provider.You | ✔️ | ✔️ | ✔️ | | ❌ |
GPT-3.5
Website | Provider | GPT-3.5 | GPT-4 | Stream | Status | Auth |
---|---|---|---|---|---|---|
chat3.aiyunos.top | g4f.Provider.AItianhuSpace | ✔️ | ❌ | ✔️ | | ❌ |
chatforai.store | g4f.Provider.ChatForAi | ✔️ | ❌ | ✔️ | | ❌ |
chatgpt4online.org | g4f.Provider.Chatgpt4Online | ✔️ | ❌ | ✔️ | | ❌ |
chatgpt-free.cc | g4f.Provider.ChatgptNext | ✔️ | ❌ | ✔️ | | ❌ |
chatgptx.de | g4f.Provider.ChatgptX | ✔️ | ❌ | ✔️ | | ❌ |
flowgpt.com | g4f.Provider.FlowGpt | ✔️ | ❌ | ✔️ | | ❌ |
freegptsnav.aifree.site | g4f.Provider.FreeGpt | ✔️ | ❌ | ✔️ | | ❌ |
gpttalk.ru | g4f.Provider.GptTalkRu | ✔️ | ❌ | ✔️ | | ❌ |
koala.sh | g4f.Provider.Koala | ✔️ | ❌ | ✔️ | | ❌ |
app.myshell.ai | g4f.Provider.MyShell | ✔️ | ❌ | ✔️ | | ❌ |
perplexity.ai | g4f.Provider.PerplexityAi | ✔️ | ❌ | ✔️ | | ❌ |
poe.com | g4f.Provider.Poe | ✔️ | ❌ | ✔️ | | ✔️ |
talkai.info | g4f.Provider.TalkAi | ✔️ | ❌ | ✔️ | | ❌ |
chat.vercel.ai | g4f.Provider.Vercel | ✔️ | ❌ | ✔️ | | ❌ |
aitianhu.com | g4f.Provider.AItianhu | ✔️ | ❌ | ✔️ | | ❌ |
chatgpt.bestim.org | g4f.Provider.Bestim | ✔️ | ❌ | ✔️ | | ❌ |
chatbase.co | g4f.Provider.ChatBase | ✔️ | ❌ | ✔️ | | ❌ |
chatgptdemo.info | g4f.Provider.ChatgptDemo | ✔️ | ❌ | ✔️ | | ❌ |
chat.chatgptdemo.ai | g4f.Provider.ChatgptDemoAi | ✔️ | ❌ | ✔️ | | ❌ |
chatgptfree.ai | g4f.Provider.ChatgptFree | ✔️ | ❌ | ❌ | | ❌ |
chatgptlogin.ai | g4f.Provider.ChatgptLogin | ✔️ | ❌ | ✔️ | | ❌ |
chat.3211000.xyz | g4f.Provider.Chatxyz | ✔️ | ❌ | ✔️ | | ❌ |
gpt6.ai | g4f.Provider.Gpt6 | ✔️ | ❌ | ✔️ | | ❌ |
gptchatly.com | g4f.Provider.GptChatly | ✔️ | ❌ | ❌ | | ❌ |
ai18.gptforlove.com | g4f.Provider.GptForLove | ✔️ | ❌ | ✔️ | | ❌ |
gptgo.ai | g4f.Provider.GptGo | ✔️ | ❌ | ✔️ | | ❌ |
gptgod.site | g4f.Provider.GptGod | ✔️ | ❌ | ✔️ | | ❌ |
onlinegpt.org | g4f.Provider.OnlineGpt | ✔️ | ❌ | ✔️ | | ❌ |
Other
Website | Provider | GPT-3.5 | GPT-4 | Stream | Status | Auth |
---|---|---|---|---|---|---|
openchat.team | g4f.Provider.Aura | ❌ | ❌ | ✔️ | | ❌ |
bard.google.com | g4f.Provider.Bard | ❌ | ❌ | ❌ | | ✔️ |
deepinfra.com | g4f.Provider.DeepInfra | ❌ | ❌ | ✔️ | | ❌ |
free.chatgpt.org.uk | g4f.Provider.FreeChatgpt | ❌ | ❌ | ✔️ | | ❌ |
gemini.google.com | g4f.Provider.Gemini | ❌ | ❌ | ✔️ | | ✔️ |
ai.google.dev | g4f.Provider.GeminiPro | ❌ | ❌ | ✔️ | | ✔️ |
gemini-chatbot-sigma.vercel.app | g4f.Provider.GeminiProChat | ❌ | ❌ | ✔️ | | ❌ |
huggingface.co | g4f.Provider.HuggingChat | ❌ | ❌ | ✔️ | | ❌ |
huggingface.co | g4f.Provider.HuggingFace | ❌ | ❌ | ✔️ | | ❌ |
llama2.ai | g4f.Provider.Llama2 | ❌ | ❌ | ✔️ | | ❌ |
labs.perplexity.ai | g4f.Provider.PerplexityLabs | ❌ | ❌ | ✔️ | | ❌ |
pi.ai | g4f.Provider.Pi | ❌ | ❌ | ✔️ | | ❌ |
theb.ai | g4f.Provider.ThebApi | ❌ | ❌ | ❌ | | ✔️ |
open-assistant.io | g4f.Provider.OpenAssistant | ❌ | ❌ | ✔️ | | ✔️ |
Models
Model | Base Provider | Provider | Website |
---|---|---|---|
gpt-3.5-turbo | OpenAI | 5+ Providers | openai.com |
gpt-4 | OpenAI | 2+ Providers | openai.com |
gpt-4-turbo | OpenAI | g4f.Provider.Bing | openai.com |
Llama-2-7b-chat-hf | Meta | 2+ Providers | llama.meta.com |
Llama-2-13b-chat-hf | Meta | 2+ Providers | llama.meta.com |
Llama-2-70b-chat-hf | Meta | 3+ Providers | llama.meta.com |
CodeLlama-34b-Instruct-hf | Meta | 2+ Providers | llama.meta.com |
CodeLlama-70b-Instruct-hf | Meta | 2+ Providers | llama.meta.com |
Mixtral-8x7B-Instruct-v0.1 | Huggingface | 4+ Providers | huggingface.co |
Mistral-7B-Instruct-v0.1 | Huggingface | 4+ Providers | huggingface.co |
dolphin-2.6-mixtral-8x7b | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
lzlv_70b_fp16_hf | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
airoboros-70b | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
airoboros-l2-70b-gpt4-1.4.1 | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
openchat_3.5 | Huggingface | 2+ Providers | huggingface.co |
gemini | Google | g4f.Provider.Gemini | gemini.google.com |
gemini-pro | Google | 2+ Providers | gemini.google.com |
claude-v2 | Anthropic | 1+ Providers | anthropic.com |
claude-3-opus | Anthropic | g4f.Provider.You | anthropic.com |
claude-3-sonnet | Anthropic | g4f.Provider.You | anthropic.com |
pi | Inflection | g4f.Provider.Pi | inflection.ai |
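To target a specific provider or model from the tables above, it can be passed explicitly. A minimal sketch using the legacy module-level API (see /docs/legacy); the choice of Bing and gpt-4 is only an example:
import g4f

# pick any provider / model combination from the tables above
response = g4f.ChatCompletion.create(
    model="gpt-4",
    provider=g4f.Provider.Bing,
    messages=[{"role": "user", "content": "Hello"}],
)
print(response)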
🔗 Related GPT4Free Projects
- gpt4free
- gpt4free-ts
- Free AI API's & Potential Providers List
- ChatGPT-Clone
- ChatGpt Discord Bot
- chatGPT-discord-bot
- Nyx-Bot (Discord)
- LangChain gpt4free
- ChatGpt Telegram Bot
- ChatGpt Line Bot
- Action Translate Readme
- Langchain Document GPT
- python-tgpt
🤝 Contribute
We welcome contributions from the community. Whether you're adding new providers or features, or simply fixing typos and making small improvements, your input is valued. Creating a pull request is all it takes – our co-pilot will handle the code review process. Once all changes have been addressed, we'll merge the pull request into the main branch and release the updates at a later time.
Guide: How do I create a new Provider?
Guide: How can AI help me with writing code?
- Read: /docs/guides/help_me
🙌 Contributors
A list of all contributors is available here
The Vercel.py file contains code from vercel-llm-api by @ading2210, which is licensed under the GNU GPL v3.
Top 1 Contributor: @hlohaus
©️ Copyright
This program is licensed under the GNU GPL v3
xtekky/gpt4free: Copyright (C) 2023 xtekky
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
⭐ Star History
📄 License
This project is licensed under the GNU GPL v3.0.