decentralising the AI industry, just some language model APIs...


Written by @xtekky & maintained by @hlohaus

By using this repository or any code related to it, you agree to the legal notice. The author is not responsible for the usage of this repository, nor does the author endorse it, nor is the author responsible for any copies, forks, re-uploads made by other users, or anything else related to GPT4Free. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license this repository uses.

Warning

"gpt4free" serves as a PoC (proof of concept), demonstrating the development of an API package with multi-provider requests, with features like timeouts, load balance and flow control.

Note

Latest version: PyPI version Docker version
Stats: Downloads Downloads

pip install -U g4f
docker pull hlohaus789/g4f

🆕 What's New

🔻 Site Takedown

Is your site listed in this repository and do you want to take it down? Send an email to takedown@g4f.ai with proof that it is yours and it will be removed as quickly as possible. To prevent reproduction, please secure your API. 😉

🚀 Feedback and Todo

You can always leave some feedback here: https://forms.gle/FeWV9RLEedfdkmFN6

As per the survey, here is a list of improvements to come:

  • Update the repository to include the new openai library syntax (e.g., the OpenAI() class) | completed, use g4f.client.Client
  • Golang implementation
  • 🚧 Improve documentation (in /docs: guides, how-tos, and video tutorials)
  • Improve the provider status list & updates
  • Tutorials on how to reverse sites to write your own wrapper (PoC only ofc)
  • Improve the Bing wrapper. (Wait and Retry or reuse conversation)
  • 🚧 Write a standard provider performance test to improve the stability
  • Potential support and development of local models
  • 🚧 Improve compatibility and error handling

📚 Table of Contents

🛠️ Getting Started

Docker Container Guide

Getting Started Quickly:
  1. Install Docker: Begin by downloading and installing Docker.

  2. Set Up the Container: Use the following commands to pull the latest image and start the container:

docker pull hlohaus789/g4f
docker run -p 8080:8080 -p 1337:1337 -p 7900:7900 --shm-size="2g" -v ${PWD}/har_and_cookies:/app/har_and_cookies hlohaus789/g4f:latest
  3. Access the Client: Open http://localhost:8080/chat/ in your browser to use the included web client, or point an OpenAI-compatible integration at http://localhost:1337/v1.

  4. (Optional) Provider Login: If required, you can access the container's desktop at http://localhost:7900/?autoconnect=1&resize=scale&password=secret for provider login purposes.

Installation Guide for Windows (.exe)

To ensure the seamless operation of our application, please follow the instructions below. These steps are designed to guide you through the installation process on Windows operating systems.

Prerequisites
  1. WebView2 Runtime: Our application requires the WebView2 Runtime to be installed on your system. If you do not have it installed, please download and install it from the Microsoft Developer Website. If you already have WebView2 Runtime installed but are encountering issues, navigate to Installed Windows Apps, select WebView2, and opt for the repair option.
Installation Steps
  1. Download the Application: Visit our latest releases page and download the most recent version of the application, named g4f.webview.*.exe.
  2. File Placement: Once downloaded, transfer the .exe file from your downloads folder to a directory of your choice on your system, and then execute it to run the app.
Post-Installation Adjustment
  1. Firewall Configuration (Hotfix): Upon installation, it may be necessary to adjust your Windows Firewall settings to allow the application to operate correctly. To do this, access your Windows Firewall settings and allow the application.

By following these steps, you should be able to successfully install and run the application on your Windows system. If you encounter any issues during the installation process, please refer to our Issue Tracker or try to get in contact via Discord for assistance.

Run the Webview UI on other Platforms:

Use your smartphone: run the Web UI on your smartphone.

Use Python

Prerequisites:
  1. Download and install Python (Version 3.10+ is recommended).
  2. Install Google Chrome for providers that require a webdriver.
Install using PyPI package:
pip install -U g4f[all]

How do I install only parts or disable certain parts? Use partial requirements: /docs/requirements

Install from source:

How do I load the project using git and install the project requirements? Read this tutorial and follow it step by step: /docs/git

Install using Docker:

How do I build and run the compose image from source? Use docker-compose: /docs/docker

💡 Usage

Text Generation

from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    # Add any other necessary parameters here
)
print(response.choices[0].message.content)
Hello! How can I assist you today?

Image Generation

from g4f.client import Client

client = Client()
response = client.images.generate(
  model="gemini",
  prompt="a white siamese cat",
  # Add any other necessary parameters here
)
image_url = response.data[0].url

Image with cat

Full Documentation for Python API

Web UI

To start the web interface, run the following code in Python:

from g4f.gui import run_gui
run_gui()

or execute the following command:

python -m g4f.cli gui -port 8080 -debug

Interference API

You can use the Interference API to power other OpenAI integrations with G4F; it exposes an OpenAI-compatible endpoint.

See docs: /docs/interference

Access with: http://localhost:1337/v1
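
For illustration, here is a minimal sketch of pointing an existing OpenAI client at the Interference API. The openai package, the placeholder API key, and the model name are assumptions for the example; only the base URL http://localhost:1337/v1 comes from this README.

from openai import OpenAI

# Point the official OpenAI client at the local Interference API.
client = OpenAI(
    api_key="not-needed",                # placeholder; assumed the local server does not need a real key
    base_url="http://localhost:1337/v1"  # Interference API endpoint from this README
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)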

Configuration

Cookies

Cookies are essential for using Meta AI and Microsoft Designer to create images. Additionally, cookies are required for the Google Gemini and WhiteRabbitNeo providers. From Bing, ensure you have the "_U" cookie, and from Google, all cookies starting with "__Secure-1PSID" are needed.

You can pass these cookies directly to the create function or set them using the set_cookies method before running G4F:

from g4f.cookies import set_cookies

set_cookies(".bing.com", {
  "_U": "cookie value"
})

set_cookies(".google.com", {
  "__Secure-1PSID": "cookie value"
})
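
As a rough sketch of the first option (passing cookies directly to the create call), assuming extra keyword arguments such as cookies are forwarded to the selected provider; the Bing provider and model name are chosen only for illustration:

from g4f.client import Client
from g4f.Provider import Bing

client = Client(provider=Bing)
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    cookies={"_U": "cookie value"}  # assumed keyword; the "_U" cookie name comes from the section above
)
print(response.choices[0].message.content)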

Alternatively, you can place your .har and cookie files in the /har_and_cookies directory. To export a cookie file, use the EditThisCookie extension available on the Chrome Web Store: EditThisCookie Extension.

You can also create .har files to capture cookies. If you need further assistance, refer to the next section.

python -m g4f.cli api --debug
Read .har file: ./har_and_cookies/you.com.har
Cookies added: 10 from .you.com
Read cookie file: ./har_and_cookies/google.json
Cookies added: 16 from .google.com
Starting server... [g4f v-0.0.0] (debug)

.HAR File for OpenaiChat Provider

Generating a .HAR File

To utilize the OpenaiChat provider, a .har file is required from https://chat.openai.com/. Follow the steps below to create a valid .har file:

  1. Navigate to https://chat.openai.com/ using your preferred web browser and log in with your credentials.
  2. Access the Developer Tools in your browser. This can typically be done by right-clicking the page and selecting "Inspect," or by pressing F12 or Ctrl+Shift+I (Cmd+Option+I on a Mac).
  3. With the Developer Tools open, switch to the "Network" tab.
  4. Reload the website to capture the loading process within the Network tab.
  5. Initiate an action in the chat which can be captured in the .har file.
  6. Right-click any of the network activities listed and select "Save all as HAR with content" to export the .har file.
Storing the .HAR File
  • Place the exported .har file in the ./har_and_cookies directory if you are using Docker. Alternatively, you can store it in any preferred location within your current working directory.

Note: Ensure that your .har file is stored securely, as it may contain sensitive information.

Using Proxy

If you want to hide or change your IP address for the providers, you can set a proxy globally via an environment variable:

  • On macOS and Linux:
export G4F_PROXY="http://host:port"
  • On Windows:
set G4F_PROXY=http://host:port
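
The same proxy can also be set from Python before any requests are made. This is a minimal sketch, assuming g4f reads G4F_PROXY from the environment at request time; only the variable name comes from this README.

import os

# Set the proxy before issuing any g4f requests (assumption: g4f reads
# G4F_PROXY from the environment when a request is made).
os.environ["G4F_PROXY"] = "http://host:port"

from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)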

🚀 Providers and Models

GPT-4

Website Provider GPT-3.5 GPT-4 Stream Status Auth
bing.com g4f.Provider.Bing ✔️ ✔️ Active
chatgpt.ai g4f.Provider.ChatgptAi ✔️ ✔️ Unknown
liaobots.site g4f.Provider.Liaobots ✔️ ✔️ ✔️ Unknown
chat.openai.com g4f.Provider.OpenaiChat ✔️ ✔️ ✔️ Active +✔️
raycast.com g4f.Provider.Raycast ✔️ ✔️ ✔️ Unknown ✔️
beta.theb.ai g4f.Provider.Theb ✔️ ✔️ ✔️ Unknown
you.com g4f.Provider.You ✔️ ✔️ ✔️ Active
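
For illustration, a minimal sketch of pinning a request to one of the providers listed above; the choice of Bing and the model name are only examples:

from g4f.client import Client
from g4f.Provider import Bing

# Use a specific provider from the table above instead of automatic selection.
client = Client(provider=Bing)
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)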

Best OpenSource Models

While we wait for gpt-5, here is a list of new models that are at least better than gpt-3.5-turbo. Some are better than gpt-4. Expect this list to grow.

Model Provider Parameters Better than
claude-3-opus g4f.Provider.You ?B gpt-4-0125-preview
command-r+ g4f.Provider.HuggingChat 104B gpt-4-0314
llama-3-70b g4f.Provider.Llama or DeepInfra 70B gpt-4-0314
claude-3-sonnet g4f.Provider.You ?B gpt-4-0314
reka-core g4f.Provider.Reka 21B gpt-4-vision
dbrx-instruct g4f.Provider.DeepInfra 132B / 36B active gpt-3.5-turbo
mixtral-8x22b g4f.Provider.DeepInfra 176B / 44B active gpt-3.5-turbo

GPT-3.5

Website Provider GPT-3.5 GPT-4 Stream Status Auth
chat3.aiyunos.top g4f.Provider.AItianhuSpace ✔️ ✔️ Unknown
chat10.aichatos.xyz g4f.Provider.Aichatos ✔️ ✔️ Active
chatforai.store g4f.Provider.ChatForAi ✔️ ✔️ Unknown
chatgpt4online.org g4f.Provider.Chatgpt4Online ✔️ ✔️ Unknown
chatgpt-free.cc g4f.Provider.ChatgptNext ✔️ ✔️ Unknown
chatgptx.de g4f.Provider.ChatgptX ✔️ ✔️ Unknown
f1.cnote.top g4f.Provider.Cnote ✔️ ✔️ Active
duckduckgo.com g4f.Provider.DuckDuckGo ✔️ ✔️ Active
ecosia.org g4f.Provider.Ecosia ✔️ ✔️ Active
feedough.com g4f.Provider.Feedough ✔️ ✔️ Active
flowgpt.com g4f.Provider.FlowGpt ✔️ ✔️ Unknown
freegptsnav.aifree.site g4f.Provider.FreeGpt ✔️ ✔️ Active
gpttalk.ru g4f.Provider.GptTalkRu ✔️ ✔️ Unknown
koala.sh g4f.Provider.Koala ✔️ ✔️ Unknown
app.myshell.ai g4f.Provider.MyShell ✔️ ✔️ Unknown
perplexity.ai g4f.Provider.PerplexityAi ✔️ ✔️ Unknown
poe.com g4f.Provider.Poe ✔️ ✔️ Unknown ✔️
talkai.info g4f.Provider.TalkAi ✔️ ✔️ Unknown
chat.vercel.ai g4f.Provider.Vercel ✔️ ✔️ Unknown
aitianhu.com g4f.Provider.AItianhu ✔️ ✔️ Inactive
chatgpt.bestim.org g4f.Provider.Bestim ✔️ ✔️ Inactive
chatbase.co g4f.Provider.ChatBase ✔️ ✔️ Inactive
chatgptdemo.info g4f.Provider.ChatgptDemo ✔️ ✔️ Inactive
chat.chatgptdemo.ai g4f.Provider.ChatgptDemoAi ✔️ ✔️ Inactive
chatgptfree.ai g4f.Provider.ChatgptFree ✔️ Inactive
chatgptlogin.ai g4f.Provider.ChatgptLogin ✔️ ✔️ Inactive
chat.3211000.xyz g4f.Provider.Chatxyz ✔️ ✔️ Inactive
gpt6.ai g4f.Provider.Gpt6 ✔️ ✔️ Inactive
gptchatly.com g4f.Provider.GptChatly ✔️ Inactive
ai18.gptforlove.com g4f.Provider.GptForLove ✔️ ✔️ Inactive
gptgo.ai g4f.Provider.GptGo ✔️ ✔️ Inactive
gptgod.site g4f.Provider.GptGod ✔️ ✔️ Inactive
onlinegpt.org g4f.Provider.OnlineGpt ✔️ ✔️ Inactive

Other

Website Provider Stream Status Auth
openchat.team g4f.Provider.Aura ✔️ Unknown
blackbox.ai g4f.Provider.Blackbox ✔️ Active
cohereforai-c4ai-command-r-plus.hf.space g4f.Provider.Cohere ✔️ Unknown
deepinfra.com g4f.Provider.DeepInfra ✔️ Active
free.chatgpt.org.uk g4f.Provider.FreeChatgpt ✔️ Unknown
gemini.google.com g4f.Provider.Gemini ✔️ Active ✔️
ai.google.dev g4f.Provider.GeminiPro ✔️ Active ✔️
gemini-chatbot-sigma.vercel.app g4f.Provider.GeminiProChat ✔️ Unknown
developers.sber.ru g4f.Provider.GigaChat ✔️ Unknown ✔️
console.groq.com g4f.Provider.Groq ✔️ Active ✔️
huggingface.co g4f.Provider.HuggingChat ✔️ Active
huggingface.co g4f.Provider.HuggingFace ✔️ Active
llama2.ai g4f.Provider.Llama ✔️ Unknown
meta.ai g4f.Provider.MetaAI ✔️ Active
openrouter.ai g4f.Provider.OpenRouter ✔️ Active ✔️
labs.perplexity.ai g4f.Provider.PerplexityLabs ✔️ Active
pi.ai g4f.Provider.Pi ✔️ Unknown
replicate.com g4f.Provider.Replicate ✔️ Unknown
theb.ai g4f.Provider.ThebApi ✔️ Unknown ✔️
whiterabbitneo.com g4f.Provider.WhiteRabbitNeo ✔️ Unknown ✔️
bard.google.com g4f.Provider.Bard Inactive ✔️

Models

Model Base Provider Provider Website
gpt-3.5-turbo OpenAI 8+ Providers openai.com
gpt-4 OpenAI 2+ Providers openai.com
gpt-4-turbo OpenAI g4f.Provider.Bing openai.com
Llama-2-7b-chat-hf Meta 2+ Providers llama.meta.com
Llama-2-13b-chat-hf Meta 2+ Providers llama.meta.com
Llama-2-70b-chat-hf Meta 3+ Providers llama.meta.com
Meta-Llama-3-8b-instruct Meta 1+ Providers llama.meta.com
Meta-Llama-3-70b-instruct Meta 2+ Providers llama.meta.com
CodeLlama-34b-Instruct-hf Meta g4f.Provider.HuggingChat llama.meta.com
CodeLlama-70b-Instruct-hf Meta 2+ Providers llama.meta.com
Mixtral-8x7B-Instruct-v0.1 Huggingface 4+ Providers huggingface.co
Mistral-7B-Instruct-v0.1 Huggingface 3+ Providers huggingface.co
Mistral-7B-Instruct-v0.2 Huggingface g4f.Provider.DeepInfra huggingface.co
zephyr-orpo-141b-A35b-v0.1 Huggingface 2+ Providers huggingface.co
dolphin-2.6-mixtral-8x7b Huggingface g4f.Provider.DeepInfra huggingface.co
gemini Google g4f.Provider.Gemini gemini.google.com
gemini-pro Google 2+ Providers gemini.google.com
claude-v2 Anthropic 1+ Providers anthropic.com
claude-3-opus Anthropic g4f.Provider.You anthropic.com
claude-3-sonnet Anthropic g4f.Provider.You anthropic.com
lzlv_70b_fp16_hf Huggingface g4f.Provider.DeepInfra huggingface.co
airoboros-70b Huggingface g4f.Provider.DeepInfra huggingface.co
openchat_3.5 Huggingface 2+ Providers huggingface.co
pi Inflection g4f.Provider.Pi inflection.ai

Image and Vision Models

Label Provider Image Model Vision Model Website
Microsoft Copilot in Bing g4f.Provider.Bing dall-e-3 gpt-4-vision bing.com
DeepInfra g4f.Provider.DeepInfra stability-ai/sdxl llava-1.5-7b-hf deepinfra.com
Gemini g4f.Provider.Gemini ✔️ ✔️ gemini.google.com
Gemini API g4f.Provider.GeminiPro gemini-1.5-pro ai.google.dev
Meta AI g4f.Provider.MetaAI ✔️ meta.ai
OpenAI ChatGPT g4f.Provider.OpenaiChat dall-e-3 gpt-4-vision chat.openai.com
Reka g4f.Provider.Reka ✔️ chat.reka.ai
Replicate g4f.Provider.Replicate stability-ai/sdxl llava-v1.6-34b replicate.com
You.com g4f.Provider.You dall-e-3 ✔️ you.com
The following example passes an image to a vision-capable provider and asks a question about it:

import requests
from g4f.client import Client

client = Client()
# Replace the placeholder URL with a real image URL before running.
image = requests.get("https://change_me.jpg", stream=True).raw
response = client.chat.completions.create(
    "",  # model argument left empty in this example
    messages=[{"role": "user", "content": "what is in this picture?"}],
    image=image
)
print(response.choices[0].message.content)
print(response.choices[0].message.content)

🔗 Powered by gpt4free

🎁 Projects:
  • gpt4free
  • gpt4free-ts
  • Free AI API's & Potential Providers List
  • ChatGPT-Clone
  • Ai agent
  • ChatGpt Discord Bot
  • chatGPT-discord-bot
  • Nyx-Bot (Discord)
  • LangChain gpt4free
  • ChatGpt Telegram Bot
  • ChatGpt Line Bot
  • Action Translate Readme
  • Langchain Document GPT
  • python-tgpt

🤝 Contribute

We welcome contributions from the community. Whether you're adding new providers or features, or simply fixing typos and making small improvements, your input is valued. Creating a pull request is all it takes; our co-pilot will handle the code review process. Once all changes have been addressed, we'll merge the pull request into the main branch and release the updates at a later time.

Guide: How do I create a new Provider?
Guide: How can AI help me with writing code?

🙌 Contributors

A list of all contributors is available here

Having provided input implies that the AI's code generation utilized it as one of many sources.

This program is licensed under the GNU GPL v3

xtekky/gpt4free: Copyright (C) 2023 xtekky

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <https://www.gnu.org/licenses/>.

Star History

Star History Chart

📄 License


This project is licensed under the GNU GPL v3.0.

(🔼 Back to top)