### G4F - Client API (Beta Version)
#### Introduction
Welcome to the G4F Client API, a cutting-edge tool for seamlessly integrating advanced AI capabilities into your Python applications. This guide is designed to facilitate your transition from using the OpenAI client to the G4F Client, offering enhanced features while maintaining compatibility with the existing OpenAI API.
#### Getting Started
**Switching to G4F Client:**
To begin using the G4F Client, simply update your import statement in your Python code:
Old Import:
```python
from openai import OpenAI
```
New Import:
```python
from g4f.client import Client as OpenAI
```
The G4F Client preserves the same familiar API interface as OpenAI, ensuring a smooth transition process.
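Because the client mirrors the OpenAI interface, the familiar OpenAI message schema (a list of `{"role": ..., "content": ...}` dicts) carries over unchanged. As a small illustration, here is a sketch of building a multi-turn conversation in that schema; the `build_messages` helper is my own, not part of g4f:

```python
def build_messages(system_prompt, turns):
    """Build an OpenAI-style messages list from a system prompt
    and a sequence of (user, assistant) turn pairs."""
    messages = [{"role": "system", "content": system_prompt}]
    for user_msg, assistant_msg in turns:
        messages.append({"role": "user", "content": user_msg})
        if assistant_msg is not None:  # the latest turn may not have a reply yet
            messages.append({"role": "assistant", "content": assistant_msg})
    return messages
```

The resulting list can be passed directly as the `messages` argument in the completion examples below.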
#### Initializing the Client
To use the G4F Client, create a new instance. The example below shows how to set custom providers:
```python
from g4f.client import Client
from g4f.Provider import BingCreateImages, OpenaiChat, Gemini

client = Client(
    provider=OpenaiChat,
    image_provider=Gemini,
    ...
)
```
#### Configuration

You can set an `api_key` for your provider on the client, and you can also define a proxy for all outgoing requests:
```python
from g4f.client import Client

client = Client(
    api_key="...",
    proxies="http://user:pass@host",
    ...
)
```
#### Usage Examples
**Text Completions:**
You can use the `ChatCompletions` endpoint to generate text completions as follows:
```python
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say this is a test"}],
    ...
)
print(response.choices[0].message.content)
```
Streaming is also supported:
```python
stream = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say this is a test"}],
    stream=True,
    ...
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content or "", end="")
```
**Image Generation:**
Generate images using a specified prompt:
```python
response = client.images.generate(
    model="dall-e-3",
    prompt="a white siamese cat",
    ...
)
image_url = response.data[0].url
```
**Creating Image Variations:**
Create variations of an existing image:
```python
response = client.images.create_variation(
    image=open("cat.jpg", "rb"),
    model="bing",
    ...
)
image_url = response.data[0].url
```
#### Visual Examples
Original / Variant:
![Original Image](/docs/cat.jpeg) ![Variant Image](/docs/cat.webp)
#### Advanced example using GeminiProVision
```python
from g4f.client import Client
from g4f.Provider.GeminiPro import GeminiPro

client = Client(
    api_key="...",
    provider=GeminiPro
)
response = client.chat.completions.create(
    model="gemini-pro-vision",
    messages=[{"role": "user", "content": "What is shown in this image?"}],
    image=open("docs/cat.jpeg", "rb")
)
print(response.choices[0].message.content)
```
**Question:** What is shown in this image?
```
A cat is sitting on a window sill looking at a bird outside the window.
```
[Return to Home](/)