Clean up Fireworks provider documentation (#13157)

Chenyu Zhao 11 months ago committed by GitHub
parent d9e493e96c
commit defd4b4f11

# Fireworks
This page covers how to use [Fireworks](https://app.fireworks.ai/) models within Langchain.
## Installation and setup
- Install the Fireworks client library.
```
pip install fireworks-ai
```
- Get a Fireworks API key by signing up at [app.fireworks.ai](https://app.fireworks.ai).
- Authenticate by setting the `FIREWORKS_API_KEY` environment variable.
## Authentication
There are two ways to authenticate using your Fireworks API key:
1. Setting the `FIREWORKS_API_KEY` environment variable.
```python
import os

os.environ["FIREWORKS_API_KEY"] = "<KEY>"
```
2. Setting the `fireworks_api_key` field in the Fireworks LLM module.
```python
from langchain.llms.fireworks import Fireworks

llm = Fireworks(fireworks_api_key="<KEY>")
```
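In either case, avoid hard-coding real keys. For the environment-variable approach, the key can also be read interactively at runtime, for example with Python's built-in `getpass`; the snippet below is a minimal sketch and the prompt text is illustrative:

```python
import getpass
import os

# Prompt for the key only if it has not already been set in the environment.
if "FIREWORKS_API_KEY" not in os.environ:
    os.environ["FIREWORKS_API_KEY"] = getpass.getpass("Fireworks API key: ")
```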
## Using the Fireworks LLM module
Fireworks integrates with Langchain through the LLM module, which allows for standardized usage of any model deployed on Fireworks. In this example, we will work with the llama-v2-13b-chat model.
```python
from langchain.llms.fireworks import Fireworks
llm = Fireworks(
    fireworks_api_key="<KEY>",
    model="accounts/fireworks/models/llama-v2-13b-chat",
    max_tokens=256,
)
llm("Name 3 sports.")
```
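Because Fireworks plugs into the standard Langchain LLM interface, the same object can also be used for batch generation. The snippet below is a small sketch of that usage; it assumes the `llm` instance created above and uses illustrative prompts:

```python
# Generate completions for several prompts in one call via the standard
# LangChain LLM `generate` method; each entry in `result.generations`
# holds the generations for the corresponding prompt.
result = llm.generate(["Name 3 sports.", "Name 3 countries."])
for generations in result.generations:
    print(generations[0].text)
```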
