mirror of
https://github.com/hwchase17/langchain
synced 2024-10-29 17:07:25 +00:00
# NIBittensor
This page covers how to use the BittensorLLM inference runtime within LangChain.
It is broken into two parts: installation and setup, and then examples of NIBittensorLLM usage.
## Installation and Setup
- Install the Python package with `pip install langchain`
## Wrappers
### LLM
There exists a NIBittensor LLM wrapper, which you can access with:
```python
from langchain.llms import NIBittensorLLM
```
It provides a unified interface for all models:
```python
llm = NIBittensorLLM(system_prompt="Your task is to provide concise and accurate response based on user prompt")

print(llm("Write a fibonacci function in python with golden ratio"))
```
Multiple responses from top miners can be accessed using the `top_responses` parameter:
```python
import json

multi_response_llm = NIBittensorLLM(top_responses=10)
multi_resp = multi_response_llm("What is Neural Network Feeding Mechanism?")
json_multi_resp = json.loads(multi_resp)

print(json_multi_resp)
```
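The exact schema of the parsed multi-response payload is not specified on this page. As a minimal sketch, assuming each entry pairs a miner identifier with its response text (a hypothetical shape for illustration), individual responses can be pulled out with ordinary dictionary access:

```python
import json

# Hypothetical sample payload: the field names "miner" and "response"
# are assumptions for illustration, not the documented schema.
sample = '[{"miner": "uid-12", "response": "A feeding mechanism passes inputs forward through the network."}]'

responses = json.loads(sample)
for entry in responses:
    print(entry["miner"], "->", entry["response"])
```

If the real payload uses different keys, only the access expressions need to change; the `json.loads` step is the same as in the example above.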