# Add C Transformers for GGML Models
I created Python bindings for the GGML models:
https://github.com/marella/ctransformers

Currently it supports GPT-2, GPT-J, GPT-NeoX, LLaMA, MPT, etc. See [Supported Models](https://github.com/marella/ctransformers#supported-models).

It provides a unified interface for all models:

```python
from langchain.llms import CTransformers

llm = CTransformers(model='/path/to/ggml-gpt-2.bin', model_type='gpt2')

print(llm('AI is going to'))
```
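
The same interface works for the other backends by changing `model_type`, and generation parameters can be passed through. The sketch below is illustrative only: the local LLaMA path is hypothetical, and the `config` keys (`max_new_tokens`, `temperature`) are assumed to be forwarded to ctransformers as described in its README rather than being part of this PR:

```python
from langchain.llms import CTransformers

# Hypothetical local GGML file; 'llama' selects the LLaMA backend.
# The config dict is assumed to be passed through to ctransformers.
llm = CTransformers(
    model='/path/to/ggml-llama.bin',
    model_type='llama',
    config={'max_new_tokens': 64, 'temperature': 0.7},
)

print(llm('AI is going to'))
```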

It can be used with models hosted on the Hugging Face Hub:

```python
llm = CTransformers(model='marella/gpt-2-ggml')
```
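
If a Hub repo contains more than one GGML file, a specific file can presumably be selected as well. This is a minimal sketch assuming the `model_file` parameter from the ctransformers README; the filename here is hypothetical:

```python
# Pick a specific GGML file from the Hub repo (filename is an example).
llm = CTransformers(model='marella/gpt-2-ggml', model_file='ggml-model.bin')
```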

It supports streaming:

```python
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = CTransformers(model='marella/gpt-2-ggml', callbacks=[StreamingStdOutCallbackHandler()])
```
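
As a quick usage sketch, the LLM can be dropped into a standard chain. This example assumes the usual `PromptTemplate`/`LLMChain` pattern from the LangChain docs of this period and is not part of the PR itself:

```python
from langchain import PromptTemplate, LLMChain
from langchain.llms import CTransformers
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

template = """Question: {question}

Answer:"""
prompt = PromptTemplate(template=template, input_variables=["question"])

# Stream tokens to stdout while the chain runs.
llm = CTransformers(model='marella/gpt-2-ggml', callbacks=[StreamingStdOutCallbackHandler()])
llm_chain = LLMChain(prompt=prompt, llm=llm)

print(llm_chain.run('What is AI?'))
```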

Please see the [README](https://github.com/marella/ctransformers#readme) for more details.
---------

Co-authored-by: Dev 2049 <dev.dev2049@gmail.com>