mirror of https://github.com/hwchase17/langchain
synced 2024-11-08 07:10:35 +00:00
commit b3988621c5
# Add C Transformers for GGML Models

I created Python bindings for the GGML models: https://github.com/marella/ctransformers

Currently it supports GPT-2, GPT-J, GPT-NeoX, LLaMA, MPT, etc. See [Supported Models](https://github.com/marella/ctransformers#supported-models).

It provides a unified interface for all models:

```python
from langchain.llms import CTransformers

llm = CTransformers(model='/path/to/ggml-gpt-2.bin', model_type='gpt2')

print(llm('AI is going to'))
```

It can be used with models hosted on the Hugging Face Hub:

```python
llm = CTransformers(model='marella/gpt-2-ggml')
```

It supports streaming:

```python
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = CTransformers(model='marella/gpt-2-ggml', callbacks=[StreamingStdOutCallbackHandler()])
```

Please see the [README](https://github.com/marella/ctransformers#readme) for more details.

---------

Co-authored-by: Dev 2049 <dev.dev2049@gmail.com>
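Generation settings can also be tuned per model. A minimal sketch, assuming the `config` dict of generation parameters (`max_new_tokens`, `temperature`) described in the ctransformers README; the Hub model name is the same one used in the examples above:

```python
# Assumption: ctransformers exposes generation settings via a `config` dict,
# per its README; keys and defaults may differ across versions.
config = {"max_new_tokens": 256, "temperature": 0.8}

# Instantiation is shown but commented out, since it requires
# `pip install ctransformers` and downloads the model on first use:
# from langchain.llms import CTransformers
# llm = CTransformers(model="marella/gpt-2-ggml", config=config)
# print(llm("AI is going to"))
```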
__init__.py
test_ai21.py
test_aleph_alpha.py
test_anthropic.py
test_anyscale.py
test_banana.py
test_beam.py
test_cerebrium.py
test_cohere.py
test_ctransformers.py
test_forefrontai.py
test_google_palm.py
test_gooseai.py
test_gpt4all.py
test_huggingface_endpoint.py
test_huggingface_hub.py
test_huggingface_pipeline.py
test_llamacpp.py
test_manifest.py
test_modal.py
test_mosaicml.py
test_nlpcloud.py
test_openai.py
test_openlm.py
test_petals.py
test_pipelineai.py
test_predictionguard.py
test_promptlayer_openai.py
test_propmptlayer_openai_chat.py
test_replicate.py
test_rwkv.py
test_self_hosted_llm.py
test_stochasticai.py
test_vertexai.py
test_writer.py
utils.py