diff --git a/docs/ecosystem/huggingface.md b/docs/ecosystem/huggingface.md
index 1595c21fb5..c12840144e 100644
--- a/docs/ecosystem/huggingface.md
+++ b/docs/ecosystem/huggingface.md
@@ -1,15 +1,16 @@
 # Hugging Face
 
-This page covers how to use the Hugging Face ecosystem (including the Hugging Face Hub) within LangChain.
+This page covers how to use the Hugging Face ecosystem (including the [Hugging Face Hub](https://huggingface.co)) within LangChain.
 It is broken into two parts: installation and setup, and then references to specific Hugging Face wrappers.
 
 ## Installation and Setup
 
 If you want to work with the Hugging Face Hub:
-- Install the Python SDK with `pip install huggingface_hub`
-- Get an OpenAI api key and set it as an environment variable (`HUGGINGFACEHUB_API_TOKEN`)
+- Install the Hub client library with `pip install huggingface_hub`
+- Create a Hugging Face account (it's free!)
+- Create an [access token](https://huggingface.co/docs/hub/security-tokens) and set it as an environment variable (`HUGGINGFACEHUB_API_TOKEN`)
 
-If you want work with Hugging Face python libraries:
+If you want to work with the Hugging Face Python libraries:
 - Install `pip install transformers` for working with models and tokenizers
 - Install `pip install datasets` for working with datasets
 
@@ -18,7 +19,7 @@ If you want work with Hugging Face python libraries:
 ### LLM
 
 There exists two Hugging Face LLM wrappers, one for a local pipeline and one for a model hosted on Hugging Face Hub.
-Note that these wrappers only work for the following tasks: `text2text-generation`, `text-generation`
+Note that these wrappers only work for models that support the following tasks: [`text2text-generation`](https://huggingface.co/models?library=transformers&pipeline_tag=text2text-generation&sort=downloads), [`text-generation`](https://huggingface.co/models?library=transformers&pipeline_tag=text-generation&sort=downloads)
 
 To use the local pipeline wrapper:
 ```python
@@ -35,7 +36,7 @@ For a more detailed walkthrough of the Hugging Face Hub wrapper, see [this noteb
 ### Embeddings
 
 There exists two Hugging Face Embeddings wrappers, one for a local model and one for a model hosted on Hugging Face Hub.
-Note that these wrappers only work for `sentence-transformers` models.
+Note that these wrappers only work for [`sentence-transformers` models](https://huggingface.co/models?library=sentence-transformers&sort=downloads).
 
 To use the local pipeline wrapper:
 ```python
@@ -63,6 +64,6 @@ For a more detailed walkthrough of this, see [this notebook](../modules/utils/co
 
 ### Datasets
 
-Hugging Face has lots of great datasets that can be used to evaluate your LLM chains.
+The Hugging Face Hub has lots of great [datasets](https://huggingface.co/datasets) that can be used to evaluate your LLM chains.
 
 For a detailed walkthrough of how to use them to do so, see [this notebook](../use_cases/evaluation/huggingface_datasets.ipynb)
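
The hunks above cut off at the opening ```python fences, so the actual wrapper snippets from the page are not visible in this diff. For context only, here is a minimal sketch of the Hub-hosted LLM wrapper described in the LLM section; the `langchain.llms.HuggingFaceHub` import path, the `google/flan-t5-xl` repo id, and the `model_kwargs` values are illustrative assumptions, not taken from this patch.

```python
# Sketch only: assumes the Hub LLM wrapper is exposed as langchain.llms.HuggingFaceHub
# and that HUGGINGFACEHUB_API_TOKEN is set as described in the setup section above.
import os

from langchain.llms import HuggingFaceHub

assert "HUGGINGFACEHUB_API_TOKEN" in os.environ, "set your Hugging Face access token first"

# google/flan-t5-xl is a text2text-generation model, i.e. one of the supported tasks.
llm = HuggingFaceHub(
    repo_id="google/flan-t5-xl",  # hypothetical example model
    model_kwargs={"temperature": 0.5, "max_length": 64},
)

print(llm("What is a good name for a company that makes colorful socks?"))
```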
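
Similarly, a minimal sketch of the local Embeddings wrapper from the Embeddings section, assuming the `langchain.embeddings.HuggingFaceEmbeddings` class and an installed `sentence-transformers` package (neither is shown in the diff):

```python
# Sketch only: assumes the local wrapper is langchain.embeddings.HuggingFaceEmbeddings,
# which wraps a sentence-transformers model (pip install sentence_transformers).
from langchain.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings()  # uses the library's default sentence-transformers model

query_vector = embeddings.embed_query("What did the president say about the economy?")
doc_vectors = embeddings.embed_documents(["doc one", "doc two"])

print(len(query_vector), len(doc_vectors))
```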
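
And for the Datasets section, a hedged sketch of pulling a Hub dataset for evaluation with the `datasets` library; the `truthful_qa` dataset and its `generation` config are an arbitrary illustrative choice, not necessarily the dataset used in the linked notebook:

```python
# Sketch only: loads an example evaluation dataset from the Hugging Face Hub
# (pip install datasets). The dataset/config names are illustrative.
from datasets import load_dataset

dataset = load_dataset("truthful_qa", "generation", split="validation")

# Each example can then be fed to an LLM chain and the outputs compared to references.
for example in dataset.select(range(3)):
    print(example["question"])
```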