From b507cd222be51e600b93bf980218cdd062cf15cd Mon Sep 17 00:00:00 2001
From: Kenneth Choe
Date: Sat, 13 Apr 2024 17:54:33 -0500
Subject: [PATCH] docs: changed the link to a more helpful source (#20411)

docs: changed a link to a better source

[Previous link](https://www.philschmid.de/custom-inference-huggingface-sagemaker) is about how to upload an embeddings model. [New link](https://huggingface.co/blog/kchoe/deploy-any-huggingface-model-to-sagemaker) is about how to upload a cross encoder model, which directly addresses what is needed here.

For full disclosure, I wrote this article, and the sample `inference.py` comes from it.

Co-authored-by: Kenny Choe
---
 .../document_transformers/cross_encoder_reranker.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/integrations/document_transformers/cross_encoder_reranker.ipynb b/docs/docs/integrations/document_transformers/cross_encoder_reranker.ipynb
index fb5e52bfb0..fd06ad72b8 100644
--- a/docs/docs/integrations/document_transformers/cross_encoder_reranker.ipynb
+++ b/docs/docs/integrations/document_transformers/cross_encoder_reranker.ipynb
@@ -175,7 +175,7 @@
    "source": [
     "## Uploading Hugging Face model to SageMaker endpoint\n",
     "\n",
-    "Refer to [this article](https://www.philschmid.de/custom-inference-huggingface-sagemaker) for general guideline. Here is a simple `inference.py` for creating an endpoint that works with `SagemakerEndpointCrossEncoder`.\n",
+    "Here is a sample `inference.py` for creating an endpoint that works with `SagemakerEndpointCrossEncoder`. For step-by-step guidance, refer to [this article](https://huggingface.co/blog/kchoe/deploy-any-huggingface-model-to-sagemaker).\n",
     "\n",
     "It downloads Hugging Face model on the fly, so you do not need to keep the model artifacts such as `pytorch_model.bin` in your `model.tar.gz`."
    ]
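
For context only, here is a minimal sketch of what such an `inference.py` could look like. It is not the script from the linked article or from the notebook; it assumes the SageMaker Hugging Face inference toolkit's `model_fn`/`input_fn`/`predict_fn`/`output_fn` overrides, a `sentence-transformers` `CrossEncoder`, a placeholder model id (`BAAI/bge-reranker-base`), and a hypothetical JSON contract with `text_pairs` in and `scores` out, which may not match what `SagemakerEndpointCrossEncoder`'s content handler actually sends and expects.

```python
# inference.py -- illustrative sketch only, not the sample from the article.
import json

from sentence_transformers import CrossEncoder

# Placeholder model id; the weights are pulled from the Hugging Face Hub at
# container startup, so model.tar.gz does not need to ship pytorch_model.bin.
HF_MODEL_ID = "BAAI/bge-reranker-base"


def model_fn(model_dir):
    # model_dir is ignored because the cross encoder is downloaded on the fly.
    return CrossEncoder(HF_MODEL_ID)


def input_fn(request_body, content_type="application/json"):
    # Assumed request shape: {"text_pairs": [[query, passage], ...]}
    return json.loads(request_body)["text_pairs"]


def predict_fn(text_pairs, model):
    # CrossEncoder.predict returns a numpy array of relevance scores.
    return {"scores": model.predict(text_pairs).tolist()}


def output_fn(prediction, accept="application/json"):
    # Assumed response shape: {"scores": [...]}
    return json.dumps(prediction)
```

If following the usual Hugging Face DLC convention, a script like this would be packaged as `code/inference.py` inside `model.tar.gz`, with a `code/requirements.txt` listing `sentence-transformers`.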