diff --git a/docs/additional_resources/deploy_llms.rst b/docs/additional_resources/deploy_llms.rst
index 1c342ee6..e2eb3d3f 100644
--- a/docs/additional_resources/deploy_llms.rst
+++ b/docs/additional_resources/deploy_llms.rst
@@ -24,9 +24,9 @@ This guide aims to provide a comprehensive overview of the requirements for depl
 
 Understanding these components is crucial when assessing serving systems. LangChain integrates with several open-source projects designed to tackle these issues, providing a robust framework for productionizing your LLM applications. Some notable frameworks include:
 
-- `Ray Serve <../../../ecosystem/ray_serve.html>`_
+- `Ray Serve <../integrations/ray_serve.html>`_
 - `BentoML `_
-- `Modal <../../../ecosystem/modal.html>`_
+- `Modal <../integrations/modal.html>`_
 
 These links will provide further information on each ecosystem, assisting you in finding the best fit for your LLM deployment needs.
diff --git a/docs/ecosystem/ray_serve.ipynb b/docs/integrations/ray_serve.ipynb
similarity index 100%
rename from docs/ecosystem/ray_serve.ipynb
rename to docs/integrations/ray_serve.ipynb