From 8e44363ec9255fbbe8c26e2eb4a514f788008c93 Mon Sep 17 00:00:00 2001
From: Abhinav <60320192+blacksmithop@users.noreply.github.com>
Date: Mon, 29 Jan 2024 22:11:29 +0530
Subject: [PATCH] langchain_community: Update documentation for installing
 llama-cpp-python on windows (#16666)

**Description**: This PR updates the documentation for installing
llama-cpp-python on Windows.

- Updates the install command to support `pyproject.toml`
- Makes the CPU/GPU install instructions clearer
- Adds a reinstall-with-GPU-support command

**Issue**: The existing
[documentation](https://python.langchain.com/docs/integrations/llms/llamacpp#compiling-and-installing)
lists the following commands for installing llama-cpp-python:

```
python setup.py clean
python setup.py install
```

The current version of the repo does not include a `setup.py` and uses a
`pyproject.toml` instead. The commands can be replaced with

```
python -m pip install -e .
```

as explained in
https://github.com/abetlen/llama-cpp-python/issues/965#issuecomment-1837268339

**Dependencies**: None

**Twitter handle**: None

---------

Co-authored-by: blacksmithop
---
 docs/docs/integrations/llms/llamacpp.ipynb | 26 +++++++++++++++++-----
 1 file changed, 21 insertions(+), 5 deletions(-)

diff --git a/docs/docs/integrations/llms/llamacpp.ipynb b/docs/docs/integrations/llms/llamacpp.ipynb
index 58bb7f38d8..a3a22acb7b 100644
--- a/docs/docs/integrations/llms/llamacpp.ipynb
+++ b/docs/docs/integrations/llms/llamacpp.ipynb
@@ -144,24 +144,40 @@
     "git clone --recursive -j8 https://github.com/abetlen/llama-cpp-python.git\n",
     "```\n",
     "\n",
-    "2. Open up command Prompt (or anaconda prompt if you have it installed), set up environment variables to install. Follow this if you do not have a GPU, you must set both of the following variables.\n",
+    "2. Open up a Command Prompt and set the following environment variables:\n",
+    "\n",
     "\n",
     "```\n",
     "set FORCE_CMAKE=1\n",
     "set CMAKE_ARGS=-DLLAMA_CUBLAS=OFF\n",
     "```\n",
-    "You can ignore the second environment variable if you have an NVIDIA GPU.\n",
+    "If you have an NVIDIA GPU, make sure `LLAMA_CUBLAS` is set to `ON`.\n",
     "\n",
     "#### Compiling and installing\n",
     "\n",
-    "In the same command prompt (anaconda prompt) you set the variables, you can `cd` into `llama-cpp-python` directory and run the following commands.\n",
+    "Now you can `cd` into the `llama-cpp-python` directory and install the package:\n",
     "\n",
     "```\n",
-    "python setup.py clean\n",
-    "python setup.py install\n",
+    "python -m pip install -e .\n",
     "```"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "**IMPORTANT**: If you have already installed a CPU-only version of the package, you need to reinstall it from scratch. Consider the following command:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "!python -m pip install -e . --force-reinstall --no-cache-dir"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},