Add colab-related changes (#80)

Add some changes to make working in Colab more comfortable.

Co-authored-by: Alexander Borzunov <hxrussia@gmail.com>
pull/85/head
Artem Chumachenko 2 years ago committed by GitHub
parent 87fd6a4f08
commit 2cb82dd648

@ -13,7 +13,9 @@
"\n",
"In this example, we show how to use [prompt tuning](https://aclanthology.org/2021.emnlp-main.243.pdf) to adapt a test 6B version of the [BLOOM](https://huggingface.co/bigscience/bloom) model for a specific downstream task. We will run this model in a decentralized fashion using [Petals](https://github.com/bigscience-workshop/petals). Petals servers will maintain the BLOOM blocks (they are kept unchanged during adaptation), and the gradient descent will learn a few prefix tokens stored on a Petals client.\n",
"\n",
"We will adapt the BLOOM model for the chatbot task using the [Personachat](https://huggingface.co/datasets/bavard/personachat_truecased) dataset. For a given dialogue context, the model has to provide a relevant answer."
"We will adapt the BLOOM model for the chatbot task using the [Personachat](https://huggingface.co/datasets/bavard/personachat_truecased) dataset. For a given dialogue context, the model has to provide a relevant answer.\n",
"\n",
"To open this notebook in Colab: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bigscience-workshop/petals/blob/main/examples/prompt-tuning-personachat.ipynb)"
]
},
{
@ -24,6 +26,31 @@
"First, we have to prepare all dependencies."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "73bbc648",
"metadata": {},
"outputs": [],
"source": [
"# This block is only needed for Colab users. It will change nothing if you are running this notebook locally.\n",
"import subprocess\n",
"import sys\n",
"\n",
"\n",
"IN_COLAB = 'google.colab' in sys.modules\n",
"\n",
"if IN_COLAB:\n",
" subprocess.run(['git', 'clone', 'https://github.com/bigscience-workshop/petals'])\n",
" subprocess.run(['pip', 'install', '-r', 'petals/requirements.txt'])\n",
" subprocess.run(['pip', 'install', 'datasets'])\n",
"\n",
" try:\n",
" subprocess.check_output([\"nvidia-smi\", \"-L\"])\n",
" except (subprocess.CalledProcessError, FileNotFoundError):\n",
" subprocess.run(['rm', '-r', '/usr/local/cuda/lib64'])"
]
},
{
"cell_type": "code",
"execution_count": null,
@ -33,7 +60,7 @@
"source": [
"import os\n",
"import sys\n",
"sys.path.insert(0, \"..\")\n",
"sys.path.insert(0, \"..\") # for Colab, change to sys.path.insert(0, './petals/')\n",
" \n",
"import torch\n",
"import transformers\n",
@ -285,7 +312,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "Python 3.8.10 64-bit",
"language": "python",
"name": "python3"
},
@ -299,7 +326,12 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.12"
"version": "3.8.9"
},
"vscode": {
"interpreter": {
"hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
}
}
},
"nbformat": 4,

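The setup cell added in this commit follows a common pattern: detect Colab via `sys.modules`, then probe for a GPU with `nvidia-smi`. A minimal standalone sketch of that pattern (the helper names `in_colab` and `has_gpu` are illustrative, not part of the commit):

```python
import subprocess
import sys


def in_colab() -> bool:
    # Colab injects the google.colab module into every runtime,
    # so its presence in sys.modules identifies the environment.
    return 'google.colab' in sys.modules


def has_gpu() -> bool:
    # `nvidia-smi -L` lists attached GPUs; it exits non-zero when
    # the driver finds none, and is absent on CPU-only machines.
    try:
        subprocess.check_output(["nvidia-smi", "-L"])
        return True
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False


if __name__ == "__main__":
    print(f"Colab: {in_colab()}, GPU: {has_gpu()}")
```

Guarding the environment-specific steps (cloning the repo, removing `/usr/local/cuda/lib64` on CPU-only runtimes) behind these checks keeps the notebook runnable both locally and on Colab without edits.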