# Reproducibility
You can find trained LoRA model weights at:
- vicuna-lora-multi-turn https://huggingface.co/nomic-ai/vicuna-lora-multi-turn
- gpt4all-lora https://huggingface.co/nomic-ai/gpt4all-lora
We are not distributing the LLaMA 7B checkpoint that these weights need to be used in association with.
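As a rough sketch of how the pieces fit together, the LoRA adapter can be applied on top of a separately obtained LLaMA 7B checkpoint with Hugging Face `transformers` and `peft`. The base-model path below is a placeholder (the checkpoint is not distributed here), and the exact class names may vary across library versions:

```python
# Sketch: attaching the gpt4all LoRA adapter to a base LLaMA 7B checkpoint.
# Assumes `transformers` and `peft` are installed; BASE_MODEL_PATH is a
# placeholder for a LLaMA 7B checkpoint you have obtained yourself.

BASE_MODEL_PATH = "path/to/llama-7b"   # placeholder: not distributed here
LORA_REPO = "nomic-ai/gpt4all-lora"    # adapter weights from the list above


def load_gpt4all(base_path: str = BASE_MODEL_PATH, lora_repo: str = LORA_REPO):
    """Load the base LLaMA model and apply the LoRA adapter on top of it."""
    from transformers import LlamaForCausalLM, LlamaTokenizer
    from peft import PeftModel

    tokenizer = LlamaTokenizer.from_pretrained(base_path)
    model = LlamaForCausalLM.from_pretrained(base_path)
    # PeftModel.from_pretrained downloads and attaches the LoRA weights
    model = PeftModel.from_pretrained(model, lora_repo)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_gpt4all()
```

The adapter alone is only a few hundred megabytes; the heavy weights live in the base checkpoint, which is why the two are distributed separately.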