diff --git a/README.md b/README.md
index 351a8de9..d33c9c96 100644
--- a/README.md
+++ b/README.md
@@ -99,7 +99,7 @@ Trained LoRa Weights:
 Raw Data:
 - [Training Data Without P3](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations)
   - Explorer: https://atlas.nomic.ai/map/gpt4all_data_clean_without_p3
-- [Full Dataset with P3](https://s3.amazonaws.com/static.nomic.ai/gpt4all/2022_03_27/gpt4all_curated_data_full_2022_03_27.tar.gz)
+- [Full Dataset with P3](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations_with_p3)
   - Explorer: https://atlas.nomic.ai/map/gpt4all_data_clean
 
 We are not distributing a LLaMa 7B checkpoint.
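
Since the full-with-P3 dataset now lives on the Hugging Face Hub rather than as an S3 tarball, it can be pulled with the `datasets` library. A minimal sketch, assuming `datasets` is installed and that the dataset exposes a `train` split (the split name is an assumption, not taken from the README):

```python
from datasets import load_dataset

# Dataset id comes from the updated README link; "train" split is assumed.
ds = load_dataset("nomic-ai/gpt4all_prompt_generations_with_p3", split="train")
print(ds)
```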