gpt4all/configs/generate/generate_llama.yaml

# model/tokenizer
model_name: "zpn/llama-7b"
tokenizer_name: "zpn/llama-7b"
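
# generation settings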
max_new_tokens: 512
temperature: 0
prompt: |
  #this code prints a string reversed
  my_string = "hello how are you"
  print(len(my_string))
  My code above does not work. Can you help me?
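
A minimal sketch of how a config like this could drive generation, assuming PyYAML and Hugging Face transformers; the file path, variable names, and generation call below are illustrative assumptions, not the project's actual generate script:

```python
# Sketch only: load the YAML config and run greedy generation with transformers.
import yaml
from transformers import AutoModelForCausalLM, AutoTokenizer

# Path is an assumption; adjust to wherever the config lives.
with open("configs/generate/generate_llama.yaml") as f:
    config = yaml.safe_load(f)

tokenizer = AutoTokenizer.from_pretrained(config["tokenizer_name"])
model = AutoModelForCausalLM.from_pretrained(config["model_name"])

inputs = tokenizer(config["prompt"], return_tensors="pt")

# temperature 0 in the config maps to greedy decoding (no sampling).
do_sample = config["temperature"] > 0
gen_kwargs = {"max_new_tokens": config["max_new_tokens"], "do_sample": do_sample}
if do_sample:
    gen_kwargs["temperature"] = config["temperature"]

outputs = model.generate(**inputs, **gen_kwargs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```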