gpt4all/configs/generate/generate_llama.yaml

# model/tokenizer
model_name: # REPLACE WITH LLAMA MODEL NAME
tokenizer_name: # REPLACE WITH LLAMA MODEL NAME
max_new_tokens: 512 # maximum number of tokens to generate per prompt
temperature: 0.001 # near-zero temperature for close-to-deterministic output
prompt: |
  #this code prints a string reversed
  my_string = "hello how are you"
  print(len(my_string))
  My code above does not work. Can you help me?
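
For context, below is a minimal sketch of how a config like this could be consumed with PyYAML and Hugging Face transformers. It is an assumption for illustration, not the repo's actual generate script: the file path, the do_sample flag, and the use of AutoModelForCausalLM/AutoTokenizer are all choices made here, and it presumes model_name and tokenizer_name have been filled in.

# Hypothetical loader sketch -- not gpt4all's actual generation code.
import yaml
from transformers import AutoModelForCausalLM, AutoTokenizer

# Path is an assumption; adjust to wherever this config file lives.
with open("configs/generate/generate_llama.yaml") as f:
    config = yaml.safe_load(f)

# Assumes the REPLACE placeholders have been filled with real model/tokenizer names.
tokenizer = AutoTokenizer.from_pretrained(config["tokenizer_name"])
model = AutoModelForCausalLM.from_pretrained(config["model_name"])

inputs = tokenizer(config["prompt"], return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=config["max_new_tokens"],
    temperature=config["temperature"],
    do_sample=True,  # temperature only takes effect when sampling is enabled
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))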