mirror of https://github.com/nomic-ai/gpt4all
Fix typo in TRAINING_LOG.md
Conditonal -> Conditional
parent 708e0b486d
commit 5556de9152
@@ -160,7 +160,7 @@ We realized that we had two bugs however:
 
 - We accidentally duplicated data and effectively trained for 2 epochs instead of 1
 - We added an eos token to every sequence, even those that we truncated (e.g. long code that exceeds the 1024).
 
-## Conditonal EOS and 1 Epoch
+## Conditional EOS and 1 Epoch
 
 Using the same parameters, we then trained a model using a "conditional" eos token where we only add an `eos` when the inputs are less than the maximum sequence length for one epoch.
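For illustration, here is a minimal sketch of what "conditional EOS" tokenization can look like. This is not the repository's actual training code; the Hugging Face tokenizer, the `tokenize_with_conditional_eos` helper, and the 1024 maximum length are assumptions used only to make the idea concrete: append `eos` only when the sequence was not truncated at the maximum length.

```python
# Illustrative sketch only -- names and the 1024 max length are assumptions,
# not gpt4all's actual training code.
from transformers import AutoTokenizer

MAX_LENGTH = 1024
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def tokenize_with_conditional_eos(text: str) -> list[int]:
    # Tokenize and truncate to the maximum sequence length.
    ids = tokenizer(text, truncation=True, max_length=MAX_LENGTH)["input_ids"]
    # Only append the eos token when the input is shorter than the maximum
    # sequence length, i.e. it was not truncated.
    if len(ids) < MAX_LENGTH:
        ids.append(tokenizer.eos_token_id)
    return ids

print(len(tokenize_with_conditional_eos("a short example sequence")))
```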