Mirror of https://github.com/nomic-ai/gpt4all, synced 2024-11-06 09:20:33 +00:00
Fix typo in TRAINING_LOG.md
Conditonal -> Conditional
This commit is contained in: parent f7d4b4bde7, commit 8ac7c1a9fe
@@ -160,7 +160,7 @@ We realized that we had two bugs however:
 - We accidentally duplicated data and effectively trained for 2 epochs instead of 1
 - We added an eos token to every sequence, even those that we truncated (e.g. long code that exceeds the 1024).

-## Conditonal EOS and 1 Epoch
+## Conditional EOS and 1 Epoch

 Using the same parameters, we then trained a model using a "conditional" eos token where we only add an `eos` when the inputs are less than the maximum sequence length for one epoch.

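For reference, the "conditional" eos behavior described in the changed paragraph can be sketched as below. This is an illustrative example only, assuming a Hugging Face tokenizer and a 1024-token maximum sequence length; the function name and model identifier are placeholders, not taken from the gpt4all codebase.

```python
# Illustrative sketch of conditional EOS handling (not from the gpt4all codebase).
# Assumes a Hugging Face tokenizer and a 1024-token maximum sequence length.
from transformers import AutoTokenizer

MAX_LENGTH = 1024
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")  # placeholder model


def tokenize_with_conditional_eos(text: str) -> list[int]:
    # Tokenize without special tokens, truncating to the maximum sequence length.
    ids = tokenizer(
        text,
        truncation=True,
        max_length=MAX_LENGTH,
        add_special_tokens=False,
    )["input_ids"]
    # Only append eos when the tokenized input is shorter than the maximum
    # sequence length, i.e. when the example was not truncated.
    if len(ids) < MAX_LENGTH:
        ids.append(tokenizer.eos_token_id)
    return ids
```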