mirror of
https://github.com/nomic-ai/gpt4all
synced 2024-11-02 09:40:42 +00:00
Merge pull request #96 from eltociear/patch-1
Fix typo in TRAINING_LOG.md
Commit 8e7ce1f7c7
@@ -160,7 +160,7 @@ We realized that we had two bugs however:
 
 - We accidentally duplicated data and effectively trained for 2 epochs instead of 1
 - We added an eos token to every sequence, even those that we truncated (e.g. long code that exceeds the 1024).
 
-## Conditonal EOS and 1 Epoch
+## Conditional EOS and 1 Epoch
 
 Using the same parameters, we then trained a model using a "conditional" eos token where we only add an `eos` when the inputs are less than the maximum sequence length for one epoch.
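The "conditional" eos logic described in the training log can be sketched roughly as follows: append an eos token only when the tokenized input is shorter than the maximum sequence length, so that truncated sequences (e.g. long code exceeding 1024 tokens) do not receive a spurious end-of-text marker. This is a minimal illustrative sketch; the names `add_conditional_eos`, `MAX_LENGTH`, and `EOS_TOKEN_ID` are assumptions, not identifiers from the gpt4all codebase.

```python
MAX_LENGTH = 1024     # maximum sequence length from the training log
EOS_TOKEN_ID = 0      # placeholder id; the real tokenizer defines its own

def add_conditional_eos(token_ids, max_length=MAX_LENGTH, eos_id=EOS_TOKEN_ID):
    """Truncate to max_length; append eos only if the sequence fits.

    Sequences at or beyond max_length are truncated and get no eos,
    since they do not represent a genuine end of text.
    """
    if len(token_ids) >= max_length:
        # sequence was (or would be) truncated: no eos token
        return token_ids[:max_length]
    return token_ids + [eos_id]
```

For example, a 3-token input with `max_length=5` gets an eos appended, while a 10-token input is cut to 5 tokens with no eos.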