| Author | Commit | Message | Date |
|---|---|---|---|
| Zach Nussbaum | 15f7c5b68f | chore: peft | 2023-04-12 03:50:54 +00:00 |
| Zach | 311c818934 | feat: evals on new gptj models | 2023-04-10 02:14:20 +00:00 |
| Zach Nussbaum | 7807a80bbb | fix: bs try one more time? | 2023-04-08 21:47:07 +00:00 |
| Zach Nussbaum | 2f0eba211d | fix: smaller bs for 40gb | 2023-04-08 21:36:20 +00:00 |
| Zach Nussbaum | c82ee7d882 | fix: add wd + min lr to config | 2023-04-08 20:37:51 +00:00 |
| Zach Nussbaum | b66f127ade | fix: config + ignore pkl | 2023-04-08 20:33:02 +00:00 |
| zanussbaum | 147c2fd7eb | feat: lora gptj | 2023-04-07 17:53:07 -04:00 |
| zanussbaum | 2b001e8932 | fix: batch size | 2023-04-07 17:41:45 -04:00 |
| zanussbaum | 7cfda6a21f | feat: update for mosaic | 2023-04-07 16:54:29 -04:00 |
| Zach | 5baead45be | fix: configs | 2023-04-05 20:42:35 +00:00 |
| Zach Nussbaum | 0a3834d086 | fix: gptj multinode | 2023-04-05 02:52:44 +00:00 |
| Zach Nussbaum | 5c5f41ba36 | fix: clean up data, pad at end | 2023-04-04 20:53:23 +00:00 |
| Zach Nussbaum | 07798cfbd2 | Update finetune_lora.yaml | 2023-03-28 20:58:33 -07:00 |
| Zach Nussbaum | fdb15fc8f1 | Update finetune.yaml | 2023-03-28 20:58:03 -07:00 |
| Zach Nussbaum | ed08006053 | feat: lora config | 2023-03-27 17:33:13 +00:00 |
| Zach Nussbaum | 80227dc164 | feat: full model config | 2023-03-27 17:30:57 +00:00 |
| Zach Nussbaum | 2daecd6066 | feat: update embeddings | 2023-03-26 17:45:21 +00:00 |
| Zach Nussbaum | 723a50bdf1 | feat: train and clean data | 2023-03-25 16:17:48 +00:00 |