Commit History
add utils.data.prepare_dataset
2e22404
use context manager to run things on rank0 before others (#397)
fc2d6be
Feat(config): add max steps (#387)
3c2ad00
save tokenizer before training starts (#380)
86a91e2
simplify `load_tokenizer`
efb3b2c
improve GPU logging to break out pytorch cache and system mem
7b55fe6
extract module for working with cfg
8cec513
Attention mask and position id fixes for packing (#285)
2bb0b78
Fix(save): Save as safetensors (#363)
a276c9c
feat(merge): save tokenizer on merge (#362)
289d5c4
Merge pull request #356 from tmm1/load_model-args
11ddccb
simplify load_model signature
7181022
log GPU memory usage
e303d64
fix FSDP save of final model (#329)
894cba0
add runpod envs to .bashrc, fix bnb env (#316)
cf62cfd
misc fixes
d75adb9
Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var
b1f4f7a
Adding logging enhancement
553a86b
Merge pull request #92 from OpenAccess-AI-Collective/flash-optimum
16bb627
chore: Refactor inf_kwargs out
dc77c8e
Merge branch 'main' into flash-optimum
fd2c981
Merge pull request #177 from NanoCode012/fix/landmark-patch
8002ffb
Merge pull request #159 from AngainorDev/patch-1
8e568bb
Fix strict and Lint
b565ecf
Fix set mem_id for inference and refactor
974dc00
Set mem cache args on inference
572d114
fix formatting
958da70
pass a prompt in from stdin for inference
c4e4f81
address PR feedback
0c6f928
add streaming dataset support for pretraining datasets
eea2731
more tweaks to do pre-training with bettertransformers
1210dc8
experimental expansion of ctx len
488a67d
add flash attn context for efficient training and attempt setting model to train mode
8792199
add support for optimum bettertransformers
1edc30c
Merge branch 'main' into patch-1
79e2a6f
Remove explicit definition of cfg.inference
c250898
formatting for linter
f36e227
Add streaming inference & fix stopping at EOS
fec6bcc
Feed cfg.inference
bd3b537