
Context length?

#4
by vishaal27 - opened

Hi, great work, and great paper — really enjoyed reading it!

I couldn't find the context length you trained your models at. From the InternVL2.5 repo, it seems they used a context length of 16384 for training; do you use the same? Thanks!
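For what it's worth, one place to look is the `max_position_embeddings` field in the repo's `config.json` (when the authors haven't overridden it at training time, it usually reflects the configured context window). A minimal sketch, assuming a config in the usual Hugging Face format — the values below are illustrative, not taken from this model:

```python
import json

# Hypothetical config.json excerpt; real values come from the model repo.
config_json = '{"model_type": "internvl_chat", "max_position_embeddings": 16384}'

config = json.loads(config_json)
context_length = config.get("max_position_embeddings")
print(f"configured context window: {context_length}")
```

Note this only shows the *configured* maximum; the sequence length actually used during training can be shorter, which is why confirmation from the authors is still useful.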
