Context Length?
#1
opened by [deleted]
It's 32k according to config.json.
Hi, the context length we used to train the v2.4 models is 8k. You can refer here for more details about all our models.
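The mismatch comes from `config.json` advertising the model's maximum position embeddings, which can exceed the context length actually used during training. A minimal sketch of reading that field (the inline JSON stands in for a real downloaded `config.json`; the value shown is assumed from this thread):

```python
import json

# Stand-in for a model's config.json; max_position_embeddings is the
# field commonly read as the "context length" (32k here), even though
# the training context may have been shorter (8k per the reply above).
config = json.loads('{"max_position_embeddings": 32768}')
print(config["max_position_embeddings"])  # 32768
```

In practice the same field is exposed as `config.max_position_embeddings` when loading a checkpoint with `transformers.AutoConfig`.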
[deleted] changed discussion status to closed