Interface: TextGenerationStreamPrefillToken
Properties
id
• id: number
Token ID from the model tokenizer
Defined in
inference/src/tasks/nlp/textGenerationStream.ts:21
logprob
• Optional logprob: number
Logprob. Optional since the logprob of the first token cannot be computed.
Defined in
inference/src/tasks/nlp/textGenerationStream.ts:28
text
• text: string
Token text
Defined in
inference/src/tasks/nlp/textGenerationStream.ts:23
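For reference, a minimal TypeScript sketch of the documented shape, together with a hypothetical helper that formats prefill tokens for logging. The helper name and its formatting are illustrative only and are not part of the library.

```typescript
// Shape documented above (mirrors inference/src/tasks/nlp/textGenerationStream.ts).
interface TextGenerationStreamPrefillToken {
  /** Token ID from the model tokenizer */
  id: number;
  /** Token text */
  text: string;
  /**
   * Logprob
   * Optional since the logprob of the first token cannot be computed
   */
  logprob?: number;
}

// Hypothetical helper: render prefill tokens as one `"text" (id=…, logprob=…)` line each.
function formatPrefill(tokens: TextGenerationStreamPrefillToken[]): string {
  return tokens
    .map((t) => `${JSON.stringify(t.text)} (id=${t.id}, logprob=${t.logprob ?? "n/a"})`)
    .join("\n");
}
```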