seen_tokens and get_max_length deprecated
#8 opened by singleffabric
I get this deprecation warning, followed by an error:

```
The `seen_tokens` attribute is deprecated and will be removed in v4.41. Use the `cache_position` model input instead.
AttributeError: 'DynamicCache' object has no attribute 'get_max_length'
```
Switching to the current Cache API fixes both:

```python
# get_seq_length() is the cache-side replacement for seen_tokens
# (cache_position is a model input, not an attribute of the cache object).
past_length = past_key_values.get_seq_length()
max_cache_length = past_key_values.get_max_cache_shape()
```
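As a quick sanity check (a minimal sketch, assuming a transformers version where get_max_cache_shape() exists, e.g. the v4.49 mentioned below), these calls run fine on a fresh DynamicCache:

```python
from transformers.cache_utils import DynamicCache

cache = DynamicCache()
print(cache.get_seq_length())       # 0: nothing cached yet
print(cache.get_max_cache_shape())  # None: DynamicCache grows without a fixed bound
```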
In my case (transformers v4.49), this one worked:

```python
past_length = past_key_values.seen_tokens
max_cache_length = past_key_values.get_max_cache_shape()
```
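Since the available methods differ across transformers versions, a small feature-probe helper can cover both cases. This is only a sketch; the name cache_lengths is hypothetical, not a transformers API:

```python
from transformers.cache_utils import DynamicCache

def cache_lengths(past_key_values):
    """Hypothetical helper: return (past_length, max_cache_length)
    across transformers versions."""
    # get_seq_length() has been part of the Cache API since it was
    # introduced and replaces the deprecated seen_tokens attribute.
    past_length = past_key_values.get_seq_length()
    # get_max_cache_shape() replaced get_max_length(), whose removal
    # causes the AttributeError above; probe for whichever exists.
    if hasattr(past_key_values, "get_max_cache_shape"):
        max_cache_length = past_key_values.get_max_cache_shape()
    else:
        max_cache_length = past_key_values.get_max_length()
    return past_length, max_cache_length

# Usage: on an empty DynamicCache this prints "0 None".
past_length, max_cache_length = cache_lengths(DynamicCache())
print(past_length, max_cache_length)
```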