seen_tokens and get_max_length deprecated

#8
  1. The seen_tokens attribute is deprecated and will be removed in v4.41. Use the cache_position model input instead.

  2. AttributeError: 'DynamicCache' object has no attribute 'get_max_length'

    past_length = past_key_values.get_seq_length()            # replaces the deprecated .seen_tokens
    max_cache_length = past_key_values.get_max_cache_shape()  # replaces the removed .get_max_length()

(Note that cache_position is a model input rather than a cache attribute, so past_key_values.cache_position does not exist; get_seq_length() is the cache-side equivalent.)
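For illustration, the replacement API can be exercised on a standalone DynamicCache (a minimal sketch; the exact return value of get_max_cache_shape varies slightly between releases):

    from transformers import DynamicCache

    cache = DynamicCache()
    past_length = cache.get_seq_length()            # 0 for an empty cache
    max_cache_length = cache.get_max_cache_shape()  # None (or -1 in newer releases) for an unbounded dynamic cache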

In my case (transformers v4.49) this one worked:

    past_length = past_key_values.seen_tokens
    max_cache_length = past_key_values.get_max_cache_shape()
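If the same code has to run across several transformers releases, a small compatibility shim can paper over the API change. A minimal sketch (the helper name cache_lengths is mine; it assumes past_key_values is a Cache/DynamicCache instance):

    def cache_lengths(past_key_values):
        # get_seq_length() superseded the deprecated seen_tokens attribute
        if hasattr(past_key_values, "get_seq_length"):
            past_length = past_key_values.get_seq_length()
        else:
            past_length = past_key_values.seen_tokens
        # get_max_cache_shape() superseded get_max_length()
        if hasattr(past_key_values, "get_max_cache_shape"):
            max_cache_length = past_key_values.get_max_cache_shape()
        else:
            max_cache_length = past_key_values.get_max_length()
        return past_length, max_cache_length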