Commit History

Update README.md
80c0ba9

gugarosa committed on

Disables inference API to prevent mismatch with HF implementation.
a286f5c

gugarosa committed on

fix(modeling_phi): Fixes initial generation with length larger than context length.
ca573e3

gugarosa committed on

fix(modeling_phi): Fixes cached generation when above maximum context length.
37527ba

gugarosa committed on

Fixes exceeding maximum sequence length when using generate().
5fd430c

gugarosa committed on

Delete modeling_mixformer_sequential.py
d212a78

gugarosa committed on

Delete configuration_mixformer_sequential.py
8e9ebfb

gugarosa committed on

Update to new model interface.
271c339

gugarosa committed on

Improves type hinting on configuration arguments.
92557d0

gugarosa committed on

Enables toggling fused_dense, flash_rotary and attn_pdrop in the configuration.
45f4b21

gugarosa committed on

Fixes flash-attn import with a try/except statement.
0254d42

gugarosa committed on

Adds support for flash-attn rotary embedding and fused dense layers.
0bbd68a

gugarosa committed on

Adds support for MQA/GQA and attention mask during training.
de35f90

gugarosa committed on

Update modeling_mixformer_sequential.py
d38e6f9

gugarosa committed on

Adds _set_gradient_checkpointing for compatibility (#22)
8091327

gugarosa and vriveras committed on

Upload modeling_mixformer_sequential.py
b6a7e2f

gugarosa committed on

Add more precise license metadata (UI will be cleaner!) (#35)
8ab0f29

gugarosa and julien-c committed on

Upload README.md
bc09a08

gugarosa committed on

fix(phi-1_5): Checks length of `attention_mask` if it is passed as a direct tensor.
f9f2ac7

gugarosa committed on

Adds support for `attention_mask` in the forward pass.
3128bb6

gugarosa committed on

Upload MixFormerSequentialForCausalLM
d655135

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
e656142

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
2bfd6ef

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
ba44a90

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
1698206

suriyagunasekar committed on