---
language:
- en
license: mnpl
base_model:
- mistralai/Codestral-22B-v0.1
pipeline_tag: text-generation
tags:
- code
- code-generation
---

This repository contains AWS Inferentia2 and neuronx-compatible checkpoints for [Codestral-22B-v0.1](https://huggingface.co/mistralai/Codestral-22B-v0.1). You can find detailed information about the base model on its [Model Card](https://huggingface.co/mistralai/Codestral-22B-v0.1).
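
Assuming `optimum-neuron` is installed on a Neuron-enabled instance, loading these pre-compiled checkpoints and generating text looks roughly like the sketch below; the repository id is a placeholder, not a confirmed name.

```python
from transformers import AutoTokenizer
from optimum.neuron import NeuronModelForCausalLM

# Placeholder id: replace with the actual id of this repository.
repo_id = "<this-repository-id>"

# Loading a pre-compiled Neuron checkpoint avoids recompilation at runtime.
model = NeuronModelForCausalLM.from_pretrained(repo_id)
# Assumes the tokenizer files are shipped alongside the compiled checkpoints.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```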

This model has been exported to the Neuron format using the specific `input_shapes` and compiler parameters detailed below; a sketch of the corresponding export call follows the parameter list.

It has been compiled to run on an inf2.24xlarge instance on AWS. The inf2.24xlarge provides 12 Neuron cores, all of which are used by this compilation.

- SEQUENCE_LENGTH = 4096
- BATCH_SIZE = 4
- NUM_CORES = 12
- PRECISION = "bf16"
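
A compilation along these lines could be reproduced with `optimum-neuron`; the sketch below assumes the library and the Neuron SDK are installed, and the output directory name is illustrative rather than the one actually used for this export.

```python
from optimum.neuron import NeuronModelForCausalLM

# Compiler and input-shape settings matching the parameters listed above.
compiler_args = {"num_cores": 12, "auto_cast_type": "bf16"}
input_shapes = {"batch_size": 4, "sequence_length": 4096}

# export=True compiles the base model to the Neuron format.
model = NeuronModelForCausalLM.from_pretrained(
    "mistralai/Codestral-22B-v0.1",
    export=True,
    **compiler_args,
    **input_shapes,
)

# Illustrative output path for the compiled artifacts.
model.save_pretrained("codestral-22b-neuron")
```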