---
library_name: jax
license: gemma
pipeline_tag: text-generation
tags:
- gemma_jax
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: To access Gemma on Hugging Face, you’re required to review and agree to Google’s usage license. To do this, please ensure you’re logged in to Hugging Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
---

# Gemma Model Card

> [!IMPORTANT]
>
> This repository corresponds to the research Gemma repository in JAX. If you're looking for the Transformers JAX implementation, visit [this page](https://huggingface.co/google/gemma-2b-it).

**Model Page**: [Gemma](https://ai.google.dev/gemma/docs)

This model card corresponds to the 2B instruct version of the Gemma model for use with Flax. For more information about the model, visit https://huggingface.co/google/gemma-2b-it.

**Resources and Technical Documentation**:

* [Responsible Generative AI Toolkit](https://ai.google.dev/responsible)
* [Gemma on Kaggle](https://www.kaggle.com/models/google/gemma)
* [Gemma on Vertex Model Garden](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/335?version=gemma-2b-gg-hf)

**Terms of Use**: [Terms](https://www.kaggle.com/models/google/gemma/license/consent/verify/huggingface?returnModelRepoId=google/gemma-2b-it-flax)

**Authors**: Google

## Loading the model

To download the weights and tokenizer, run:

```python
from huggingface_hub import snapshot_download

# Download the Flax checkpoint and tokenizer; snapshot_download returns the local path.
local_dir = snapshot_download(repo_id="google/gemma-2b-it-flax")
```

Then download [this script](https://github.com/google-deepmind/gemma/blob/main/examples/sampling.py) from the [gemma GitHub repository](https://github.com/google-deepmind/gemma) and call `python sampling.py` with the `--path_checkpoint` and `--path_tokenizer` arguments pointing to your local download path.
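
For reference, below is a minimal sketch of that end-to-end flow. The checkpoint directory name (`2b-it`) and tokenizer file name (`tokenizer.model`) are assumptions about the snapshot layout; inspect the contents of `local_dir` and adjust the paths before running.

```python
# Minimal sketch: download the snapshot, then invoke the sampling script.
# Assumes sampling.py has been saved to the current directory, and that the
# snapshot contains a "2b-it" checkpoint directory and a "tokenizer.model"
# file; adjust these paths to match the actual snapshot layout.
import subprocess
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="google/gemma-2b-it-flax")

subprocess.run(
    [
        "python",
        "sampling.py",
        f"--path_checkpoint={local_dir}/2b-it",           # assumed checkpoint subdirectory
        f"--path_tokenizer={local_dir}/tokenizer.model",  # assumed tokenizer file
    ],
    check=True,
)
```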