---
title: AWS Sagemaker
---
To use Open Interpreter with a model from AWS Sagemaker, set the `model` flag:
<CodeGroup>
```bash Terminal
interpreter --model sagemaker/<model-name>
```
```python Python
# Sagemaker requires boto3 to be installed on your machine:
# pip install boto3

from interpreter import interpreter

interpreter.llm.model = "sagemaker/<model-name>"
interpreter.chat()
```
</CodeGroup>
# Supported Models
We support the following completion models from AWS Sagemaker:
- Meta Llama 2 7B
- Meta Llama 2 7B (Chat/Fine-tuned)
- Meta Llama 2 13B
- Meta Llama 2 13B (Chat/Fine-tuned)
- Meta Llama 2 70B
- Meta Llama 2 70B (Chat/Fine-tuned)
- Your Custom Huggingface Model
<CodeGroup>
```bash Terminal
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b-f
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b-f
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b
interpreter --model sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b-f
interpreter --model sagemaker/<your-huggingface-deployment-name>
```
```python Python
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b-f"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-13b-f"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b"
interpreter.llm.model = "sagemaker/jumpstart-dft-meta-textgeneration-llama-2-70b-b-f"
interpreter.llm.model = "sagemaker/<your-hugginface-deployment-name>"
```
</CodeGroup>
# Required Environment Variables
Set the following environment variables [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a) to use these models.
| Environment Variable | Description | Where to Find |
| ----------------------- | ----------------------------------------------- | ----------------------------------------------------------------------------------- |
| `AWS_ACCESS_KEY_ID` | The API access key for your AWS account. | [AWS Account Overview -> Security Credentials](https://console.aws.amazon.com/) |
| `AWS_SECRET_ACCESS_KEY` | The API secret access key for your AWS account. | [AWS Account Overview -> Security Credentials](https://console.aws.amazon.com/) |
| `AWS_REGION_NAME`       | The AWS region you want to use, e.g. `us-east-1`. | [AWS Account Overview -> Navigation bar -> Region](https://console.aws.amazon.com/) |
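For example, you could export these variables in your terminal before launching Open Interpreter, or set them from Python before calling `interpreter.chat()`. The values below are placeholders, not real credentials:
<CodeGroup>
```bash Terminal
export AWS_ACCESS_KEY_ID=<your-access-key-id>
export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
export AWS_REGION_NAME=us-east-1
```
```python Python
import os

# Placeholder values: replace with your own AWS credentials and region
os.environ["AWS_ACCESS_KEY_ID"] = "<your-access-key-id>"
os.environ["AWS_SECRET_ACCESS_KEY"] = "<your-secret-access-key>"
os.environ["AWS_REGION_NAME"] = "us-east-1"
```
</CodeGroup>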