Managing large-scale AI models like DeepSeek-R1-Distill-Llama-70B requires efficient cloud-based workflows. This guide walks you through deploying the model to AWS Bedrock, covering EC2 setup, Git LFS handling, S3 upload, and final import into AWS Bedrock.

1. Create an EC2 Instance
Since the model size is around 260GB, it is best to use an AWS EC2 instance for cloning and uploading the model instead of your local machine.
- Launch an EC2 instance; a general-purpose instance type is sufficient, since the workload is network- and disk-bound.
- Provide at least 300GB of EBS storage to accommodate the full model and ensure smooth operations.
- Ensure the EC2 instance is in a region that supports AWS Bedrock Imported Models (e.g., us-east-1).
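If you prefer to script the launch instead of using the console, a minimal boto3 sketch might look like the following. The AMI ID, key pair name, and instance type are placeholder assumptions, not values from this guide, and the actual API call is left commented out:

```python
# Hypothetical sketch of launching the EC2 instance with boto3. The AMI ID,
# key pair, and instance type are placeholders; substitute your own values.
def build_launch_params(ami_id, key_name, instance_type="m5.xlarge"):
    """Build run_instances parameters with a 300GB gp3 root volume."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "MinCount": 1,
        "MaxCount": 1,
        "BlockDeviceMappings": [
            {
                "DeviceName": "/dev/sda1",  # root device name varies by AMI
                "Ebs": {"VolumeSize": 300, "VolumeType": "gp3"},
            }
        ],
    }

# import boto3  # pip install boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# ec2.run_instances(**build_launch_params("ami-xxxxxxxx", "my-key-pair"))
```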
2. Install Git LFS and AWS CLI
Once your EC2 instance is up, install the necessary tools:
# Update system packages
sudo apt update && sudo apt install -y curl unzip git-lfs
# Install AWS CLI
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
# Verify installation
aws --version
Initialize Git LFS:
git lfs install
3. Clone the Repository in Stages
Instead of downloading everything at once, clone the metadata first:
git clone https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B
cd DeepSeek-R1-Distill-Llama-70B
Then, start downloading the large files in the background using nohup:
nohup git lfs pull &> clone.log &
Monitor progress with:
tail -f clone.log
4. Verify the Repository Download
Once git lfs pull completes:
- Ensure no LFS-related processes are still running:
jobs -l
If no output appears, the process has completed.
- Run these commands to confirm all LFS objects were fetched:
git lfs pull
git lfs fetch --all
git lfs checkout
- Verify the total folder size:
du -sh DeepSeek-R1-Distill-Llama-70B
Ideally, the repository size should be around 270GB.
5. Configure IAM Role for EC2 to Access S3
To allow EC2 to upload files to S3, create an IAM role with the AmazonS3FullAccess policy:
- Go to AWS IAM Console → Roles → Create Role.
- Select EC2 as the trusted entity.
- Attach the AmazonS3FullAccess policy.
- Name the role (e.g., EC2-S3-Access) and create it.
- Attach this role to your EC2 instance.
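For reference, selecting EC2 as the trusted entity generates a standard trust policy like the one below, which is what allows the instance to assume the role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Note that AmazonS3FullAccess is broad; for production use, consider a policy scoped to the specific bucket.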
6. Create an S3 Bucket
Since AWS Bedrock Imported Models requires the model to be in S3, create an S3 bucket in the same region as your EC2 instance:
aws s3 mb s3://deepseek-model-bucket --region us-east-1
7. Upload the Model to S3
Use the following command to upload the entire repo, running it in the background:
nohup aws s3 sync DeepSeek-R1-Distill-Llama-70B s3://deepseek-model-bucket &> upload.log &
(Note: aws s3 sync shows transfer progress by default; there is no --progress flag.)
Monitor the progress using:
tail -f upload.log
Uploading ~270GB can take around 30-40 minutes on a well-provisioned instance, depending on your instance type and network throughput.
8. Import Model into AWS Bedrock
Once the upload is complete:
- Go to AWS Bedrock Console → Imported Models.
- Click Import Model.
- Enter the required details.
- Select your S3 bucket and the correct folder where the model is stored.
- Click Import Model.
A new import job will start. This will take some time, depending on AWS Bedrock’s processing capacity. Once completed, your DeepSeek Distilled model will be available for use in Bedrock.
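The console steps above can also be scripted. The request shape below is a sketch of the Bedrock Custom Model Import API via boto3; the job name, model name, and role ARN are placeholders. Note that this role is a Bedrock service role with read access to the S3 bucket, separate from the EC2 role created earlier. The call itself is commented out:

```python
# Sketch of starting a Bedrock model import job programmatically.
def build_import_job(job_name, model_name, role_arn, s3_uri):
    """Build the create_model_import_job request parameters."""
    return {
        "jobName": job_name,
        "importedModelName": model_name,
        "roleArn": role_arn,  # role Bedrock assumes to read the model from S3
        "modelDataSource": {"s3DataSource": {"s3Uri": s3_uri}},
    }

job = build_import_job(
    "deepseek-import-job",                                   # placeholder
    "deepseek-r1-distill-llama-70b",                         # placeholder
    "arn:aws:iam::123456789012:role/BedrockImportRole",      # placeholder
    "s3://deepseek-model-bucket/DeepSeek-R1-Distill-Llama-70B/",
)

# import boto3  # pip install boto3
# bedrock = boto3.client("bedrock", region_name="us-east-1")
# bedrock.create_model_import_job(**job)
```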
9. Interacting with the Imported Model
Once the import finishes, the model will be listed in the Imported Models section. Click on it and select Open in Playground to test the model interactively.
Important: Ensure that the Response Length setting is set to Max in the Playground. Note also that a freshly imported model may need a short warm-up period, during which you may encounter the error:
Model is not ready for inference. Wait and try your request again. Refer to https://docs.aws.amazon.com/bedrock/latest/userguide/invoke-imported-model.html#handle-model-not-ready-exception.
If you see this error, wait briefly and retry the request.
Once the settings are adjusted, you can start using the model for inference.
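The "Model is not ready for inference" error can also be handled in code by retrying with backoff, as the linked AWS docs suggest. The retry helper below is generic so it can run without AWS; the boto3 invocation it would wrap is sketched in comments, with the model ARN and request body as placeholder assumptions:

```python
# Generic retry-with-exponential-backoff around a model invocation.
import time

def invoke_with_retry(invoke_fn, is_not_ready, max_attempts=5, base_delay=1.0):
    """Call invoke_fn(); if is_not_ready(err) is true, back off and retry."""
    for attempt in range(max_attempts):
        try:
            return invoke_fn()
        except Exception as err:
            if not is_not_ready(err) or attempt == max_attempts - 1:
                raise  # unrelated error, or out of attempts
            time.sleep(base_delay * 2 ** attempt)

# import boto3, json  # pip install boto3
# runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
# result = invoke_with_retry(
#     lambda: runtime.invoke_model(
#         modelId="<imported-model-arn>",  # placeholder: your model's ARN
#         body=json.dumps({"prompt": "Hello", "max_gen_len": 512}),
#     ),
#     lambda e: e.__class__.__name__ == "ModelNotReadyException",
# )
```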
Final Notes
- AWS Bedrock Imported Models is not available in all regions. Ensure your EC2 instance and S3 bucket are in a supported region (e.g., us-east-1).
- Use nohup for all long-running processes to prevent interruptions.
- Double-check IAM permissions to avoid access issues when uploading to S3.
Once the model is imported, you are ready to deploy and use DeepSeek-R1-Distill-Llama-70B within AWS Bedrock! 🚀