Prerequisites

  • AWS account with Bedrock access
  • Nanonets-OCR model files stored in an S3 bucket
  • AWS CLI or Console access
  • boto3 Python library installed (pip install boto3)

Part 1: Create IAM Service Role

Before importing the model, you need to create an IAM service role that allows Bedrock to access your S3 bucket.

Step 1.1: Create the Service Role

  1. Navigate to IAM Console: https://console.aws.amazon.com/iam/
  2. Go to Roles → Create role
  3. Select AWS service as trusted entity type
  4. Choose Bedrock from the service list
  5. Click Next

Step 1.2: Add Permissions Policy

Create a custom policy with the following permissions:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "BedrockS3Access",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_BUCKET_NAME>",
                "arn:aws:s3:::<YOUR_BUCKET_NAME>/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:ResourceAccount": "<YOUR_AWS_ACCOUNT_ID>"
                }
            }
        }
    ]
}
Replace:
  • <YOUR_BUCKET_NAME>: Your S3 bucket name (e.g., my-models-bucket)
  • <YOUR_AWS_ACCOUNT_ID>: Your AWS account ID (e.g., 123456789012)

Step 1.3: Configure Trust Relationship

Edit the trust relationship to include:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "BedrockAssumeRole",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "<YOUR_AWS_ACCOUNT_ID>"
                },
                "ArnEquals": {
                    "aws:SourceArn": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_AWS_ACCOUNT_ID>:model-import-job/*"
                }
            }
        }
    ]
}
Replace:
  • <YOUR_AWS_ACCOUNT_ID>: Your AWS account ID
  • <YOUR_REGION>: Your AWS region (e.g., us-west-2)

Step 1.4: Name and Create Role

  1. Name the role (e.g., BedrockModelImportRole)
  2. Add a description
  3. Click Create role
  4. Note down the Role ARN for later use
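
If you prefer to script the role setup, the same steps can be done with boto3. A minimal sketch, reusing the policies from Steps 1.2 and 1.3 (the role and policy names are examples; placeholders match those above):
import json
import boto3

iam = boto3.client("iam")

# Trust policy from Step 1.3
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "BedrockAssumeRole",
        "Effect": "Allow",
        "Principal": {"Service": "bedrock.amazonaws.com"},
        "Action": "sts:AssumeRole",
        "Condition": {
            "StringEquals": {"aws:SourceAccount": "<YOUR_AWS_ACCOUNT_ID>"},
            "ArnEquals": {
                "aws:SourceArn": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_AWS_ACCOUNT_ID>:model-import-job/*"
            },
        },
    }],
}

# Permissions policy from Step 1.2
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "BedrockS3Access",
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::<YOUR_BUCKET_NAME>",
            "arn:aws:s3:::<YOUR_BUCKET_NAME>/*",
        ],
        "Condition": {
            "StringEquals": {"aws:ResourceAccount": "<YOUR_AWS_ACCOUNT_ID>"}
        },
    }],
}

role = iam.create_role(
    RoleName="BedrockModelImportRole",  # example name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Allows Bedrock to read model files from S3 for import",
)
iam.put_role_policy(
    RoleName="BedrockModelImportRole",
    PolicyName="BedrockS3Access",
    PolicyDocument=json.dumps(permissions_policy),
)
print(role["Role"]["Arn"])  # note this ARN for Part 2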

Part 2: Import Nanonets-OCR Model to Bedrock

Step 2.1: Navigate to Import Models Page

Go to Amazon Bedrock Import Models page:
https://<YOUR_REGION>.console.aws.amazon.com/bedrock/home?region=<YOUR_REGION>#/import-models
Replace <YOUR_REGION> with your region (e.g., us-west-2)

Step 2.2: Start Model Import

  1. Click Import model button

Step 2.3: Configure Model Details

  1. Model name: Enter a descriptive name for your model
    • Example: nanonets-ocr-v1
  2. Model import settings:
    • Model import source: Select Amazon S3 bucket
    • S3 URI: Enter the path to your model files
      s3://<YOUR_BUCKET_NAME>/<MODEL_FOLDER_PATH>/
      • Example: s3://my-models-bucket/nanonets-ocr-model/

Step 2.4: Configure Service Access

  1. Under Service access, select Use existing service role
  2. Choose the IAM role created in Part 1 (e.g., BedrockModelImportRole)

Step 2.5: Import the Model

  1. Review all settings
  2. Click Import model button
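
The import job can also be started programmatically. A minimal sketch using the Bedrock control-plane client (job and model names are illustrative; verify the call against your boto3 version):
import boto3

# Note: this is the "bedrock" control-plane client, not "bedrock-runtime"
bedrock = boto3.client("bedrock", region_name="<YOUR_REGION>")

job = bedrock.create_model_import_job(
    jobName="nanonets-ocr-import",          # example job name
    importedModelName="nanonets-ocr-v1",    # example model name
    roleArn="<ROLE_ARN_FROM_PART_1>",
    modelDataSource={
        "s3DataSource": {"s3Uri": "s3://<YOUR_BUCKET_NAME>/<MODEL_FOLDER_PATH>/"}
    },
)
print(job["jobArn"])  # keep this for monitoring the job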

Step 2.6: Monitor Import Job

  • The import job will be created and typically takes 10-15 minutes to complete
  • You can monitor the progress on the Import Models page
  • The status will change from “In Progress” to “Completed”
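
To monitor from code instead, you can poll the job. A sketch, assuming the jobArn returned by create_model_import_job above (status values follow the Bedrock control-plane API; verify against your boto3 version):
import time
import boto3

bedrock = boto3.client("bedrock", region_name="<YOUR_REGION>")

while True:
    status = bedrock.get_model_import_job(jobIdentifier="<JOB_ARN>")["status"]
    print(f"Import status: {status}")
    if status in ("Completed", "Failed"):
        break
    time.sleep(60)  # the job usually takes 10-15 minutes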

Step 2.7: Locate Your Imported Model

  1. Once completed, go to the Imported models tab:
    https://<YOUR_REGION>.console.aws.amazon.com/bedrock/home?region=<YOUR_REGION>#/import-models
  2. Click on your model name to view details
  3. Note down the Model ARN - you’ll need this for inference
    • Format: arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>
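
You can also fetch the model ARN programmatically. A sketch using the control-plane client (response field names per the Bedrock API as I understand it):
import boto3

bedrock = boto3.client("bedrock", region_name="<YOUR_REGION>")

# List all imported models and print name + ARN for each
for model in bedrock.list_imported_models()["modelSummaries"]:
    print(model["modelName"], model["modelArn"])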

Part 3: Use Model for Inference

Step 3.1: Set Up Python Environment

Install required dependencies:
pip install boto3

Step 3.2: Configure AWS Credentials

Ensure your AWS credentials are configured:
aws configure
Or set environment variables:
export AWS_ACCESS_KEY_ID=<YOUR_ACCESS_KEY>
export AWS_SECRET_ACCESS_KEY=<YOUR_SECRET_KEY>
export AWS_DEFAULT_REGION=<YOUR_REGION>

Step 3.3: Create Inference Script

Create a Python file bedrock_inference.py:
import boto3
import json
import base64
import time
from pathlib import Path

# Initialize Bedrock client
bedrock = boto3.client("bedrock-runtime", region_name="<YOUR_REGION>")

# Your imported model ARN
MODEL_ID = "arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>"

# Load image (for vision models)
image_path = "<PATH_TO_YOUR_IMAGE>"
image_b64 = base64.b64encode(Path(image_path).read_bytes()).decode()
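# Note: the data URL below assumes a JPEG; adjust the media type for PNG etc.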

# Prepare request body
body = json.dumps({
    "messages": [{
        "role": "user",
        "content": [
            {
                "type": "image_url",
                "image_url": {
                    "url": f"data:image/jpeg;base64,{image_b64}"
                }
            },
            {
                "type": "text",
                "text": "Extract all information from this invoice as clean markdown."
            }
        ]
    }],
    "max_tokens": 20000,
    "temperature": 0.7
})

# Make inference request
start_time = time.time()
response = bedrock.invoke_model(
    modelId=MODEL_ID,
    body=body,
    contentType="application/json",
    accept="application/json"
)
elapsed_time = time.time() - start_time

# Parse response
result = json.loads(response["body"].read())

# Print results
print(result["choices"][0]["message"]["content"])
print(f"\n⏱️  Response time: {elapsed_time:.2f}s")
Replace:
  • <YOUR_REGION>: Your AWS region (e.g., us-west-2)
  • <YOUR_ACCOUNT_ID>: Your AWS account ID
  • <MODEL_ID>: Your imported model ID
  • <PATH_TO_YOUR_IMAGE>: Path to your test image

Step 3.4: Run Inference

python bedrock_inference.py
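
For long OCR outputs you can optionally stream the response instead of waiting for the full result. A minimal sketch using invoke_model_with_response_stream; the chunk schema for imported models varies, so this just prints the raw chunks so you can inspect them (reuse the full image body from bedrock_inference.py for real requests):
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="<YOUR_REGION>")

# Minimal text-only body for illustration
body = json.dumps({
    "messages": [{"role": "user", "content": [{"type": "text", "text": "Hello"}]}],
    "max_tokens": 100
})

response = bedrock.invoke_model_with_response_stream(
    modelId="<YOUR_MODEL_ARN>",
    body=body,
    contentType="application/json",
    accept="application/json",
)

# Each event carries a JSON chunk; inspect one to find the text field
# your model emits, then extract the deltas from there
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    print(chunk)
Note that streaming calls require the bedrock:InvokeModelWithResponseStream IAM action in addition to bedrock:InvokeModel (see the policy below).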

Required IAM Permissions for Inference

The user/role running inference needs:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/*"
        }
    ]
}
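
If you manage permissions in code, a sketch attaching this policy inline to an existing role with boto3 (role and policy names are illustrative; the streaming action is only needed if you use the streaming example above):
import json
import boto3

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="<YOUR_INFERENCE_ROLE>",
    PolicyName="BedrockInvokeImportedModels",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",  # for streaming only
            ],
            "Resource": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/*",
        }],
    }),
)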

Notes

  • Model import typically takes 10-15 minutes
  • Imported models can be deleted from the Bedrock console if no longer needed
  • You are charged for model storage and inference requests
  • Model inference latency depends on model size and complexity

Additional Resources