Prerequisites
- AWS account with Bedrock access
- Nanonets-OCR model files stored in an S3 bucket
- AWS CLI or Console access
- boto3 Python library installed (`pip install boto3`)
Part 1: Create IAM Service Role
Before importing the model, you need to create an IAM service role that allows Bedrock to access your S3 bucket.
Step 1.1: Create the Service Role
- Navigate to IAM Console: https://console.aws.amazon.com/iam/
- Go to Roles → Create role
- Select AWS service as trusted entity type
- Choose Bedrock from the service list
- Click Next
Step 1.2: Add Permissions Policy
Create a custom policy with the following permissions (an example policy is sketched below), replacing these placeholders:
- `<YOUR_BUCKET_NAME>`: Your S3 bucket name (e.g., `my-models-bucket`)
- `<YOUR_AWS_ACCOUNT_ID>`: Your AWS account ID (e.g., `123456789012`)
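A minimal permissions policy might look like the following sketch: it grants Bedrock read access to the model bucket and scopes access to your account. Adjust the resources to match your setup.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<YOUR_BUCKET_NAME>",
        "arn:aws:s3:::<YOUR_BUCKET_NAME>/*"
      ],
      "Condition": {
        "StringEquals": {
          "aws:ResourceAccount": "<YOUR_AWS_ACCOUNT_ID>"
        }
      }
    }
  ]
}
```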
Step 1.3: Configure Trust Relationship
Edit the trust relationship to include the following (a sketch follows the placeholder list):
- `<YOUR_AWS_ACCOUNT_ID>`: Your AWS account ID
- `<YOUR_REGION>`: Your AWS region (e.g., `us-west-2`)
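A trust policy along these lines lets Bedrock assume the role, restricted to model import jobs in your account and region; treat it as a sketch and adapt it to your environment:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "bedrock.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "<YOUR_AWS_ACCOUNT_ID>"
        },
        "ArnEquals": {
          "aws:SourceArn": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_AWS_ACCOUNT_ID>:model-import-job/*"
        }
      }
    }
  ]
}
```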
Step 1.4: Name and Create Role
- Name the role (e.g., `BedrockModelImportRole`)
- Add a description
- Click Create role
- Note down the Role ARN for later use (a CLI command to retrieve it is shown below)
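If you later need the role ARN again, you can retrieve it with the AWS CLI, assuming the example role name used above:

```bash
# Print just the ARN of the import role
aws iam get-role --role-name BedrockModelImportRole \
  --query 'Role.Arn' --output text
```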
Part 2: Import Nanonets-OCR Model to Bedrock
Step 2.1: Navigate to Import Models Page
Go to the Amazon Bedrock Import models page in the console, with the console set to your target region (e.g., `us-west-2`).
Step 2.2: Start Model Import
- Click the Import model button
Step 2.3: Configure Model Details
- Model name: Enter a descriptive name for your model
  - Example: `nanonets-ocr-v1`
- Model import settings:
  - Model import source: Select Amazon S3 bucket
  - S3 URI: Enter the path to your model files (a quick way to verify the path follows this list)
    - Example: `s3://my-models-bucket/nanonets-ocr-model/`
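Before starting the import, it can help to confirm the model files are actually at that path. Assuming the example bucket and prefix above, a quick check with the AWS CLI looks like this:

```bash
# List the model files at the S3 URI you plan to import from
aws s3 ls s3://my-models-bucket/nanonets-ocr-model/
```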
Step 2.4: Configure Service Access
- Under Service access, select Use existing service role
- Choose the IAM role created in Part 1 (e.g., `BedrockModelImportRole`)
Step 2.5: Import the Model
- Review all settings
- Click the Import model button
Step 2.6: Monitor Import Job
- The import job will be created and take approximately 10-15 minutes to complete
- You can monitor the progress on the Import Models page (a CLI check is sketched after this list)
- The status will change from “In Progress” to “Completed”
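You can also check the job status from the command line via the Custom Model Import API:

```bash
# List model import jobs and their current status
aws bedrock list-model-import-jobs
```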
Step 2.7: Locate Your Imported Model
- Once the import completes, go to the Imported models tab
- Click on your model name to view details
- Note down the Model ARN; you'll need this for inference (a CLI way to retrieve it is sketched after this list)
  - Format: `arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>`
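Alternatively, assuming the AWS CLI is configured, the imported models (including each one's ARN) can be listed directly:

```bash
# Each entry includes the model name and its modelArn
aws bedrock list-imported-models
```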
Part 3: Use Model for Inference
Step 3.1: Set Up Python Environment
Install required dependencies:
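The script below only needs boto3:

```bash
pip install boto3
```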
Step 3.2: Configure AWS Credentials
Ensure your AWS credentials are configured:
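One common way is the AWS CLI's interactive setup; environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_DEFAULT_REGION`) work as well:

```bash
aws configure
```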
Step 3.3: Create Inference Script
Create a Python file `bedrock_inference.py`:
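Below is a minimal sketch of such a script. It assumes the imported model accepts an OpenAI-style chat payload with a base64-encoded image; the exact request and response schema depends on the imported model's architecture, so consult the Bedrock Custom Model Import documentation and the Nanonets-OCR model card, and adjust the payload accordingly.

```python
import base64
import json

import boto3

# Placeholders: replace these with your own values (see the list below)
REGION = "<YOUR_REGION>"
MODEL_ARN = "arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>"
IMAGE_PATH = "<PATH_TO_YOUR_IMAGE>"

# The bedrock-runtime client handles InvokeModel calls
client = boto3.client("bedrock-runtime", region_name=REGION)

# Read and base64-encode the input image
with open(IMAGE_PATH, "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# ASSUMPTION: an OpenAI-style messages payload; the schema your imported
# model expects depends on its architecture -- adjust as needed.
body = {
    "messages": [
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
                {"type": "text", "text": "Extract the text from this image."},
            ],
        }
    ],
    "max_tokens": 1024,
    "temperature": 0.0,
}

# Invoke the imported model by its ARN
response = client.invoke_model(
    modelId=MODEL_ARN,
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

# The response body is a streaming payload; decode and print the JSON
result = json.loads(response["body"].read())
print(json.dumps(result, indent=2))
```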
Replace the following placeholders:
- `<YOUR_REGION>`: Your AWS region (e.g., `us-west-2`)
- `<YOUR_ACCOUNT_ID>`: Your AWS account ID
- `<MODEL_ID>`: Your imported model ID
- `<PATH_TO_YOUR_IMAGE>`: Path to your test image
Step 3.4: Run Inference
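With the placeholders filled in, run the script; the model output is printed to stdout:

```bash
python bedrock_inference.py
```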
Required IAM Permissions for Inference
The user/role running inference needs:
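A minimal identity policy for this grants `bedrock:InvokeModel` on the imported model's ARN, along the lines of this sketch:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>"
    }
  ]
}
```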
Notes
- Model import typically takes 10-15 minutes
- Imported models can be deleted from the Bedrock console if no longer needed (see the CLI sketch after this list)
- You are charged for model storage and inference requests
- Model inference latency depends on model size and complexity
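As a CLI alternative to console deletion, the Custom Model Import API exposes a delete operation; assuming your model's ARN from Step 2.7, it looks roughly like this:

```bash
# Delete an imported model that is no longer needed
aws bedrock delete-imported-model \
  --model-identifier arn:aws:bedrock:<YOUR_REGION>:<YOUR_ACCOUNT_ID>:imported-model/<MODEL_ID>
```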