Connect Amazon Bedrock agents to cross-account knowledge bases


Organizations want seamless access to their structured data repositories to power intelligent AI agents. However, when these sources span multiple AWS accounts, integration challenges can arise. This post explores a practical solution for connecting Amazon Bedrock agents to knowledge bases in Amazon Redshift clusters residing in different AWS accounts.

The problem

Organizations that build AI agents using Amazon Bedrock often keep their structured data in Amazon Redshift clusters. When these data repositories exist in separate AWS accounts from their AI agents, they face a significant limitation: Amazon Bedrock Knowledge Bases doesn't natively support cross-account Redshift integration.

This creates a challenge for enterprises with multi-account architectures who want to:

  • Use existing structured data in Redshift for their AI agents.
  • Maintain separation of concerns across different AWS accounts.
  • Avoid duplicating data across accounts.
  • Ensure proper security and access controls.

Solution overview

Our solution enables cross-account knowledge base integration through a serverless architecture that maintains secure access controls while allowing AI agents to query structured data. The approach uses AWS Lambda as an intermediary to facilitate secure cross-account data access.

Cross-Account Amazon Bedrock knowledge base architecture

The action flow, as shown in the preceding diagram:

  1. Users enter their natural language question in the Amazon Bedrock agent, which is configured in the agent account.
  2. The Amazon Bedrock agent invokes a Lambda function through its action group, which provides access to the Amazon Bedrock knowledge base configured in the agent-kb account.
  3. The action group Lambda function running in the agent account assumes an IAM role created in the agent-kb account to connect to the knowledge base in that account.
  4. The Amazon Bedrock knowledge base in the agent-kb account uses an IAM role created in the same account to access the Amazon Redshift data warehouse and query its data.
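The assume-role hop in steps 2 and 3 can be sketched as a Lambda handler. This is a minimal illustration, not the code from the aws-samples repository: the event parsing assumes a hypothetical `question` property from the action group's OpenAPI schema, and the role ARN and knowledge base ID are the example values used throughout this post.

```python
import json
import os

# Placeholder values the CloudFormation stack would wire in as environment
# variables; both are the example values from this post.
TARGET_ROLE_ARN = os.environ.get(
    "TARGET_ROLE_ARN", "arn:aws:iam::999999999999:role/bedrock_kb_access_role")
KNOWLEDGE_BASE_ID = os.environ.get("KNOWLEDGE_BASE_ID", "XXXXXXXXXX")
KB_MODEL_ARN = ("arn:aws:bedrock:us-west-2::foundation-model/"
                "meta.llama3-1-70b-instruct-v1:0")


def cross_account_kb_client(role_arn, region="us-west-2"):
    """Assume the agent-kb account role and return a client that queries
    the knowledge base with the temporary cross-account credentials."""
    import boto3  # imported lazily so the sketch can be read without the SDK
    creds = boto3.client("sts").assume_role(
        RoleArn=role_arn, RoleSessionName="bedrock-kb-query")["Credentials"]
    return boto3.client(
        "bedrock-agent-runtime", region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"])


def action_group_response(event, answer_text, status=200):
    """Wrap an answer in the shape Bedrock expects back from an
    action-group Lambda (messageVersion 1.0)."""
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "apiPath": event["apiPath"],
            "httpMethod": event["httpMethod"],
            "httpStatusCode": status,
            "responseBody": {
                "application/json": {"body": json.dumps({"answer": answer_text})}
            },
        },
    }


def lambda_handler(event, context):
    # Pull the question out of the action-group request body; the property
    # name 'question' is an assumption tied to the example schema.
    props = event["requestBody"]["content"]["application/json"]["properties"]
    question = next(p["value"] for p in props if p["name"] == "question")
    client = cross_account_kb_client(TARGET_ROLE_ARN)
    result = client.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": KB_MODEL_ARN,
            },
        })
    return action_group_response(event, result["output"]["text"])
```

The handler only holds the agent-kb role for the lifetime of one STS session, which is what keeps the security boundary between the two accounts intact.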

The solution consists of these key components:

  1. An Amazon Bedrock agent in the agent account that handles user interactions.
  2. An Amazon Redshift Serverless workgroup in a VPC private subnet in the agent-kb account containing the structured data.
  3. An Amazon Bedrock knowledge base that uses the Amazon Redshift Serverless workgroup as its structured data source.
  4. A Lambda function in the agent account.
  5. An action group configuration that connects the agent in the agent account to the Lambda function.
  6. IAM roles and policies that enable secure cross-account access.
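The last component is the heart of the cross-account handshake. Here is a sketch of the two IAM documents involved, using the example account numbers and role names from this post; the actions shown are illustrative, and the exact statements are defined by the create_bedrock_agent_kb_roles_policies.sh script covered later.

```python
AGENT_ACCOUNT = "111122223333"      # example agent account number
AGENT_KB_ACCOUNT = "999999999999"   # example agent-kb account number

# Trust policy on bedrock_kb_access_role (agent-kb account): only the
# Lambda execution role from the agent account may assume it.
kb_access_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::{AGENT_ACCOUNT}:role/lambda_bedrock_kb_query_role"
        },
        "Action": "sts:AssumeRole",
    }],
}

# Matching statement in lambda_bedrock_kb_query_policy (agent account):
# the Lambda role may assume only the knowledge base access role.
lambda_assume_statement = {
    "Effect": "Allow",
    "Action": "sts:AssumeRole",
    "Resource": f"arn:aws:iam::{AGENT_KB_ACCOUNT}:role/bedrock_kb_access_role",
}
```

Because the trust is scoped to one specific role ARN on each side, neither account grants the other anything broader than this single assume-role path.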

Prerequisites

This solution requires the following:

  1. Two AWS accounts. Create an AWS account if you don't have one. Specific permissions are required for each account; these are set up in subsequent steps.
  2. Install the AWS CLI (2.24.22 is the current version).
  3. Set up authentication using IAM user credentials for the AWS CLI in each account.
  4. Make sure you have jq installed; jq is a lightweight command-line JSON processor. For example, on macOS you can install it with the command brew install jq (jq-1.7.1-apple is the current version).
  5. Navigate to the Amazon Bedrock console and make sure you enable access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account and to the us.amazon.nova-pro-v1:0 model in the agent account, in the us-west-2 (US West (Oregon)) AWS Region.

Assumptions

Let's call the AWS account profile that has the Amazon Bedrock agent the agent profile. Similarly, the AWS account profile that has the Amazon Bedrock knowledge base with Amazon Redshift Serverless and the structured data source will be called agent-kb. We'll use the us-west-2 (US West (Oregon)) AWS Region, but feel free to choose another AWS Region as necessary (the prerequisites apply to whichever AWS Region you choose to deploy this solution in). We'll use the meta.llama3-1-70b-instruct-v1:0 model for the agent-kb account; this model is available on demand in us-west-2. You are free to choose other models with cross-Region inference, but that would mean changing the roles and policies accordingly and enabling model access in all Regions where those models are available. Based on our model choice for this solution, the AWS Region must be us-west-2. For the agent, we will use an Amazon Bedrock agent-optimized model such as us.amazon.nova-pro-v1:0.

Implementation walkthrough

The following is a step-by-step implementation guide. Make sure to perform all steps in the same AWS Region in both accounts.

These steps deploy and test an end-to-end solution from scratch; if you are already running some of these components, you may skip the corresponding steps.

    1. Make a note of the AWS account numbers of the agent and agent-kb accounts. In the implementation steps we'll refer to them as follows:
      Profile    AWS account     Description
      agent      111122223333    Account for the Bedrock agent
      agent-kb   999999999999    Account for the Bedrock knowledge base

      Note: These steps use example profile names and account numbers; replace them with actual values before running.

    2. Create the Amazon Redshift Serverless workgroup in the agent-kb account:
      1. Log in to the agent-kb account.
      2. Follow the workshop link to create the Amazon Redshift Serverless workgroup in a private subnet.
      3. Make a note of the namespace, workgroup, and other details, and follow the rest of the hands-on workshop instructions.
    3. Set up your data warehouse in the agent-kb account.
    4. Create your AI knowledge base in the agent-kb account. Make a note of the knowledge base ID.
    5. Train your AI assistant in the agent-kb account.
    6. Test natural language queries in the agent-kb account. You'll find the code in the aws-samples GitHub repository: sample-for-amazon-bedrock-agent-connect-cross-account-kb.
    7. Create the necessary roles and policies in both accounts. Run the script create_bedrock_agent_kb_roles_policies.sh with the following input parameters:
      --agent-profile = agent
        The agent profile that you set up with the AWS CLI, as mentioned in the prerequisites.
      --agent-kb-profile = agent-kb
        The agent knowledge base profile that you set up with the AWS CLI using aws_access_key_id and aws_secret_access_key, as mentioned in the prerequisites.
      --lambda-role = lambda_bedrock_kb_query_role
        The IAM role in the agent account that the Bedrock agent action group Lambda function uses to connect to Redshift across accounts.
      --kb-access-role = bedrock_kb_access_role
        The IAM role in the agent-kb account that lambda_bedrock_kb_query_role in the agent account assumes to connect to Redshift across accounts.
      --kb-access-policy = bedrock_kb_access_policy
        The IAM policy attached to the IAM role bedrock_kb_access_role.
      --lambda-policy = lambda_bedrock_kb_query_policy
        The IAM policy attached to the IAM role lambda_bedrock_kb_query_role.
      --knowledge-base-id = XXXXXXXXXX
        Replace with the actual knowledge base ID created in Step 4.
      --agent-account = 111122223333
        Replace with the 12-digit AWS account number where the Bedrock agent is running (agent account).
      --agent-kb-account = 999999999999
        Replace with the 12-digit AWS account number where the Bedrock knowledge base is running (agent-kb account).
    8. Download the script (create_bedrock_agent_kb_roles_policies.sh) from the aws-samples GitHub repository.
    9. Open a terminal on Mac or a similar bash shell on other platforms.
    10. Change to the directory where you downloaded the script and give it executable permissions:
      cd /my/location
      chmod +x create_bedrock_agent_kb_roles_policies.sh

    11. If you are still not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
      ./create_bedrock_agent_kb_roles_policies.sh --help
    12. Run the script with the right input parameters, as described in the preceding table.
      ./create_bedrock_agent_kb_roles_policies.sh --agent-profile agent \
        --agent-kb-profile agent-kb \
        --lambda-role lambda_bedrock_kb_query_role \
        --kb-access-role bedrock_kb_access_role \
        --kb-access-policy bedrock_kb_access_policy \
        --lambda-policy lambda_bedrock_kb_query_policy \
        --knowledge-base-id XXXXXXXXXX \
        --agent-account 111122223333 \
        --agent-kb-account 999999999999

    13. On successful execution, the script shows a summary of the IAM roles and policies created in both accounts.
    14. Log in to both the agent and agent-kb accounts to verify that the IAM roles and policies were created.
          • For the agent account: make a note of the ARN of lambda_bedrock_kb_query_role; it will be the value of the CloudFormation stack parameter AgentLambdaExecutionRoleArn in the next step.
            Agent IAM Role
          • For the agent-kb account: make a note of the ARN of bedrock_kb_access_role; it will be the value of the CloudFormation stack parameter TargetRoleArn in the next step.
            Agent KB IAM Role
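If you prefer to verify from the command line instead of the console, the check can be scripted. This is a sketch under the assumption that the agent and agent-kb CLI profiles from the prerequisites are configured:

```python
def role_arn(account_id, role_name):
    """IAM role ARNs follow a fixed pattern, so the two stack parameter
    values can also be derived locally from account number and role name."""
    return f"arn:aws:iam::{account_id}:role/{role_name}"


def verify_role(profile_name, role_name):
    """Fetch the role through the given CLI profile to confirm the
    create script actually created it."""
    import boto3  # lazy import; running this requires configured credentials
    iam = boto3.Session(profile_name=profile_name).client("iam")
    return iam.get_role(RoleName=role_name)["Role"]["Arn"]


# e.g. verify_role("agent", "lambda_bedrock_kb_query_role")
# e.g. verify_role("agent-kb", "bedrock_kb_access_role")
```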
    15. Run the AWS CloudFormation script to create the Bedrock agent:
            1. Download the CloudFormation template cloudformation_bedrock_agent_kb_query_cross_account.yaml from the aws-samples GitHub repository.
            2. Log in to the agent account, navigate to the CloudFormation console, and verify that you are in the us-west-2 (Oregon) Region. Choose Create stack, then With new resources (standard).
            3. In the Specify template section, choose Upload a template file, then Choose file, and select the file downloaded in (1). Then choose Next.
            4. Enter the following stack details and choose Next.
              Stack name: bedrock-agent-connect-kb-cross-account-agent (you can choose any name)
              AgentFoundationModelId: us.amazon.nova-pro-v1:0 (don't change)
              AgentLambdaExecutionRoleArn: arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role (replace with your agent account number)
              BedrockAgentDescription: Agent to query inventory data from Redshift Serverless database (keep as default)
              BedrockAgentInstructions: You are an assistant that helps users query inventory data from our Redshift Serverless database using the action group. (don't change)
              BedrockAgentName: bedrock_kb_query_cross_account (keep as default)
              KBFoundationModelId: meta.llama3-1-70b-instruct-v1:0 (don't change)
              KnowledgeBaseId: XXXXXXXXXX (knowledge base ID from Step 4)
              TargetRoleArn: arn:aws:iam::999999999999:role/bedrock_kb_access_role (replace with your agent-kb account number)

            5. Complete the acknowledgement and choose Next.
            6. Scroll down the page and choose Submit.
            7. You will see the CloudFormation stack being created, indicated by the status CREATE_IN_PROGRESS.
            8. After a few minutes, the status changes to CREATE_COMPLETE, indicating that all resources were created. Choose the Outputs tab to make a note of the resources that were created.
              In summary, the CloudFormation script does the following in the agent account:
                  • Creates a Bedrock agent
                  • Creates an action group
                  • Creates a Lambda function, which is invoked by the Bedrock action group
                  • Defines the OpenAPI schema
                  • Creates the necessary roles and permissions for the Bedrock agent
                  • Prepares the Bedrock agent so that it is ready to test
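The OpenAPI schema is what tells the agent how to call the Lambda function. Here is a minimal sketch of what such a schema can look like, with a hypothetical /query path and operation name; the actual schema ships inside the CloudFormation template in the aws-samples repository.

```python
# A minimal action-group OpenAPI schema, expressed as a Python dict so it is
# easy to inspect; path and operationId are illustrative placeholders.
openapi_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Knowledge base query API", "version": "1.0.0"},
    "paths": {
        "/query": {
            "post": {
                "operationId": "queryKnowledgeBase",
                "description": "Answer a natural language question using "
                               "the Redshift-backed knowledge base",
                "requestBody": {
                    "required": True,
                    "content": {"application/json": {"schema": {
                        "type": "object",
                        "properties": {
                            "question": {
                                "type": "string",
                                "description": "The user's question",
                            }
                        },
                        "required": ["question"],
                    }}},
                },
                "responses": {"200": {
                    "description": "The generated answer",
                    "content": {"application/json": {"schema": {
                        "type": "object",
                        "properties": {"answer": {"type": "string"}},
                    }}},
                }},
            }
        }
    },
}
```

The agent uses the description fields to decide when to call the operation, so keeping them specific to the data source noticeably improves routing.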
    16. Check for model access in Oregon (us-west-2):
            1. Verify Amazon Nova Pro (us.amazon.nova-pro-v1:0) model access in the agent account. Navigate to the Amazon Bedrock console and choose Model access under Configure and learn. Search for Model name: Nova Pro to verify access. If access is not enabled, enable it.
            2. Verify access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account. This should already be enabled because you set up the knowledge base earlier.
    17. Run the agent. Log in to the agent account, navigate to the Amazon Bedrock console, and choose Agents under Build.
    18. Choose the name of the agent and choose Test. You can test the questions listed on the workshop's Stage 4: Test Natural Language Queries page. For example:
            1. Who are the top 5 customers in Saudi Arabia?
            2. Who are the top parts suppliers in the United States by volume?
            3. What is the total revenue by region for the year 1998?
            4. Which products have the highest profit margins?
            5. Show me orders with the highest priority from the last quarter of 1997.

    19. Choose Show trace to analyze the agent traces.
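The console test can also be reproduced programmatically. This is a sketch using the bedrock-agent-runtime invoke_agent API; the agent ID and alias ID are placeholders that you can read from the CloudFormation stack outputs.

```python
def invoke_params(agent_id, agent_alias_id, session_id, question):
    """Build the invoke_agent request as a plain dict so it is easy to
    inspect before sending."""
    return {
        "agentId": agent_id,
        "agentAliasId": agent_alias_id,
        "sessionId": session_id,
        "inputText": question,
    }


def ask_agent(params, profile="agent", region="us-west-2"):
    """Stream the agent's completion events and return the assembled answer."""
    import boto3  # lazy import; running this needs the agent profile's credentials
    runtime = boto3.Session(profile_name=profile).client(
        "bedrock-agent-runtime", region_name=region)
    answer = ""
    for event in runtime.invoke_agent(**params)["completion"]:
        if "chunk" in event:
            answer += event["chunk"]["bytes"].decode("utf-8")
    return answer


# e.g. ask_agent(invoke_params("AGENTID1234", "TSTALIASID", "session-1",
#                              "Who are the top 5 customers in Saudi Arabia?"))
```

Reusing the same sessionId across calls keeps the conversation context, which mirrors how the console test window behaves.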

Some recommended best practices:

      • Phrase your question to be more specific.
      • Use terminology that matches your table descriptions.
      • Try questions similar to your curated examples.
      • Verify that your question pertains to data that exists in the TPC-H dataset.
      • Use Amazon Bedrock Guardrails to add configurable safeguards to questions and responses.

Clean up resources

It is recommended that you clean up any resources you no longer need to avoid unnecessary charges:

      1. Navigate to the CloudFormation console in both the agent and agent-kb accounts, search for the stack, and choose Delete.
      2. S3 buckets must be deleted separately.
      3. To delete the roles and policies created in both accounts, download the script delete-bedrock-agent-kb-roles-policies.sh from the aws-samples GitHub repository.
        1. Open a terminal on Mac or a similar bash shell on other platforms.
        2. Change to the directory where you downloaded the script and give it executable permissions:
        cd /my/location
        chmod +x delete-bedrock-agent-kb-roles-policies.sh

      4. If you are still not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
        ./delete-bedrock-agent-kb-roles-policies.sh --help
      5. Run the script delete-bedrock-agent-kb-roles-policies.sh with the same values for the same input parameters as in Step 7, when you ran the create_bedrock_agent_kb_roles_policies.sh script. Note: Enter the correct account numbers for --agent-account and --agent-kb-account before running.
        ./delete-bedrock-agent-kb-roles-policies.sh --agent-profile agent \
          --agent-kb-profile agent-kb \
          --lambda-role lambda_bedrock_kb_query_role \
          --kb-access-role bedrock_kb_access_role \
          --kb-access-policy bedrock_kb_access_policy \
          --lambda-policy lambda_bedrock_kb_query_policy \
          --agent-account 111122223333 \
          --agent-kb-account 999999999999

        The script will ask for confirmation; type yes and press Enter.

Summary

This solution demonstrates how the Amazon Bedrock agent in the agent account can query the Amazon Bedrock knowledge base in the agent-kb account.

Conclusion

This solution uses Amazon Bedrock Knowledge Bases for structured data to create a more integrated approach to cross-account data access. The knowledge base in the agent-kb account connects directly to Amazon Redshift Serverless in a private VPC. The Amazon Bedrock agent in the agent account invokes an AWS Lambda function as part of its action group to make a cross-account connection and retrieve responses from the structured knowledge base.

This architecture offers several advantages:

      • Uses Amazon Bedrock Knowledge Bases capabilities for structured data
      • Provides a more seamless integration between the agent and the data source
      • Maintains proper security boundaries between accounts
      • Reduces the complexity of direct database access code

As Amazon Bedrock continues to evolve, you can take advantage of future enhancements to knowledge base functionality while maintaining your multi-account architecture.


About the Authors

Kunal Ghosh is an expert in AWS technologies. He is passionate about building efficient and effective solutions on AWS, especially involving generative AI, analytics, data science, and machine learning. Besides family time, he likes reading, swimming, biking, and watching movies, and he is a foodie.

Arghya Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers adopt and use the AWS Cloud. He focuses on big data, data lakes, streaming and batch analytics services, and generative AI technologies.

Indranil Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers in the hi-tech and semiconductor sectors solve complex business problems using the AWS Cloud. His special interests are legacy modernization and migration, building analytics platforms, and helping customers adopt cutting-edge technologies such as generative AI.

Vinayak Datar is a Sr. Solutions Manager based in the Bay Area, helping enterprise customers accelerate their AWS Cloud journey. He specializes in helping customers take ideas from concept to working prototype to production using AWS generative AI services.
