Build generative AI–powered Salesforce applications with Amazon Bedrock


This post is co-authored by Daryl Martis and Darvish Shadravan from Salesforce.

This is the fourth post in a series discussing the integration of Salesforce Data Cloud and Amazon SageMaker.

In Part 1 and Part 2, we show how the Salesforce Data Cloud and Einstein Studio integration with SageMaker allows businesses to access their Salesforce data securely using SageMaker's tools to build, train, and deploy models to endpoints hosted on SageMaker. SageMaker endpoints can be registered with Salesforce Data Cloud to activate predictions in Salesforce. In Part 3, we demonstrate how business analysts and citizen data scientists can create machine learning (ML) models, without code, in Amazon SageMaker Canvas and deploy trained models for integration with Salesforce Einstein Studio to create powerful business applications.

In this post, we show how native integrations between Salesforce and Amazon Web Services (AWS) enable you to Bring Your Own Large Language Models (BYO LLMs) from your AWS account to power generative artificial intelligence (AI) applications in Salesforce. Requests and responses between Salesforce and Amazon Bedrock pass through the Einstein Trust Layer, which promotes responsible AI use across Salesforce.

We demonstrate BYO LLM integration by using Anthropic's Claude model on Amazon Bedrock to summarize a list of open service cases and opportunities on an account record page, as shown in the following figure.

Partner quote

“We continue to expand on our strong collaboration with AWS with our BYO LLM integration with Amazon Bedrock, empowering our customers with more model choices and allowing them to create AI-powered solutions and Copilots customized for their specific business needs. Our open and flexible AI environment, grounded with customer data, positions us well to be leaders in AI-driven solutions in the CRM space.”

–Kaushal Kurapati, Senior Vice President of Product for AI at Salesforce

Amazon Bedrock

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can quickly experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.

Salesforce Data Cloud and Einstein Model Builder

Salesforce Data Cloud is a data platform that unifies your company's data, giving every team a 360-degree view of the customer to drive automation and analytics, personalize engagement, and power trusted AI. Data Cloud creates a holistic customer view by turning volumes of disconnected data into a single, trusted model that's simple to access and understand. With data harmonized within Salesforce Data Cloud, customers can put their data to work to build predictions and generative AI–powered business processes across sales, support, and marketing.

With Einstein Model Builder, customers can build their own models using Salesforce's low-code model builder experience or integrate their own custom-built models into the Salesforce platform. Einstein Model Builder's BYO LLM experience provides the capability to register custom generative AI models from external environments such as Amazon Bedrock and Salesforce Data Cloud.

Once custom Amazon Bedrock models are registered in Einstein Model Builder, models are connected through the Einstein Trust Layer, a robust set of features and guardrails that protect the privacy and security of data, improve the safety and accuracy of AI results, and promote the responsible use of AI across Salesforce. Registered models can then be used in Prompt Builder, a newly launched, low-code prompt engineering tool that allows Salesforce admins to build, test, and fine-tune trusted AI prompts that can be used across the Salesforce platform. These prompts can be integrated with Salesforce capabilities such as Flows and Invocable Actions and Apex.

Solution overview

With the Salesforce Einstein Model Builder BYO LLM feature, you can invoke Amazon Bedrock models in your AWS account. At the time of this writing, Salesforce supports Anthropic Claude 3 models on Amazon Bedrock for BYO LLM. For this post, we use the Anthropic Claude 3 Sonnet model. To learn more about inference with Claude 3, refer to Anthropic Claude models in the Amazon Bedrock documentation.
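Outside of the Salesforce integration, the same model can be called directly with the AWS SDK for Python (boto3). The following sketch is illustrative, not part of the Salesforce setup: the function names, prompt text, and `us-east-1` Region are assumptions, while the model ID is the public Bedrock ID for Claude 3 Sonnet. It builds an Anthropic Messages API request body and invokes the model through the `bedrock-runtime` client.

```python
import json

# Public Bedrock model ID for Anthropic Claude 3 Sonnet
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

def build_request(prompt: str, temperature: float = 0.2, max_tokens: int = 512) -> str:
    """Build the JSON request body for the Anthropic Messages API on Bedrock."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "temperature": temperature,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
    })

def summarize_cases(case_text: str, region: str = "us-east-1") -> str:
    """Invoke Claude 3 Sonnet to summarize case data.

    Requires AWS credentials with bedrock:InvokeModel permission.
    """
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_request(
            "Summarize the following open service cases:\n" + case_text
        ),
    )
    # The response body is a stream of JSON; the generated text sits in
    # the first "content" block.
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

The `temperature` parameter here corresponds to the inference parameter you configure later in Einstein Model Builder; lower values make the summaries more deterministic.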

For your implementation, you may use the model of your choice. Refer to Bring Your Own Large Language Model in Einstein 1 Studio for models supported with Salesforce Einstein Model Builder.

The following image shows a high-level architecture of how you can integrate the LLM from your AWS account into Salesforce Prompt Builder.

In this post, we show how to build generative AI–powered Salesforce applications with Amazon Bedrock. The following are the high-level steps involved:

  1. Grant Amazon Bedrock invoke model permission to an AWS Identity and Access Management (IAM) user
  2. Register the Amazon Bedrock model in Salesforce Einstein Model Builder
  3. Integrate the prompt template with the field in the Lightning App Builder

Prerequisites

Before deploying this solution, make sure you meet the following prerequisites:

  1. Have access to Salesforce Data Cloud and meet the requirements for using BYO LLM.
  2. Have Amazon Bedrock set up. If this is the first time you are accessing Anthropic Claude models on Amazon Bedrock, you need to request access. You need to have sufficient permissions to request access to models through the console. To request model access, sign in to the Amazon Bedrock console and choose Model access at the bottom of the left navigation pane.

Solution walkthrough

To build generative AI–powered Salesforce applications with Amazon Bedrock, implement the following steps.

Grant Amazon Bedrock invoke model permission to an IAM user

Salesforce Einstein Studio requires an access key and a secret to access the Amazon Bedrock API. Follow the instructions to set up an IAM user and access keys. The IAM user must have Amazon Bedrock invoke model permission to access the model. Complete the following steps:

  1. On the IAM console, choose Users in the navigation pane. On the right side of the console, choose Add permissions and Create inline policy.
  2. On the Specify permissions screen, in the Service dropdown menu, select Bedrock.
  3. Under Actions allowed, enter “invoke.” Under Read, select InvokeModel. Select All under Resources. Choose Next.
  4. On the Review and create screen, under Policy name, enter BedrockInvokeModelPolicy. Choose Create policy.
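The inline policy created in the steps above corresponds to a JSON policy document like the following minimal sketch. Selecting All resources produces the `"Resource": "*"` shown here; scoping it to specific model ARNs in your account's Region is a tighter alternative.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockInvokeModelPolicy",
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "*"
    }
  ]
}
```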

Register the Amazon Bedrock model in Einstein Model Builder

  1. On the Salesforce Data Cloud console, under the Einstein Studio tab, choose Add Foundation Model.
  2. Choose Connect to Amazon Bedrock.
  3. For Endpoint information, enter the endpoint name, your AWS account access key, and your secret key. Enter the Region and model information. Choose Connect.
  4. Now, create the configuration for the model endpoint you created in the previous steps. Provide inference parameters such as temperature to set the deterministic factor of the LLM. Enter a sample prompt to verify the response.
  5. Next, you can save this new model configuration. Enter the name for the saved LLM model and choose Create Model.
  6. After the model creation is successful, choose Close and proceed to create the prompt template.
  7. Select the model name to open the model configuration.
  8. Select Create Prompt Template to launch the prompt builder.
  9. Select Field Generation as the prompt template type, enter a template name, set Object to Account, and set Object Field to PB Case and Oppty Summary. This will associate the template with a custom field on the account record object to summarize the cases.

For this demo, a rich text field named PB Case and Oppty Summary was created and added to the Salesforce Account page layout according to the Add a Field Generation Prompt Template to a Lightning Record Page instructions.

  1. Provide the prompt and input variables or objects for data grounding and select the model. Refer to Prompt Builder to learn more.

Integrate the prompt template with the field in the Lightning App Builder

  1. On the Salesforce console, use the search bar to find Lightning App Builder. Build a new page or edit an existing one to integrate the prompt template with the field as shown in the following screenshot. Refer to Add a Field Generation Prompt Template to a Lightning Record Page for detailed instructions.
  2. Navigate to the Account page and click the PB Case and Oppty Summary field enabled for chat completion to launch the Einstein generative AI assistant and summarize the account case data.

Cleanup

Complete the following steps to clean up your resources.

  1. Delete the IAM user
  2. Delete the foundation model in Einstein Studio

Amazon Bedrock offers on-demand inference pricing, so there are no additional costs from a continued model subscription. To remove model access, refer to the steps in Remove model access.

Conclusion

In this post, we demonstrated how to use your own LLM in Amazon Bedrock to power Salesforce applications. We used summarization of open service cases on an account object as an example to showcase the implementation steps.

Amazon Bedrock is a fully managed service that makes high-performing FMs from leading AI companies and Amazon available for your use through a unified API. You can choose from a wide range of FMs to find the model that is best suited for your use case.

Salesforce Einstein Model Builder lets you register your Amazon Bedrock model and use it in Prompt Builder to create prompts grounded in your data. These prompts can then be integrated with Salesforce capabilities such as Flows and Invocable Actions and Apex. You can then build custom generative AI applications with Claude 3 that are grounded in the Salesforce user experience. Amazon Bedrock requests from Salesforce pass through the Einstein Trust Layer, which supports responsible AI use with features such as dynamic grounding, zero data retention, and toxicity detection while maintaining safety and security standards.

AWS and Salesforce are excited for our mutual customers to harness this integration and build generative AI–powered applications. To learn more and start building, refer to the following resources.


About the Authors

Daryl Martis is the Director of Product for Einstein Studio at Salesforce Data Cloud. He has over 10 years of experience in planning, building, launching, and managing world-class solutions for enterprise customers, including AI/ML and cloud solutions. He previously worked in the financial services industry in New York City. Follow him on LinkedIn.

Darvish Shadravan is a Director of Product Management in the AI Cloud at Salesforce. He focuses on building AI/ML features for CRM, and is the product owner for the Bring Your Own LLM feature. You can connect with him on LinkedIn.

Rachna Chadha is a Principal Solutions Architect AI/ML in Strategic Accounts at AWS. Rachna is an optimist who believes that ethical and responsible use of AI can improve society in the future and bring economic and social prosperity. In her spare time, Rachna likes spending time with her family, hiking, and listening to music.

Ravi Bhattiprolu is a Sr. Partner Solutions Architect at AWS. Ravi works with strategic partners Salesforce and Tableau to deliver innovative and well-architected products and solutions that help joint customers realize their business objectives.

Ife Stewart is a Principal Solutions Architect in the Strategic ISV segment at AWS. She has been engaged with Salesforce Data Cloud over the last 2 years to help build integrated customer experiences across Salesforce and AWS. Ife has over 10 years of experience in technology. She is an advocate for diversity and inclusion in the technology field.

Mike Patterson is a Senior Customer Solutions Manager in the Strategic ISV segment at AWS. He has partnered with Salesforce Data Cloud to align business objectives with innovative AWS solutions to achieve impactful customer experiences. In Mike's spare time, he enjoys spending time with his family, sports, and outdoor activities.

Dharmendra Kumar Rai (DK Rai) is a Sr. Data Architect, Data Lake & AI/ML, serving strategic customers. He works closely with customers to understand how AWS can help them solve problems, especially in the AI/ML and analytics space. DK has many years of experience in building data-intensive solutions across a range of industry verticals, including high-tech, FinTech, insurance, and consumer-facing applications.
