Live Meeting Assistant with Amazon Transcribe, Amazon Bedrock, and Knowledge Bases for Amazon Bedrock


See CHANGELOG for latest features and fixes.

You've probably experienced the challenge of taking notes during a meeting while trying to pay attention to the conversation. You've probably also experienced the need to quickly fact-check something that's been said, or look up information to answer a question that's just been asked in the call. Or maybe you have a team member who always joins meetings late, and expects you to send them a quick summary over chat to catch them up.

Then there are the times when others are speaking a language that's not your first language, and you'd love to have a live translation of what people are saying to make sure you understand correctly.

And after the call is over, you usually want to capture a summary for your records, or to send to the participants, with a list of all the action items, owners, and due dates.

All of this, and more, is now possible with our newest sample solution, Live Meeting Assistant (LMA).

Check out the following demo to see how it works.

In this post, we show you how to use LMA with Amazon Transcribe, Amazon Bedrock, and Knowledge Bases for Amazon Bedrock.

Solution overview

The LMA sample solution captures speaker audio and metadata from your browser-based meeting app (as of this writing, Zoom and Chime are supported), or audio only from any other browser-based meeting app, softphone, or audio source. It uses Amazon Transcribe for speech to text, Knowledge Bases for Amazon Bedrock for contextual queries against your company's documents and knowledge sources, and Amazon Bedrock models for customizable transcription insights and summaries.

Everything you need is provided as open source in our GitHub repo. It's straightforward to deploy in your AWS account. When you're done, you'll wonder how you ever managed without it!

The following are some of the things LMA can do:

  • Live transcription with speaker attribution – LMA is powered by Amazon Transcribe ASR models for low-latency, high-accuracy speech to text. You can teach it brand names and domain-specific terminology if needed, using the custom vocabulary and custom language model features in Amazon Transcribe.
  • Live translation – It uses Amazon Translate to optionally show each segment of the conversation translated into your language of choice, from a selection of 75 languages.
  • Context-aware meeting assistant – It uses Knowledge Bases for Amazon Bedrock to provide answers from your trusted sources, using the live transcript as context for fact-checking and follow-up questions. To activate the assistant, just say "OK, Assistant," choose the ASK ASSISTANT! button, or enter your own question in the UI. (A sketch of the kind of knowledge base query involved follows this list.)
  • On-demand summaries of the meeting – With the click of a button on the UI, you can generate a summary, which is useful when someone joins late and needs to get caught up. The summaries are generated from the transcript by Amazon Bedrock. LMA also provides options for identifying the current meeting topic, and for generating a list of action items with owners and due dates. You can also create your own custom prompts and corresponding options.
  • Automated summary and insights – When the meeting has ended, LMA automatically runs a set of large language model (LLM) prompts on Amazon Bedrock to summarize the meeting transcript and extract insights. You can customize these prompts as well.
  • Meeting recording – The audio is (optionally) stored for you, so you can replay important sections of the meeting later.
  • Inventory list of meetings – LMA keeps track of all your meetings in a searchable list.
  • Browser extension captures audio and meeting metadata from popular meeting apps – The browser extension captures meeting metadata—the meeting title and names of active speakers—and audio from you (your microphone) and others (from the meeting browser tab). As of this writing, LMA supports Chrome for the browser extension, and Zoom and Chime for meeting apps (with Teams and WebEx coming soon). Standalone meeting apps don't work with LMA—instead, launch your meetings in the browser.
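To make the assistant's mechanics concrete, the following is a minimal Python sketch—not LMA's actual code—of the kind of Knowledge Bases for Amazon Bedrock query that can power a context-aware answer. It passes recent transcript text plus the user's question to the RetrieveAndGenerate API; the knowledge base ID, model ARN, and prompt wording are placeholder assumptions.

```python
import boto3

# Placeholder IDs -- substitute your own knowledge base ID and a model you have access to.
KNOWLEDGE_BASE_ID = "JSXXXXX3D8"
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime")

def ask_assistant(question: str, recent_transcript: str) -> str:
    """Answer a question from the knowledge base, using the live transcript as extra context."""
    prompt = (
        "You are a meeting assistant. Recent meeting transcript:\n"
        f"{recent_transcript}\n\n"
        f"Question: {question}"
    )
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": prompt},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )
    return response["output"]["text"]

if __name__ == "__main__":
    print(ask_assistant("What is our refund policy?", "Customer asked about refunds..."))
```

LMA itself routes these requests through the QnABot orchestration described later in this post, but the underlying retrieval-augmented call looks broadly like this.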

You are responsible for complying with legal, corporate, and ethical restrictions that apply to recording meetings and calls. Do not use this solution to stream, record, or transcribe calls if otherwise prohibited.

Prerequisites

You need to have an AWS account and an AWS Identity and Access Management (IAM) role and user with permissions to create and manage the necessary resources and components for this application. If you don't have an AWS account, see How do I create and activate a new Amazon Web Services account?

You also need an existing knowledge base in Amazon Bedrock. If you haven't set one up yet, see Create a knowledge base. Populate your knowledge base with content to power LMA's context-aware meeting assistant.

Finally, LMA uses Amazon Bedrock LLMs for its meeting summarization features. Before proceeding, if you have not previously done so, you must request access to the following Amazon Bedrock models:

  • Titan Embeddings G1 – Text
  • Anthropic: All Claude models

Deploy the solution using AWS CloudFormation

We've provided pre-built AWS CloudFormation templates that deploy everything you need in your AWS account.

If you're a developer, and you want to build, deploy, or publish the solution from code, refer to the Developer README.

Complete the following steps to launch the CloudFormation stack:

  1. Log in to the AWS Management Console.
  2. Choose Launch Stack for your desired AWS Region to open the AWS CloudFormation console and create a new stack.
Region – Launch Stack
US East (N. Virginia)
US West (Oregon)
  3. For Stack name, use the default value, LMA.
  4. For Admin Email Address, use a valid email address—your temporary password is emailed to this address during the deployment.
  5. For Authorized Account Email Domain, use the domain name part of your corporate email address to allow users with email addresses in the same domain to create their own new UI accounts, or leave blank to prevent users from directly creating their own accounts. You can enter multiple domains as a comma-separated list.
  6. For MeetingAssistService, choose BEDROCK_KNOWLEDGE_BASE (the only available option as of this writing).
  7. For Meeting Assist Bedrock Knowledge Base Id (existing), enter your existing knowledge base ID (for example, JSXXXXX3D8). You can copy it from the Amazon Bedrock console.
  8. For all other parameters, use the default values.

If you want to customize the settings later, for example to add your own AWS Lambda functions, use custom vocabularies and language models to improve accuracy, enable personally identifiable information (PII) redaction, and more, you can update the stack for these parameters.

  9. Select the acknowledgement check boxes, then choose Create stack.

The main CloudFormation stack uses nested stacks to create the following resources in your AWS account:

The stacks take about 35–40 minutes to deploy. The main stack status shows CREATE_COMPLETE when everything is deployed.
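If you'd rather script the deployment than click through the console, you can also create the stack programmatically. The following boto3 sketch shows the general pattern only; the template URL and parameter keys here are placeholders—check the Developer README for the actual artifact locations and parameter names.

```python
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

# Placeholder template URL and parameter keys -- confirm the real ones in the LMA repo.
response = cfn.create_stack(
    StackName="LMA",
    TemplateURL="https://example-bucket.s3.amazonaws.com/lma-main.yaml",
    Parameters=[
        {"ParameterKey": "AdminEmail", "ParameterValue": "you@example.com"},
        {"ParameterKey": "AuthorizedAccountEmailDomain", "ParameterValue": "example.com"},
        {"ParameterKey": "MeetingAssistService", "ParameterValue": "BEDROCK_KNOWLEDGE_BASE"},
        {"ParameterKey": "BedrockKnowledgeBaseId", "ParameterValue": "JSXXXXX3D8"},
    ],
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM", "CAPABILITY_AUTO_EXPAND"],
)
print("Stack creation started:", response["StackId"])

# Block until the main stack reports CREATE_COMPLETE (about 35-40 minutes).
cfn.get_waiter("stack_create_complete").wait(StackName="LMA")
```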

Set your password

After you deploy the stack, open the LMA web user interface and set your password by completing the following steps:

  1. Open the email you received, at the email address you provided, with the subject "Welcome to Live Meeting Assistant!"
  2. Open your web browser to the URL shown in the email. You're directed to the login page.
  3. The email contains a generated temporary password that you use to log in and create your own password. Your user name is your email address.
  4. Set a new password.

Your new password must be at least eight characters long, and contain uppercase and lowercase characters, plus numbers and special characters.

  5. Follow the directions to verify your email address, or choose Skip to do it later.

You're now logged in to LMA.

You also received a similar email with the subject "QnABot Signup Verification Code." This email contains a generated temporary password that you use to log in and create your own password in the QnABot designer. You use the QnABot designer only if you want to customize LMA options and prompts. Your username for QnABot is Admin. You can set your permanent QnABot Admin password now, or keep this email safe in case you want to customize things later.

Download and install the Chrome browser extension

For the best meeting streaming experience, install the LMA browser plugin (currently available for Chrome):

  1. Choose Download Chrome Extension to download the browser extension .zip file (lma-chrome-extension.zip).
  2. Choose (right-click) and expand the .zip file (lma-chrome-extension.zip) to create a local folder named lma-chrome-extension.
  3. Open Chrome and enter the link chrome://extensions into the address bar.
  4. Enable Developer mode.
  5. Choose Load unpacked, navigate to the lma-chrome-extension folder (which you unzipped from the download), and choose Select. This loads your extension.
  6. Pin the new LMA extension to the browser tool bar for easy access—you'll use it often to stream your meetings!

Start using LMA

LMA provides two streaming options:

  • Chrome browser extension – Use this to stream audio and speaker metadata from your meeting browser app. It currently works with Zoom and Chime, and we hope to add more meeting apps.
  • LMA Stream Audio tab – Use this to stream audio from your microphone and any Chrome browser-based meeting app, softphone, or audio application.

We show you how to use both options in the following sections.

Use the Chrome browser extension to stream a Zoom call

Complete the following steps to use the browser extension:

  1. Open the LMA extension and log in with your LMA credentials.
  2. Join or start a Zoom meeting in your web browser (do not use the separate Zoom client).

If you already have the Zoom meeting page loaded, reload it.

The LMA extension automatically detects that Zoom is running in the browser tab, and populates your name and the meeting name.

  3. Tell others on the call that you are about to start recording the call using LMA and obtain their permission. Do not proceed if participants object.
  4. Choose Start Listening.
  5. Read and accept the disclaimer, and choose Allow to share the browser tab.

The LMA extension automatically detects and displays the active speaker on the call. If you are alone in the meeting, invite some friends to join, and observe that the names they used to join the call are displayed in the extension when they speak, and are attributed to their words in the LMA transcript.

  6. Choose Open in LMA to see your live transcript in a new tab.
  7. Choose your preferred transcript language, and interact with the meeting assistant using the wake phrase "OK Assistant!" or the Meeting Assist Bot pane.

The ASK ASSISTANT button asks the meeting assistant service (Amazon Bedrock knowledge base) to suggest a good response based on the transcript of the recent interactions in the meeting. Your mileage may vary, so experiment!

  8. When you are done, choose Stop Streaming to end the meeting in LMA.

Within a few seconds, the automated end-of-meeting summaries appear, and the audio recording becomes available. You can continue to use the bot after the call has ended.

Use the LMA UI Stream Audio tab to stream from your microphone and any browser-based audio application

The browser extension is the most convenient way to stream metadata and audio from supported meeting web apps. However, you can also use LMA to stream just the audio from any browser-based softphone, meeting app, or other audio source playing in your Chrome browser, using the Stream Audio tab that is built into the LMA UI.

  1. Open any audio source in a browser tab.

For example, this could be a softphone (such as Google Voice), another meeting app, or, for demo purposes, you can simply play a local audio recording or a YouTube video in your browser to emulate another meeting participant. If you just want to try it, open the following YouTube video in a new tab.

  2. In the LMA App UI, choose Stream Audio (no extension) to open the Stream Audio tab.
  3. For Meeting ID, enter a meeting ID.
  4. For Name, enter a name for yourself (applied to audio from your microphone).
  5. For Participant Name(s), enter the names of the participants (applied to the incoming audio source).
  6. Choose Start Streaming.
  7. Choose the browser tab you opened earlier, and choose Allow to share.
  8. Choose the LMA UI tab again to view your new meeting ID listed, showing the meeting as In Progress.
  9. Choose the meeting ID to open the details page, and watch the transcript of the incoming audio, attributed to the participant names that you entered. If you speak, you'll see the transcription of your own voice.

Use the Stream Audio feature to stream from any softphone app, meeting app, or any other streaming audio playing in the browser, along with your own audio captured from your selected microphone. Always obtain permission from others before recording them using LMA, or any other recording application.

Processing flow overview

How did LMA transcribe and analyze your meeting? Let's look at how it works. The following diagram shows the main architectural components and how they fit together at a high level.

The LMA user joins a meeting in their browser, enables the LMA browser extension, and authenticates using their LMA credentials. If the meeting app (for example, Zoom.us) is supported by the LMA extension, the user's name, meeting name, and active speaker names are automatically detected by the extension. If the meeting app is not supported by the extension, then the LMA user can manually enter their name and the meeting topic—active speakers' names will not be detected.

After getting permission from the other participants, the LMA user chooses Start Listening on the LMA extension pane. A secure WebSocket connection is established to the preconfigured LMA stack WebSocket URL, and the user's authentication token is validated. The LMA browser extension sends a START message to the WebSocket containing the meeting metadata (name, topic, and so on), and starts streaming two-channel audio from the user's microphone and the incoming audio channel containing the voices of the other meeting participants. The extension monitors the meeting app to detect active speaker changes during the call, and sends that metadata to the WebSocket, enabling LMA to label speech segments with the speaker's name.

The WebSocket server running in Fargate consumes the real-time two-channel audio fragments from the incoming WebSocket stream. The audio is streamed to Amazon Transcribe, and the transcription results are written in real time to Kinesis Data Streams.
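The actual WebSocket server is part of the open source repo; purely to illustrate the Amazon Transcribe streaming step it performs, here is a simplified Python sketch using the amazon-transcribe streaming SDK, with channel identification enabled so the microphone channel and the meeting channel are transcribed separately. The audio source generator is a placeholder.

```python
import asyncio
from amazon_transcribe.client import TranscribeStreamingClient
from amazon_transcribe.handlers import TranscriptResultStreamHandler


class PrintHandler(TranscriptResultStreamHandler):
    async def handle_transcript_event(self, transcript_event):
        # Print final results only, tagged with the audio channel they came from.
        for result in transcript_event.transcript.results:
            if not result.is_partial:
                for alt in result.alternatives:
                    print(f"[ch {result.channel_id}] {alt.transcript}")


async def transcribe(audio_chunks):
    """audio_chunks: an async iterator of raw PCM byte chunks (placeholder)."""
    client = TranscribeStreamingClient(region="us-east-1")
    stream = await client.start_stream_transcription(
        language_code="en-US",
        media_sample_rate_hz=16000,
        media_encoding="pcm",
        number_of_channels=2,
        enable_channel_identification=True,
    )

    async def send_audio():
        async for chunk in audio_chunks:
            await stream.input_stream.send_audio_event(audio_chunk=chunk)
        await stream.input_stream.end_stream()

    await asyncio.gather(send_audio(), PrintHandler(stream.output_stream).handle_events())
```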

Each meeting processing session runs until the user chooses Stop Listening in the LMA extension pane, or ends the meeting and closes the tab. At the end of the call, the function creates a stereo recording file in Amazon S3 (if recording was enabled when the stack was deployed).

A Lambda function called the Call Event Processor, fed by Kinesis Data Streams, processes and optionally enriches meeting metadata and transcription segments. The Call Event Processor integrates with the meeting assist services. LMA is powered by Amazon Lex, Knowledge Bases for Amazon Bedrock, and Amazon Bedrock LLMs using the open source QnABot on AWS solution for answers based on FAQs and as an orchestrator for request routing to the appropriate AI service. The Call Event Processor also invokes the Transcript Summarization Lambda function when the call ends, to generate a summary of the call from the full transcript.
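The Call Event Processor's real event schema lives in the LMA repo; the sketch below is only a generic illustration of a Lambda function consuming transcription events from a Kinesis data stream and dispatching them, with hypothetical event types and field names.

```python
import base64
import json

def lambda_handler(event, context):
    """Illustrative Kinesis consumer -- event types and fields are hypothetical, not LMA's actual schema."""
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        event_type = payload.get("EventType")

        if event_type == "ADD_TRANSCRIPT_SEGMENT":
            # e.g. enrich the segment, persist it via an AppSync mutation, or
            # forward it to the meeting assist (QnABot) orchestrator
            handle_transcript_segment(payload)
        elif event_type == "END":
            # e.g. kick off the transcript summarization function
            handle_call_ended(payload)

    return {"statusCode": 200}

def handle_transcript_segment(segment: dict) -> None:
    print("segment:", segment.get("Transcript"))

def handle_call_ended(call: dict) -> None:
    print("call ended:", call.get("CallId"))
```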

The Call Event Processor function interfaces with AWS AppSync to persist changes (mutations) in Amazon DynamoDB and send real-time updates to the LMA user's logged-in web clients (conveniently opened by choosing the Open in LMA option in the browser extension).

The LMA web UI assets are hosted on Amazon S3 and served via CloudFront. Authentication is provided by Amazon Cognito.

When the user is authenticated, the web application establishes a secure GraphQL connection to the AWS AppSync API, and subscribes to receive real-time events such as new calls and call status changes for the meetings list page, and new or updated transcription segments and computed analytics for the meeting details page. When translation is enabled, the web application also interacts securely with Amazon Translate to translate the meeting transcription into the selected language.
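The web application calls Amazon Translate from the browser using the JavaScript SDK; the Python equivalent below is shown only to illustrate the API involved. The target language code is an arbitrary example.

```python
import boto3

translate = boto3.client("translate")

def translate_segment(text: str, target_language: str = "es") -> str:
    """Translate one transcript segment; 'auto' lets Translate detect the source language."""
    response = translate.translate_text(
        Text=text,
        SourceLanguageCode="auto",
        TargetLanguageCode=target_language,
    )
    return response["TranslatedText"]

print(translate_segment("Let's review the action items from last week."))
```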

The entire processing flow, from ingested speech to live webpage updates, is event driven, and the end-to-end latency is short—typically just a few seconds.

Monitoring and troubleshooting

AWS CloudFormation reports deployment failures and causes on the relevant stack's Events tab. See Troubleshooting CloudFormation for help with common deployment problems. Look out for deployment failures caused by limit exceeded errors; the LMA stacks create resources that are subject to default account and Region service quotas, such as elastic IP addresses and NAT gateways. When troubleshooting CloudFormation stack failures, always navigate into any failed nested stacks to find the first nested resource failure reported—this is almost always the root cause.

Amazon Transcribe has a default limit of 25 concurrent transcription streams, which limits LMA to 25 concurrent meetings in a given AWS account or Region. Request an increase for the number of concurrent HTTP/2 streams for streaming transcription if you have many users and need to handle a larger number of concurrent meetings in your account.
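You can request that increase in the Service Quotas console, or script it as in the following sketch, which looks the quota up by name because the exact quota code isn't listed in this post.

```python
import boto3

quotas = boto3.client("service-quotas", region_name="us-east-1")

def find_streaming_quota():
    """Find the streaming-transcription concurrency quota by inspecting quota names."""
    paginator = quotas.get_paginator("list_service_quotas")
    for page in paginator.paginate(ServiceCode="transcribe"):
        for quota in page["Quotas"]:
            name = quota["QuotaName"].lower()
            if "concurrent" in name and "stream" in name:
                return quota
    return None

target = find_streaming_quota()
if target:
    print(f"{target['QuotaName']}: current value {target['Value']}")
    # Request an increase (subject to AWS approval).
    quotas.request_service_quota_increase(
        ServiceCode="transcribe",
        QuotaCode=target["QuotaCode"],
        DesiredValue=50,
    )
```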

LMA provides runtime monitoring and logs for each component using Amazon CloudWatch:

  • WebSocket processing and transcribing Fargate task – On the Amazon Elastic Container Service (Amazon ECS) console, navigate to the Clusters page and open the LMA-WEBSOCKETSTACK-xxxx-TranscribingCluster cluster. Choose the Tasks tab and open the task page. Choose Logs and View in CloudWatch to inspect the WebSocket transcriber task logs.
  • Call Event Processor Lambda function – On the Lambda console, open the LMA-AISTACK-CallEventProcessor function. Choose the Monitor tab to see function metrics. Choose View logs in CloudWatch to inspect function logs.
  • AWS AppSync API – On the AWS AppSync console, open the CallAnalytics-LMA API. Choose Monitoring in the navigation pane to see API metrics. Choose View logs in CloudWatch to inspect AWS AppSync API logs.

For QnABot on AWS for Meeting Assist, refer to the Meeting Assist README and the QnABot solution implementation guide for more information.

Cost assessment

LMA provides a WebSocket server using Fargate (2vCPU) and VPC networking resources costing about $0.10/hour (approximately $72/month). For more details, see AWS Fargate Pricing.

LMA is enabled using QnABot and Knowledge Bases for Amazon Bedrock. You create your own knowledge base, which you can use for LMA and potentially for other use cases. For more details, see Amazon Bedrock Pricing. Additional AWS services used by the QnABot solution cost about $0.77/hour. For more details, refer to the list of QnABot on AWS solution costs.

The remaining solution costs are based on usage.

The usage costs add up to about $0.17 for a 5-minute call, although this can vary based on the options selected (such as translation), the number of LLM summarizations, and total usage, because usage affects Free Tier eligibility and volume-tiered pricing for many services. For more information about the services that incur usage costs, see the following:

To explore LMA costs for yourself, use AWS Cost Explorer or choose Bill Details on the AWS Billing Dashboard to see your month-to-date spend by service.

Customize your deployment

Use the following CloudFormation template parameters when creating or updating your stack to customize your LMA deployment:

  • To use your own S3 bucket for meeting recordings, use Call Audio Recordings Bucket Name and Audio File Prefix.
  • To redact PII from the transcriptions, set Enable Content Redaction for Transcripts to true, and adjust Transcription PII Redaction Entity Types as needed. For more information, see Redacting or identifying PII in a real-time stream.
  • To improve transcription accuracy for technical and domain-specific acronyms and jargon, set Transcription Custom Vocabulary Name to the name of a custom vocabulary that you already created in Amazon Transcribe, or set Transcription Custom Language Model Name to the name of a previously created custom language model. For more information, see Improving Transcription Accuracy.
  • To transcribe meetings in a supported language other than US English, choose the desired value for Language for Transcription.
  • To customize transcript processing, optionally set Lambda Hook Function ARN for Custom Transcript Segment Processing to the ARN of your own Lambda function. For more information, see Using a Lambda function to optionally provide custom logic for transcript processing. (An illustrative sketch of such a hook follows this list.)
  • To customize the meeting assist capabilities based on the QnABot on AWS solution, Amazon Lex, Amazon Bedrock, and Knowledge Bases for Amazon Bedrock integration, see the Meeting Assist README.
  • To customize transcript summarization by configuring LMA to call your own Lambda function, see Transcript Summarization LAMBDA option.
  • To customize transcript summarization by modifying the default prompts or adding new ones, see Transcript Summarization.
  • To change the retention period, set Record Expiration In Days to the desired value. All call data is permanently deleted from the LMA DynamoDB storage after this period. Changes to this setting apply only to new calls received after the update.
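As a purely illustrative example of the transcript segment hook mentioned above, the following Lambda sketch annotates acronyms in each segment. The event and response shape here is hypothetical—the real contract is described in the linked LMA documentation.

```python
# Hypothetical transcript-segment hook: the real event/response contract is
# defined in the LMA documentation; only the overall pattern is shown here.
ACRONYMS = {"KB": "knowledge base", "SOW": "statement of work"}

def lambda_handler(event, context):
    segment = event.get("transcript", "")
    # Expand known acronyms inline so downstream summaries are clearer.
    for short, long in ACRONYMS.items():
        segment = segment.replace(short, f"{short} ({long})")
    # Return the (possibly modified) segment for LMA to use downstream.
    event["transcript"] = segment
    return event
```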

LMA is an open source project. You can fork the LMA GitHub repository, enhance the code, and send us pull requests so we can incorporate and share your improvements!

Update an existing LMA stack

You can update your existing LMA stack to the latest release. For more details, see Update an existing stack.

Clean up

Congratulations! You have completed all the steps for setting up your live meeting assistant sample solution using AWS services.

When you're finished experimenting with this sample solution, clean up your resources by using the AWS CloudFormation console to delete the LMA stacks that you deployed. This deletes the resources that were created by deploying the solution. The recording S3 buckets, the DynamoDB table, and the CloudWatch log groups are retained after the stack is deleted to avoid deleting your data.

Live Call Analytics: Companion solution

Our companion solution, Live Call Analytics and Agent Assist (LCA), offers real-time transcription and analytics for contact centers (phone calls) rather than meetings. There are many similarities—in fact, LMA was built using an architecture and many components derived from LCA.

Conclusion

The Live Meeting Assistant sample solution offers a flexible, feature-rich, and customizable approach to provide live meeting assistance to improve your productivity during and after meetings. It uses Amazon AI/ML services like Amazon Transcribe, Amazon Lex, Knowledge Bases for Amazon Bedrock, and Amazon Bedrock LLMs to transcribe and extract real-time insights from your meeting audio.

The sample LMA application is available as open source—use it as a starting point for your own solution, and help us make it better by contributing back fixes and features via GitHub pull requests. Browse to the LMA GitHub repository to explore the code, choose Watch to be notified of new releases, and check the README for the latest documentation updates.

For expert assistance, AWS Professional Services and other AWS Partners are here to help.

We'd love to hear from you. Let us know what you think in the comments section, or use the issues forum in the LMA GitHub repository.


About the authors

Bob Strahan is a Principal Solutions Architect in the AWS Language AI Services team.

Chris Lott is a Principal Solutions Architect in the AWS AI Language Services team. He has 20 years of enterprise software development experience. Chris lives in Sacramento, California, and enjoys gardening, aerospace, and traveling the world.

Babu Srinivasan is a Sr. Specialist SA – Language AI services in the World Wide Specialist organization at AWS, with over 24 years of experience in IT and the last 6 years focused on the AWS Cloud. He is passionate about AI/ML. Outside of work, he enjoys woodworking and entertains friends and family (sometimes strangers) with sleight of hand card magic.

Kishore Dhamodaran is a Senior Solutions Architect at AWS.

Gillian Armstrong is a Builder Solutions Architect. She is excited about how the cloud is opening up opportunities for more people to use technology to solve problems, and especially excited about how cognitive technologies, like conversational AI, are allowing us to interact with computers in more human ways.
