Unlock AWS Cost and Usage insights with generative AI powered by Amazon Bedrock


Managing cloud costs and understanding resource usage can be a daunting task, especially for organizations with complex AWS deployments. AWS Cost and Usage Reports (AWS CUR) provide valuable data insights, but interpreting and querying the raw data can be challenging.

In this post, we explore a solution that uses generative artificial intelligence (AI) to generate a SQL query from a user's question in natural language. The solution simplifies the process of querying CUR data stored in an Amazon Athena database: it generates the SQL query, runs it on Athena, and presents the results on a web portal for ease of understanding.

The solution uses Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.

Challenges addressed

The following challenges can hinder organizations from effectively analyzing their CUR data, leading to potential inefficiencies, overspending, and missed opportunities for cost optimization. We aim to address and simplify them using generative AI with Amazon Bedrock.

  • Complexity of SQL queries – Writing SQL queries to extract insights from CUR data can be complex, especially for non-technical users or those unfamiliar with the CUR data structure (unless you're a seasoned database administrator)
  • Data accessibility – To gain insights from structured data in databases, users need access to the databases, which can be a potential threat to overall data security
  • User-friendliness – Traditional methods of analyzing CUR data often lack a user-friendly interface, making it challenging for non-technical users to take advantage of the valuable insights hidden within the data

Solution overview

The solution we discuss is a web application (chatbot) that lets you ask questions about your AWS costs and usage in natural language. The application generates SQL queries based on the user's input, runs them against an Athena database containing CUR data, and presents the results in a user-friendly format. The solution combines the power of generative AI, SQL generation, database querying, and an intuitive web interface to provide a seamless experience for analyzing CUR data.

The solution uses the following AWS services:

  • Amazon Bedrock
  • Amazon Athena
  • Amazon Simple Storage Service (Amazon S3)
  • Amazon Elastic Compute Cloud (Amazon EC2) or AWS Lambda to host the web application

The following diagram illustrates the solution architecture.

Figure 1. Architecture of Solution

The data flow consists of the following steps:

  1. The CUR data is stored in Amazon S3.
  2. Athena is configured to access and query the CUR data stored in Amazon S3.
  3. The user interacts with the Streamlit web application and submits a natural language question related to AWS costs and usage.
Figure 2. Chatbot dashboard for asking questions

  4. The Streamlit application sends the user's input to Amazon Bedrock, and the LangChain application facilitates the overall orchestration.
  5. The LangChain code uses the BedrockChat class from LangChain to invoke the FM and interact with Amazon Bedrock to generate a SQL query based on the user's input (a hypothetical example of such a generated query is shown after these steps).
Figure 3. Initialization of the SQL chain

  6. The SQL query generated by the FM on Amazon Bedrock is run against the Athena database, which queries the CUR data stored in Amazon S3.
  7. The query results are returned to the LangChain application.
Figure 4. Generated SQL query in the application output logs

  8. LangChain sends the SQL query and query results back to the Streamlit application.
  9. The Streamlit application displays the SQL query and query results to the user in a formatted and user-friendly manner.
Figure 5. Final output presented in the chatbot web app, including the SQL query and the query results
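
For illustration only, the following is a minimal sketch of the kind of query the FM might generate for a question like "What were my top five services by cost last month?" The my_c_u_r table and line_item_* columns follow the standard CUR-on-Athena layout used later in this post; the year and month filters are assumptions, so adjust everything to match your own CUR table.

# Hypothetical example of a generated query; your CUR table and partitions may differ.
sample_generated_sql = """
SELECT line_item_product_code,
       SUM(line_item_unblended_cost) AS total_cost
FROM my_c_u_r
WHERE year = '2024' AND month = '4'
GROUP BY line_item_product_code
ORDER BY total_cost DESC
LIMIT 5;
"""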

Prerequisites

To set up this solution, you should have the following prerequisites:

  • AWS CUR enabled, with the report data delivered to an Amazon S3 bucket
  • Access to Amazon Bedrock FMs (this post uses Anthropic Claude 3 Sonnet) in your AWS Region
  • A compute environment, such as an Amazon EC2 instance, with an IAM role that can call Amazon Bedrock and Athena

Configure the solution

Complete the following steps to set up the solution:

  1. Create an Athena database and table to store your CUR data. Make sure that the necessary permissions and configurations are in place for Athena to access the CUR data stored in Amazon S3.
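
The exact table definition depends on how your CUR export is delivered (file format, report path, partitioning), and the CUR Athena integration can generate it for you. The following is a minimal sketch only; it reuses the mycur database, my_c_u_r table, and staging bucket that appear in the code later in this post, while the s3://cur-data-test01/cur/ data prefix and the column subset are assumptions.

import boto3

# Minimal sketch -- assumptions: us-west-2, a Parquet CUR export under the
# hypothetical prefix s3://cur-data-test01/cur/, and only a few of the CUR columns.
athena = boto3.client("athena", region_name="us-west-2")

def run_ddl(sql: str) -> str:
    response = athena.start_query_execution(
        QueryString=sql,
        ResultConfiguration={"OutputLocation": "s3://cur-data-test01/athena-query-result/"},
    )
    return response["QueryExecutionId"]

run_ddl("CREATE DATABASE IF NOT EXISTS mycur")
run_ddl("""
CREATE EXTERNAL TABLE IF NOT EXISTS mycur.my_c_u_r (
    line_item_usage_start_date timestamp,
    line_item_product_code string,
    line_item_usage_type string,
    line_item_unblended_cost double
)
STORED AS PARQUET
LOCATION 's3://cur-data-test01/cur/'
""")
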
  2. Set up your compute environment to call Amazon Bedrock APIs. Make sure you associate an IAM role with this environment that has IAM policies granting access to Amazon Bedrock (a quick way to verify that access is shown after the library installation in the next step).
  3. When your instance is up and running, install the following libraries, which are used for working within the environment:
pip install langchain==0.2.0 langchain-experimental==0.0.59 langchain-community==0.2.0 langchain-aws==0.1.4 pyathena==3.8.2 sqlalchemy==2.0.30 streamlit==1.34.0
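
Optionally, you can confirm from this environment that the IAM role can reach Amazon Bedrock before building anything else. This is a minimal sketch; the us-west-2 Region matches the code later in this post, and the model check assumes you plan to use Anthropic Claude 3 Sonnet as shown below.

import boto3

# Sanity check: list the foundation models visible to this role (assumes us-west-2).
bedrock = boto3.client("bedrock", region_name="us-west-2")
models = bedrock.list_foundation_models()["modelSummaries"]
print(f"{len(models)} foundation models visible")

# Confirm the Anthropic Claude 3 Sonnet model used later in this post is listed.
print(any(m["modelId"].startswith("anthropic.claude-3-sonnet") for m in models))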

  4. Use the following code to establish a connection to the Athena database using the langchain and pyathena libraries, and configure the language model to generate SQL queries based on user input using Amazon Bedrock. You can save this file as cur_lib.py.
from langchain_experimental.sql import SQLDatabaseChain
from langchain_community.utilities import SQLDatabase
from sqlalchemy import create_engine, URL
from langchain_aws import ChatBedrock as BedrockChat
from pyathena.sqlalchemy.rest import AthenaRestDialect

# Provide import_dbapi() on the dialect (expected by SQLAlchemy 2.x) returning pyathena
class CustomAthenaRestDialect(AthenaRestDialect):
    def import_dbapi(self):
        import pyathena
        return pyathena

# DB Variables
connathena = "athena.us-west-2.amazonaws.com"
portathena = "443"
schemaathena = "mycur"
s3stagingathena = "s3://cur-data-test01/athena-query-result/"
wkgrpathena = "primary"
connection_string = f"awsathena+rest://@{connathena}:{portathena}/{schemaathena}?s3_staging_dir={s3stagingathena}/&work_group={wkgrpathena}"
url = URL.create("awsathena+rest", query={"s3_staging_dir": s3stagingathena, "work_group": wkgrpathena})
engine_athena = create_engine(url, dialect=CustomAthenaRestDialect(), echo=False)
db = SQLDatabase(engine_athena)

# Setup LLM
model_kwargs = {"temperature": 0, "top_k": 250, "top_p": 1, "stop_sequences": ["\n\nHuman:"]}
llm = BedrockChat(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs=model_kwargs)

# Create the prompt
QUERY = """
Create a syntactically correct athena query for AWS Cost and Usage report to run on the my_c_u_r table in mycur database based on the question, then look at the results of the query and return the answer as SQLResult like a human
{question}
"""
db_chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)

def get_response(user_input):
    question = QUERY.format(question=user_input)
    result = db_chain.invoke(question)
    query = result["result"].split("SQLQuery:")[1].strip()
    rows = db.run(query)
    return f"SQLQuery: {query}\nSQLResult: {rows}"

  5. Create a Streamlit web application to provide a UI for interacting with the LangChain application. Include input fields for users to enter their natural language questions, and display the generated SQL queries and query results. You can name this file cur_app.py.
import streamlit as st
from cur_lib import get_response
import os

st.set_page_config(page_title="AWS Cost and Usage Chatbot", page_icon="chart_with_upwards_trend", layout="centered", initial_sidebar_state="auto",
menu_items={
        'Get Help': 'https://docs.aws.amazon.com/cur/latest/userguide/cur-create.html',
        #'Report a bug':,
        'About': "# The aim of this app is to help you get a better understanding of your AWS Cost and Usage report!"
    })#HTML title
st.title("_:orange[Simplify] CUR data_ :sunglasses:")

def format_result(result):
    # Split the combined "SQLQuery: ... SQLResult: ..." string returned by get_response
    parts = result.split("\nSQLResult: ")
    if len(parts) > 1:
        sql_query = parts[0].replace("SQLQuery: ", "")
        sql_result = parts[1].strip("[]").split("), (")
        formatted_result = []
        for row in sql_result:
            formatted_result.append(tuple(item.strip("(),'") for item in row.split(", ")))
        return sql_query, formatted_result
    else:
        return result, []

def main():
    # Get the current directory
    current_dir = os.path.dirname(os.path.abspath(__file__))
    st.markdown('<div class="main">', unsafe_allow_html=True)
    st.title("AWS Cost and Usage chatbot")
    st.write("Ask a question about your AWS Cost and Usage Report:")

  6. Connect the LangChain application and the Streamlit web application by calling the get_response function, then format and display the SQL query and result in the Streamlit web application. Append the following code to the preceding application code:
# Create a session state variable to store the chat history
    if "chat_history" not in st.session_state:
        st.session_state.chat_history = []

    user_input = st.text_input("You:", key="user_input")

    if user_input:
        try:
            result = get_response(user_input)
            sql_query, sql_result = format_result(result)
            st.code(sql_query, language="sql")
            if sql_result:
                st.write("SQLResult:")
                st.table(sql_result)
            else:
                st.write(result)
            st.session_state.chat_history.append({"user": user_input, "bot": result})
            st.text_area("Conversation:", value="\n".join([f"You: {chat['user']}\nBot: {chat['bot']}" for chat in st.session_state.chat_history]), height=300)
        except Exception as e:
            st.error(str(e))

    st.markdown("</div>", unsafe_allow_html=True)

if __name__ == "__main__":
    main()

  7. Deploy the Streamlit application and LangChain application to your hosting environment, such as Amazon EC2 or an AWS Lambda function.

Clean up

Unless you invoke Amazon Bedrock with this solution, you won't incur charges for it. To avoid ongoing charges for the Amazon S3 storage used to save the CUR reports, you can remove the CUR data and the S3 bucket. If you set up the solution using Amazon EC2, make sure to stop or delete the instance when you're done.
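
If you prefer to script the cleanup, the following minimal sketch removes the CUR objects and bucket and stops an EC2 instance. The bucket name and instance ID are placeholders; confirm them before deleting anything, and note that a versioned bucket also needs its object versions removed.

import boto3

# Placeholders only -- replace with your own bucket name and instance ID.
s3 = boto3.resource("s3")
bucket = s3.Bucket("cur-data-test01")
bucket.objects.all().delete()   # remove CUR reports and Athena query results
bucket.delete()                 # delete the now-empty bucket

ec2 = boto3.client("ec2", region_name="us-west-2")
ec2.stop_instances(InstanceIds=["i-0123456789abcdef0"])  # or terminate_instances(...)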

Benefits

This solution offers the following benefits:

  • Simplified data analysis – You can analyze CUR data in natural language using generative AI, eliminating the need for advanced SQL knowledge
  • Increased accessibility – The web-based interface makes it straightforward for non-technical users to access and gain insights from CUR data without needing credentials for the database
  • Time savings – You can quickly get answers to your cost and usage questions without manually writing complex SQL queries
  • Enhanced visibility – The solution provides visibility into AWS costs and usage, enabling better cost-optimization and resource management decisions

Summary

The AWS CUR chatbot solution uses Anthropic Claude on Amazon Bedrock for SQL query generation, combined with database querying and a user-friendly web interface, to simplify the analysis of CUR data. By letting you ask questions in natural language, the solution removes barriers and empowers both technical and non-technical users to gain valuable insights into AWS costs and resource usage. With this solution, organizations can make more informed decisions, optimize their cloud spending, and improve overall resource utilization. We recommend that you do due diligence while setting this up, especially for production; you can choose other programming languages and frameworks to build it according to your preferences and needs.

Amazon Bedrock enables you to build powerful generative AI applications with ease. Accelerate your journey by following the quick start guide on GitHub and using Amazon Bedrock Knowledge Bases to rapidly develop cutting-edge Retrieval Augmented Generation (RAG) solutions, or enable generative AI applications to run multistep tasks across company systems and data sources using Amazon Bedrock Agents.


About the Author

Anutosh is a Solutions Architect at AWS India. He loves to dive deep into his customers' use cases to help them navigate through their journey on AWS. He enjoys building solutions in the cloud to help customers. He is passionate about migration and modernization, data analytics, resilience, cybersecurity, and machine learning.
