Thursday, October 10, 2024

Unlock AWS Cost and Usage insights with generative AI powered by Amazon Bedrock

Managing cloud costs and understanding resource usage can be a daunting task, especially for organizations with complex AWS deployments. AWS Cost and Usage Reports (AWS CUR) provides valuable data insights, but interpreting and querying the raw data can be difficult.

In this post, we explore a solution that uses generative artificial intelligence (AI) to generate a SQL query from a user's question in natural language. This solution can simplify the process of querying CUR data stored in an Amazon Athena database by generating the SQL query, running it on Athena, and presenting the results on a web portal for ease of understanding.

The solution uses Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.

Challenges addressed

The following challenges can hinder organizations from effectively analyzing their CUR data, leading to potential inefficiencies, overspending, and missed opportunities for cost optimization. We aim to address and simplify them using generative AI with Amazon Bedrock.

  • Complexity of SQL queries – Writing SQL queries to extract insights from CUR data can be complex, especially for non-technical users or those unfamiliar with the CUR data structure (unless you're a seasoned database administrator)
  • Data accessibility – To gain insights from structured data in databases, users need direct access to the databases, which can be a potential threat to overall data security
  • User-friendliness – Traditional methods of analyzing CUR data often lack a user-friendly interface, making it challenging for non-technical users to take advantage of the valuable insights hidden within the data

Solution overview

The solution that we discuss is a web application (chatbot) that allows you to ask questions related to your AWS costs and usage in natural language. The application generates SQL queries based on the user's input, runs them against an Athena database containing CUR data, and presents the results in a user-friendly format. The solution combines the power of generative AI, SQL generation, database querying, and an intuitive web interface to provide a seamless experience for analyzing CUR data.

The solution uses the following AWS services:

  • Amazon S3 – Stores the CUR data
  • Amazon Athena – Runs SQL queries against the CUR data stored in Amazon S3
  • Amazon Bedrock – Provides the FM that generates SQL queries from natural language questions
  • Amazon EC2 – Hosts the Streamlit and LangChain application

The following diagram illustrates the solution architecture.

Figure 1. Architecture of the solution

The data flow consists of the following steps:

  1. The CUR data is stored in Amazon S3.
  2. Athena is configured to access and query the CUR data stored in Amazon S3.
  3. The user interacts with the Streamlit web application and submits a natural language question related to AWS costs and usage.
Figure 2. Chatbot dashboard for asking questions

  4. The Streamlit application sends the user's input to Amazon Bedrock, and the LangChain application facilitates the overall orchestration.
  5. The LangChain code uses the BedrockChat class from LangChain to invoke the FM and interact with Amazon Bedrock to generate a SQL query based on the user's input.
Figure 3. Initialization of the SQL chain

  6. The generated SQL query is run against the Athena database, which queries the CUR data stored in Amazon S3.
  7. The query results are returned to the LangChain application.
Figure 4. Generated query in the application output logs

  8. LangChain sends the SQL query and query results back to the Streamlit application.
  9. The Streamlit application displays the SQL query and query results to the user in a formatted and user-friendly manner.
Figure 5. Final output presented on the chatbot web app, including the SQL query and the query results
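Stripped of the AWS services, the data flow above can be sketched as a few stubbed Python functions. The helper names and the hardcoded return values here are purely illustrative, not part of the solution's actual code:

```python
# Illustrative sketch of the request flow. generate_sql_with_bedrock and
# run_on_athena are hypothetical stubs standing in for the Amazon Bedrock
# and Athena calls made by the real application.
def generate_sql_with_bedrock(question: str) -> str:
    # Real solution: LangChain invokes an FM on Amazon Bedrock here.
    return "SELECT SUM(line_item_unblended_cost) FROM my_c_u_r"

def run_on_athena(sql: str) -> list:
    # Real solution: the query runs on Athena over CUR data in S3.
    return [("123.45",)]

def answer_cost_question(question: str) -> str:
    sql = generate_sql_with_bedrock(question)
    rows = run_on_athena(sql)
    # The Streamlit app renders this string as a query plus a result table.
    return f"SQLQuery: {sql}\nSQLResult: {rows}"

print(answer_cost_question("What is my total AWS cost?"))
```

The application code in the following sections implements each of these stages with LangChain, Amazon Bedrock, Athena, and Streamlit.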

Prerequisites

To set up this solution, you should have the following prerequisites:

Configure the solution

Complete the following steps to set up the solution:

  1. Create an Athena database and table to store your CUR data. Make sure that the necessary permissions and configurations are in place for Athena to access the CUR data stored in Amazon S3.
  2. Set up your compute environment to call Amazon Bedrock APIs. Make sure you associate an IAM role with this environment that has IAM policies granting access to Amazon Bedrock.
  3. When your instance is up and running, install the following libraries that are used for working within the environment:
pip install langchain==0.2.0 langchain-experimental==0.0.59 langchain-community==0.2.0 langchain-aws==0.1.4 pyathena==3.8.2 sqlalchemy==2.0.30 streamlit==1.34.0

  4. Use the following code to establish a connection to the Athena database using the langchain library and the pyathena connector, and configure the language model to generate SQL queries based on user input using Amazon Bedrock. You can save this file as cur_lib.py.
from langchain_experimental.sql import SQLDatabaseChain
from langchain_community.utilities import SQLDatabase
from sqlalchemy import create_engine, URL
from langchain_aws import ChatBedrock as BedrockChat
from pyathena.sqlalchemy.rest import AthenaRestDialect

class CustomAthenaRestDialect(AthenaRestDialect):
    def import_dbapi(self):
        import pyathena
        return pyathena

# DB Variables
connathena = "athena.us-west-2.amazonaws.com"
portathena = "443"
schemaathena = "mycur"
s3stagingathena = "s3://cur-data-test01/athena-query-result/"
wkgrpathena = "main"
connection_string = f"awsathena+rest://@{connathena}:{portathena}/{schemaathena}?s3_staging_dir={s3stagingathena}/&work_group={wkgrpathena}"
url = URL.create("awsathena+rest", query={"s3_staging_dir": s3stagingathena, "work_group": wkgrpathena})
engine_athena = create_engine(url, dialect=CustomAthenaRestDialect(), echo=False)
db = SQLDatabase(engine_athena)

# Set up the LLM
model_kwargs = {"temperature": 0, "top_k": 250, "top_p": 1, "stop_sequences": ["\n\nHuman:"]}
llm = BedrockChat(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs=model_kwargs)

# Create the prompt
QUERY = """
Create a syntactically correct Athena query for the AWS Cost and Usage Report to run on the my_c_u_r table in the mycur database based on the question, then look at the results of the query and return the answer as SQLResult like a human
{question}
"""
db_chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)

def get_response(user_input):
    question = QUERY.format(question=user_input)
    result = db_chain.invoke(question)
    query = result["result"].split("SQLQuery:")[1].strip()
    rows = db.run(query)
    return f"SQLQuery: {query}\nSQLResult: {rows}"
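Note that get_response assumes the chain's answer embeds the generated statement after a literal "SQLQuery:" marker; if the model responds in a different shape, the split raises an IndexError. The following self-contained snippet illustrates that parsing step with a mocked chain result (the sample query text is made up for illustration):

```python
# Mocked SQLDatabaseChain output; the real dict comes from db_chain.invoke().
mock_result = {"result": "SQLQuery: SELECT line_item_product_code, "
                         "SUM(line_item_unblended_cost) AS cost "
                         "FROM my_c_u_r GROUP BY line_item_product_code"}

# Same extraction as in get_response: take everything after the marker.
query = mock_result["result"].split("SQLQuery:")[1].strip()
print(query)
```

In production you may want to guard this extraction (for example, check that "SQLQuery:" is present) before running the extracted text against Athena.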

  5. Create a Streamlit web application to provide a UI for interacting with the LangChain application. Include input fields for users to enter their natural language questions, and display the generated SQL queries and query results. You can name this file cur_app.py.
import streamlit as st
from cur_lib import get_response
import os

st.set_page_config(page_title="AWS Cost and Usage Chatbot", page_icon="chart_with_upwards_trend", layout="centered", initial_sidebar_state="auto",
menu_items={
        'Get Help': 'https://docs.aws.amazon.com/cur/latest/userguide/cur-create.html',
        #'Report a bug':,
        'About': "# The purpose of this app is to help you get a better understanding of your AWS Cost and Usage report!"
    })#HTML title
st.title("_:orange[Simplify] CUR data_ :sunglasses:")

def format_result(result):
    parts = result.split("\nSQLResult: ")
    if len(parts) > 1:
        sql_query = parts[0].replace("SQLQuery: ", "")
        sql_result = parts[1].strip("[]").split("), (")
        formatted_result = []
        for row in sql_result:
            formatted_result.append(tuple(item.strip("(),'") for item in row.split(", ")))
        return sql_query, formatted_result
    else:
        return result, []

def main():
    # Get the current directory
    current_dir = os.path.dirname(os.path.abspath(__file__))
    st.markdown("", unsafe_allow_html=True)
    st.title("AWS Cost and Usage chatbot")
    st.write("Ask a question about your AWS Cost and Usage Report:")

  6. Connect the LangChain application and Streamlit web application by calling the get_response function, then format and display the SQL query and result in the Streamlit web application. Append the following code to the preceding application code:
# Create a session state variable to store the chat history
    if "chat_history" not in st.session_state:
        st.session_state.chat_history = []

    user_input = st.text_input("You:", key="user_input")

    if user_input:
        try:
            result = get_response(user_input)
            sql_query, sql_result = format_result(result)
            st.code(sql_query, language="sql")
            if sql_result:
                st.write("SQLResult:")
                st.table(sql_result)
            else:
                st.write(result)
            st.session_state.chat_history.append({"user": user_input, "bot": result})
            st.text_area("Conversation:", value="\n".join([f"You: {chat['user']}\nBot: {chat['bot']}" for chat in st.session_state.chat_history]), height=300)
        except Exception as e:
            st.error(str(e))

    st.markdown("", unsafe_allow_html=True)

if __name__ == "__main__":
    main()

  7. Deploy the Streamlit application and LangChain application to your hosting environment, such as Amazon EC2 or a Lambda function.

Clean up

Unless you invoke Amazon Bedrock with this solution, you won't incur charges for it. To avoid ongoing charges for Amazon S3 storage for saving the CUR reports, you can remove the CUR data and the S3 bucket. If you set up the solution using Amazon EC2, make sure you stop or delete the instance when you're done.

Benefits

This solution offers the following benefits:

  • Simplified data analysis – You can analyze CUR data in natural language using generative AI, eliminating the need for advanced SQL knowledge
  • Increased accessibility – The web-based interface makes it efficient for non-technical users to access and gain insights from CUR data without needing credentials for the database
  • Time savings – You can quickly get answers to your cost and usage questions without manually writing complex SQL queries
  • Enhanced visibility – The solution provides visibility into AWS costs and usage, enabling better cost-optimization and resource management decisions

Summary

The AWS CUR chatbot solution uses Anthropic Claude on Amazon Bedrock to generate SQL queries, queries the database, and presents the results through a user-friendly web interface to simplify the analysis of CUR data. By allowing you to ask natural language questions, the solution removes barriers and empowers both technical and non-technical users to gain valuable insights into AWS costs and resource usage. With this solution, organizations can make more informed decisions, optimize their cloud spending, and improve overall resource utilization. We recommend that you do due diligence while setting this up, especially for production; you can choose other programming languages and frameworks to set it up according to your preferences and needs.

Amazon Bedrock enables you to build powerful generative AI applications with ease. Accelerate your journey by following the quick start guide on GitHub and using Amazon Bedrock Knowledge Bases to rapidly develop cutting-edge Retrieval Augmented Generation (RAG) solutions, or enable generative AI applications to run multistep tasks across company systems and data sources using Amazon Bedrock Agents.


About the Author

Anutosh is a Solutions Architect at AWS India. He likes to dive deep into his customers' use cases to help them navigate through their journey on AWS. He enjoys building solutions in the cloud to help customers. He is passionate about migration and modernization, data analytics, resilience, cybersecurity, and machine learning.


