Enhance Geospatial Analysis and GIS Workflows with Amazon Bedrock Capabilities
As data becomes more abundant and data systems grow in complexity, stakeholders need solutions that surface high-quality insights. Applying emerging technologies to the geospatial domain offers a unique opportunity to create transformative user experiences and intuitive workflows that help users and organizations deliver on their missions and responsibilities.
In this post, we explore how you can integrate existing systems with Amazon Bedrock to create new workflows that unlock efficiencies and insights. This integration can benefit technical, nontechnical, and leadership roles alike.
Introduction to geospatial data
Geospatial data is data associated with a position relative to Earth (latitude, longitude, altitude). Numerical and structured geospatial data formats can be categorized as follows:
- Vector data – Geographical features, such as roads, buildings, or city boundaries, represented as points, lines, or polygons
- Raster data – Geographical information, such as satellite imagery, temperature, or elevation maps, represented as a grid of cells
- Tabular data – Location-based data, such as descriptions and metrics (average rainfall, population, ownership), represented in a table of rows and columns
Geospatial data sources may also contain natural language text elements for unstructured attributes and metadata that categorize and describe the record in question. Geographic Information Systems (GIS) provide a way to store, analyze, and display geospatial information. In GIS applications, this information is typically presented on a map to visualize streets, buildings, and vegetation.
LLMs and Amazon Bedrock
Large language models (LLMs) are a subset of foundation models (FMs) that can transform input (usually text or images, depending on model modality) into outputs (typically text) through a process called generation. Amazon Bedrock is a comprehensive, secure, and flexible service for building generative AI applications and agents.
LLMs perform well on many generalized tasks involving natural language. Some common LLM use cases include:
- Summarization – Use a model to summarize text or a document.
- Q&A – Use a model to answer questions about data or facts from context provided during training or at inference time using Retrieval Augmented Generation (RAG).
- Reasoning – Use a model to provide chain of thought reasoning to assist a human with decision-making and hypothesis evaluation.
- Data generation – Use a model to generate synthetic data for testing simulations or hypothetical scenarios.
- Content generation – Use a model to draft a report from insights derived from an Amazon Bedrock knowledge base or a user's prompt.
- AI agent and tool orchestration – Use a model to plan the invocation of other systems and processes. After other systems are invoked by an agent, the agent's output can then be used as context for further LLM generation.
GIS can implement these capabilities to create value and improve user experiences. Benefits can include:
- Live decision-making – Using real-time insights to support immediate decision-making, such as emergency response coordination and traffic management
- Research and analysis – In-depth analysis that humans or systems can perform, such as trend analysis, identifying patterns and relationships, and environmental monitoring
- Planning – Using research and analysis for informed long-term decision-making, such as infrastructure development, resource allocation, and environmental regulation
Augmenting GIS workflows with LLM capabilities leads to simpler analysis and exploration of data, discovery of new insights, and improved decision-making. Amazon Bedrock provides a way to host and invoke models as well as integrate the AI models with surrounding infrastructure, which we elaborate on in this post.
Combining GIS and AI through RAG and agentic workflows
LLMs are trained on large amounts of generalized information to discover patterns in how language is produced. To improve the performance of LLMs for specific use cases, approaches such as RAG and agentic workflows have been developed. Retrieving policies and general knowledge for geospatial use cases can be accomplished with RAG, whereas calculating and analyzing GIS data requires an agentic workflow. In this section, we expand on both RAG and agentic workflows in the context of geospatial use cases.
Retrieval Augmented Generation
With RAG, you can dynamically inject contextual information from a knowledge base during model invocation.
RAG supplements a user-provided prompt with data sourced from a knowledge base (a collection of documents). Amazon Bedrock offers managed knowledge bases backed by data sources, such as Amazon Simple Storage Service (Amazon S3) and SharePoint, so you can provide supplemental information, such as city development plans, intelligence reports, or policies and regulations, when your AI assistant generates a response for a user.
Knowledge bases are ideal for unstructured documents with information stored in natural language. When your AI model responds to a user with information sourced through RAG, it can provide references and citations to its source material. The following diagram shows how these systems connect.

Because geospatial data is often structured and stored in a GIS, you can connect the GIS to the LLM using tools and agents instead of knowledge bases.
Tools and agents (to control a UI and a system)
Many LLMs, such as Anthropic's Claude on Amazon Bedrock, make it possible to provide a description of available tools so your AI model can generate text to invoke external processes. These processes might retrieve live information, such as the current weather in a location or the results of querying a structured data store, or might control external systems, such as starting a workflow or adding layers to a map. Some common geospatial functionality that you might want to integrate with your LLM using tools includes:
- Performing mathematical calculations, like the distance between coordinates, filtering datasets based on numeric values, or calculating derived fields
- Deriving information from predictive analysis models
- Looking up points of interest in structured data stores
- Searching content and metadata in unstructured data stores
- Retrieving real-time geospatial data, like traffic, directions, or estimated time to reach a destination
- Visualizing distances, points of interest, or paths
- Submitting work outputs such as analytic reports
- Starting workflows, like ordering supplies or adjusting a supply chain
Tools are often implemented as AWS Lambda functions. Lambda runs code without the complexity and overhead of operating servers. It handles the infrastructure management, enabling faster development, improved performance, enhanced security, and cost-efficiency.
Amazon Bedrock offers the Amazon Bedrock Agents feature to simplify orchestration and integration with your geospatial tools. Amazon Bedrock agents follow instructions for LLM reasoning to break down a user prompt into smaller tasks and perform actions against identified tasks using action groups. The following diagram illustrates how Amazon Bedrock Agents works.

The following diagram shows how Amazon Bedrock Agents can enhance GIS solutions.

Solution overview
The following demonstration applies the concepts we've discussed to an example earthquake analysis agent. This example deploys an Amazon Bedrock agent with a knowledge base backed by Amazon Redshift. The Redshift instance has two tables. One table stores earthquakes, including date, magnitude, latitude, and longitude. The second table holds the counties in California, described as polygon shapes. The geospatial functions of Amazon Redshift can relate these datasets to answer queries like which county had the most recent earthquake or which county has had the most earthquakes in the last 20 years. The Amazon Bedrock agent can generate these geospatial queries from natural language.
This script creates an end-to-end pipeline that performs the following steps:
- Processes geospatial data.
- Sets up cloud infrastructure.
- Loads and configures the spatial database.
- Creates an AI agent for spatial analysis.
In the following sections, we create this agent and test it out.
Prerequisites
To implement this approach, you must have an AWS account with the appropriate AWS Identity and Access Management (IAM) permissions for Amazon Bedrock, Amazon Redshift, and Amazon S3.
Additionally, complete the following steps to set up the AWS Command Line Interface (AWS CLI):
- Confirm you have access to the latest version of the AWS CLI.
- Sign in to the AWS CLI with your credentials.
- Make sure that jq is installed. If not, use the following command:
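The install command itself was not included above; on an Amazon Linux or RHEL-based system it typically looks like the following (use apt-get on Debian or Ubuntu):

```shell
# Install jq (Amazon Linux / RHEL family); on Debian/Ubuntu use: sudo apt-get install -y jq
sudo yum install -y jq

# Confirm the installation succeeded
jq --version
```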
Set up error handling
Use the following code for the initial setup and error handling:
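The original script body is not reproduced here; the following is a minimal sketch of what such a setup block can look like (the log file name format is an assumption):

```shell
#!/bin/bash
# Timestamped log file for this run (name format is illustrative)
LOG_FILE="deploy_$(date +%Y%m%d_%H%M%S).log"
touch "$LOG_FILE"

# Exit on errors, unset variables, and failed pipeline stages
set -euo pipefail

# On any error, record the failing line number before the script terminates
trap 'echo "[ERROR] Script failed at line $LINENO" | tee -a "$LOG_FILE"' ERR

echo "Logging to $LOG_FILE" | tee -a "$LOG_FILE"
```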
This code performs the following functions:
- Creates a timestamped log file
- Sets up error trapping that captures line numbers
- Enables automatic script termination on errors
- Implements detailed logging of failures
Validate the AWS environment
Use the following code to validate the AWS environment:
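A sketch of such a validation block is shown below; it assumes live AWS credentials, so treat it as illustrative rather than as the original script:

```shell
# Verify the AWS CLI is on PATH before doing anything else
if ! command -v aws >/dev/null 2>&1; then
  echo "AWS CLI not found; install it before continuing" >&2
  exit 1
fi

# Validate credentials and capture the account ID for unique resource names
AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
echo "Using AWS account: ${AWS_ACCOUNT_ID}"
```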
This code performs the essential AWS setup verification:
- Checks the AWS CLI installation
- Validates AWS credentials
- Retrieves the account ID for resource naming
Set up Amazon Redshift and Amazon Bedrock variables
Use the following code to create the Amazon Redshift and Amazon Bedrock variables:
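The original variable block is not shown; a minimal sketch might look like the following, where every name and the model ID are assumptions to adjust for your environment:

```shell
# Deployment variables (all names and the model ID below are illustrative
# assumptions; adjust them for your environment and Region)
AWS_REGION="us-east-1"
REDSHIFT_CLUSTER_ID="geospatial-demo-cluster"
REDSHIFT_DB="geodb"
REDSHIFT_USER="awsuser"
KNOWLEDGE_BASE_NAME="earthquake-kb"
AGENT_NAME="earthquake-analysis-agent"
# Check that this model is enabled for your account in the chosen Region
BEDROCK_MODEL_ID="anthropic.claude-3-sonnet-20240229-v1:0"
```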
Create IAM roles for Amazon Redshift and Amazon S3
Use the following code to set up IAM roles for Amazon S3 and Amazon Redshift:
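A hedged sketch of the role setup follows; the role name is an assumption, and the managed read-only S3 policy stands in for whatever scoped policy the original script attached:

```shell
# Trust policy that lets Amazon Redshift assume the role
cat > /tmp/redshift-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": "redshift.amazonaws.com"},
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Create the role, then grant read access to S3 for the data loads
aws iam create-role \
  --role-name geospatial-redshift-role \
  --assume-role-policy-document file:///tmp/redshift-trust.json

aws iam attach-role-policy \
  --role-name geospatial-redshift-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess
```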
Prepare the data and Amazon S3
Use the following code to set up the data and Amazon S3 storage:
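The following sketch shows the general shape of this step; the download URLs are placeholders, not the sources used by the original script:

```shell
# Bucket names must be globally unique, so derive one from the account ID
BUCKET_NAME="geospatial-demo-${AWS_ACCOUNT_ID}"
aws s3 mb "s3://${BUCKET_NAME}" --region "${AWS_REGION}"

# Download the source datasets (URLs are illustrative placeholders;
# substitute your earthquake catalog and county boundary sources)
curl -sSL -o /tmp/earthquakes.csv "https://example.com/ca_earthquakes.csv"
curl -sSL -o /tmp/counties.json "https://example.com/ca_counties_esri.json"
```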
This code sets up data storage and retrieval through the following steps:
- Creates a unique S3 bucket
- Downloads earthquake and county boundary data
- Prepares for data transformation
Transform geospatial data
Use the following code to transform the geospatial data:
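One way to sketch this transformation is with jq; the sample below fabricates a tiny ESRI JSON input (the attribute names and single-ring geometry are assumptions about the real county file) and converts it to WKT polygons in CSV form:

```shell
# A tiny ESRI JSON sample standing in for the downloaded county file
# (attribute and field names are assumptions about the real dataset)
cat > /tmp/counties.json <<'EOF'
{"features":[{"attributes":{"NAME":"Kern"},
  "geometry":{"rings":[[[-118.0,35.0],[-118.0,36.0],[-117.0,36.0],[-118.0,35.0]]]}}]}
EOF

# Convert each feature's outer ring to a WKT POLYGON and emit CSV rows
# suitable for loading into Redshift
jq -r '.features[] |
  [.attributes.NAME,
   ("POLYGON((" + (.geometry.rings[0] | map("\(.[0]) \(.[1])") | join(",")) + "))")] |
  @csv' /tmp/counties.json > /tmp/counties.csv

cat /tmp/counties.csv
```

In the full pipeline, the resulting CSV would then be copied to the S3 bucket (for example, with aws s3 cp) before being loaded into Redshift.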
This code performs the following actions to convert the geospatial data formats:
- Transforms ESRI JSON to WKT format
- Processes county boundaries into CSV format
- Preserves spatial information for Amazon Redshift
Create a Redshift cluster
Use the following code to set up the Redshift cluster:
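A sketch of the cluster creation follows; the node type and credential handling are assumptions (store real passwords in AWS Secrets Manager rather than shell variables):

```shell
# Launch a minimal single-node cluster (node type is illustrative)
aws redshift create-cluster \
  --cluster-identifier "${REDSHIFT_CLUSTER_ID}" \
  --node-type ra3.xlplus \
  --cluster-type single-node \
  --db-name "${REDSHIFT_DB}" \
  --master-username "${REDSHIFT_USER}" \
  --master-user-password "${REDSHIFT_PASSWORD}" \
  --iam-roles "arn:aws:iam::${AWS_ACCOUNT_ID}:role/geospatial-redshift-role"

# Block until the cluster is ready to accept connections
aws redshift wait cluster-available --cluster-identifier "${REDSHIFT_CLUSTER_ID}"
```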
This code performs the following functions:
- Sets up a single-node cluster
- Configures networking and security
- Waits for cluster availability
Create a database schema
Use the following code to create the database schema:
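The schema can be sketched with the Redshift Data API as follows; the column layout is an assumption based on the datasets described in the solution overview:

```shell
# Create both tables through the Redshift Data API; the counties table
# stores polygon boundaries in Redshift's GEOMETRY type
aws redshift-data execute-statement \
  --cluster-identifier "${REDSHIFT_CLUSTER_ID}" \
  --database "${REDSHIFT_DB}" \
  --db-user "${REDSHIFT_USER}" \
  --sql "CREATE TABLE counties (name VARCHAR(100), boundary GEOMETRY);"

aws redshift-data execute-statement \
  --cluster-identifier "${REDSHIFT_CLUSTER_ID}" \
  --database "${REDSHIFT_DB}" \
  --db-user "${REDSHIFT_USER}" \
  --sql "CREATE TABLE earthquakes (event_date DATE, magnitude FLOAT, latitude FLOAT, longitude FLOAT);"
```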
This code performs the following functions:
- Creates a counties table with spatial data
- Creates an earthquakes table
- Configures appropriate data types
Create an Amazon Bedrock knowledge base
Use the following code to create a knowledge base:
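A skeleton of this call is shown below; the role name is an assumption, and the configuration JSON for a Redshift-backed (structured) knowledge base is deliberately not reproduced here because its schema is detailed in the bedrock-agent API reference:

```shell
# Create the knowledge base; kb-config.json holds the structured (Redshift)
# data store configuration, whose exact schema is documented in the
# bedrock-agent API reference and is not reproduced here
aws bedrock-agent create-knowledge-base \
  --name "${KNOWLEDGE_BASE_NAME}" \
  --role-arn "arn:aws:iam::${AWS_ACCOUNT_ID}:role/geospatial-bedrock-kb-role" \
  --knowledge-base-configuration file://kb-config.json
```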
This code performs the following functions:
- Creates an Amazon Bedrock knowledge base
- Sets up an Amazon Redshift data source
- Enables spatial queries
Create an Amazon Bedrock agent
Use the following code to create and configure an agent:
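A sketch of the agent creation follows; the role name and instruction text are assumptions, and KNOWLEDGE_BASE_ID is assumed to hold the ID returned when the knowledge base was created:

```shell
# Create the agent (role name and instruction text are illustrative)
AGENT_ID=$(aws bedrock-agent create-agent \
  --agent-name "${AGENT_NAME}" \
  --foundation-model "${BEDROCK_MODEL_ID}" \
  --agent-resource-role-arn "arn:aws:iam::${AWS_ACCOUNT_ID}:role/geospatial-bedrock-agent-role" \
  --instruction "You are a geospatial analyst. Answer questions about California earthquakes and counties using the attached knowledge base." \
  --query agent.agentId --output text)

# Attach the knowledge base to the agent's draft version
aws bedrock-agent associate-agent-knowledge-base \
  --agent-id "${AGENT_ID}" \
  --agent-version DRAFT \
  --knowledge-base-id "${KNOWLEDGE_BASE_ID}" \
  --description "California earthquake and county geospatial data" \
  --knowledge-base-state ENABLED
```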
This code performs the following functions:
- Creates an Amazon Bedrock agent
- Associates the agent with the knowledge base
- Configures the AI model and instructions
Test the solution
Let's observe the system behavior with the following natural language user inputs in the chat window.
Example 1: Summarization and Q&A
For this example, we use the prompt "Summarize which zones allow for building of an apartment."
The LLM performs retrieval with a RAG approach, then uses the retrieved residential code documents as context to answer the user's query in natural language.

This example demonstrates the LLM capabilities for hallucination mitigation, RAG, and summarization.
Example 2: Generate a draft report
Next, we enter the prompt "Write me a report on how various zones and related housing data can be used to plan new housing development to meet high demand."
The LLM retrieves relevant urban planning code documents, then summarizes the information into a standard reporting format as described in its system prompt.

This example demonstrates the LLM capabilities for prompt templates, RAG, and summarization.
Example 3: Show places on the map
For this example, we use the prompt "Show me the low density properties on Abbeville street in Macgregor on the map with their address."
The LLM creates a chain of thought to look up which properties match the user's query and then invokes the draw marker tool on the map. The LLM provides tool invocation parameters in its scratchpad, awaits the completion of those tool invocations, then responds in natural language with a bulleted list of the markers placed on the map.


This example demonstrates the LLM capabilities for chain of thought reasoning, tool use, retrieval systems using agents, and UI control.
Example 4: Use the UI as context
For this example, we choose a marker on the map and enter the prompt "Can I build an apartment here."
The "here" is not contextualized from conversation history but rather from the state of the map view. Having a state engine that can relay information from a frontend view to the LLM input provides richer context.
The LLM understands the context of "here" based on the chosen marker, performs retrieval to check the land development policy, and responds to the user in simple natural language: "No, and here is why…"

This example demonstrates the LLM capabilities for UI context, chain of thought reasoning, RAG, and tool use.
Example 5: UI context and UI control
Next, we choose a marker on the map and enter the prompt "draw a .25 mile circle around here so I can visualize walking distance."
The LLM invokes the draw circle tool to create a layer on the map centered on the chosen marker, contextualized by "here."

This example demonstrates the LLM capabilities for UI context, chain of thought reasoning, tool use, and UI control.
Clean up
To clean up your resources and prevent further AWS charges, complete the following steps:
- Delete the Amazon Bedrock knowledge base.
- Delete the Redshift cluster.
- Delete the S3 bucket.
Conclusion
The integration of LLMs with GIS creates intuitive systems that help users of varying technical levels perform complex spatial analysis through natural language interactions. By using RAG and agent-based workflows, organizations can maintain data accuracy while seamlessly connecting AI models to their existing knowledge bases and structured data systems. Amazon Bedrock facilitates this convergence of AI and GIS technology by providing a robust platform for model invocation, knowledge retrieval, and system control, ultimately transforming how users visualize, analyze, and interact with geographical data.
For further exploration, Earth on AWS has videos and articles you can explore to understand how AWS helps build GIS applications in the cloud.
About the Authors
Dave Horne is a Sr. Solutions Architect supporting Federal System Integrators at AWS. He is based in Washington, DC, and has 15 years of experience building, modernizing, and integrating systems for public sector customers. Outside of work, Dave enjoys playing with his kids, hiking, and watching Penn State football!
Kai-Jia Yue is a Solutions Architect on the Worldwide Public Sector Global Systems Integrator Architecture team at Amazon Web Services (AWS). She focuses on data analytics and helping customer organizations make data-driven decisions. Outside of work, she loves spending time with friends and family and traveling.
Brian Smitches is the Head of Partner Deployed Engineering at Windsurf, focusing on how partners can bring organizational value through the adoption of agentic AI software development tools like Windsurf and Devin. Brian has a background in cloud solutions architecture from his time at AWS, where he worked in the AWS Federal Partner ecosystem. In his personal time, Brian enjoys snowboarding, water sports, and traveling with friends and family.