robertopc1/Redis_LLMmemory

 
 

Use Azure Managed Redis to store LLM chat history

A Streamlit-based web app written in Python that uses Azure Managed Redis and Azure OpenAI Service to create a simple multi-user chatbot with chat memory.

Features

This project builds on a basic chatbot example and adds several features:

  • Multiple users, each with their own chat memory and settings
  • A running count of the tokens contained in the stored chat memory
  • The ability to keep only the last n messages, trimming the chat memory context
  • A configurable time-to-live (TTL) for chat memory, unique to each user
  • Configurable system instructions to change how the LLM responds to questions
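The trimming and token-counting features above can be sketched in plain Python. This is an illustrative sketch, not the project's actual code: the whitespace-based token count is a rough stand-in for a real tokenizer, and the message format follows the common role/content chat convention.

```python
def trim_history(messages: list[dict], n: int) -> list[dict]:
    """Keep only the last n messages of the chat history."""
    return messages[-n:] if n > 0 else []


def estimate_tokens(messages: list[dict]) -> int:
    """Very rough token estimate: whitespace-split word count.
    A real app would use the model's tokenizer instead."""
    return sum(len(m["content"].split()) for m in messages)


history = [
    {"role": "user", "content": "What is Redis?"},
    {"role": "assistant", "content": "Redis is an in-memory data store."},
    {"role": "user", "content": "Can it store chat history?"},
]

trimmed = trim_history(history, 2)
print(len(trimmed))              # 2
print(estimate_tokens(trimmed))  # 11
```

Trimming to the last n messages bounds both the prompt size sent to the LLM and the token count reported to the user.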

Architecture

The example uses three Azure components:

  1. Azure App Service to host the application.
  2. Azure OpenAI Service to deploy an LLM (in this case, GPT-4o) to respond to user queries.
  3. Azure Managed Redis to store the chat history, set TTL, and hold other configurable system information.

The secret sauce of this example is Redis, which is extremely flexible and ideal for these types of use cases. We also use the wonderful RedisVL library to make plugging in Redis easy. Azure Managed Redis is the newest managed Redis service on Azure and offers everything we need from Redis as a managed offering at a very attractive price.
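One plausible way to model this in Redis (a sketch, not the app's actual schema): each user's history is a Redis list under a hypothetical chat:<user_id> key, trimmed with LTRIM and expired per user with EXPIRE. A tiny in-memory stub stands in for a real client so the sketch runs without a server; the app itself uses RedisVL, and against a live instance you would issue the same commands via redis-py.

```python
import json
import time


class FakeRedis:
    """In-memory stand-in for a Redis client; method names mirror
    the Redis commands RPUSH, LTRIM, EXPIRE, and LRANGE."""

    def __init__(self):
        self.lists, self.ttls = {}, {}

    def rpush(self, key, value):
        self.lists.setdefault(key, []).append(value)

    def ltrim(self, key, start, stop):
        items = self.lists.get(key, [])
        stop = len(items) if stop == -1 else stop + 1
        self.lists[key] = items[start:stop]

    def expire(self, key, seconds):
        self.ttls[key] = time.time() + seconds

    def lrange(self, key, start, stop):
        items = self.lists.get(key, [])
        stop = len(items) if stop == -1 else stop + 1
        return items[start:stop]


def append_message(r, user_id, role, content, max_messages=20, ttl_seconds=3600):
    key = f"chat:{user_id}"          # hypothetical key scheme: one list per user
    r.rpush(key, json.dumps({"role": role, "content": content}))
    r.ltrim(key, -max_messages, -1)  # keep only the last max_messages entries
    r.expire(key, ttl_seconds)       # per-user TTL on the whole history
    return key


r = FakeRedis()
for i in range(3):
    append_message(r, "alice", "user", f"message {i}", max_messages=2)
print([json.loads(m)["content"] for m in r.lrange("chat:alice", 0, -1)])
# ['message 1', 'message 2']
```

Because the TTL applies to the whole key, each user's entire history expires together, which matches the per-user TTL feature described above.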

Getting Started

Prerequisites

  • An Azure subscription
  • The Azure Developer CLI (azd) installed

Deploy the project using the Azure Developer CLI

The Azure Developer CLI (azd) is a super useful tool for provisioning resources and deploying code on Azure with minimal fuss. Follow these instructions to get going:

  1. Clone the repository and change to the main project folder, where the azure.yaml file is located:
    git clone https://github.com/robertopc1/Redis_LLMmemory.git
    cd Redis_LLMmemory
    
  2. Run:
    azd up
    
  3. Follow the command prompts to enter an environment name and select a subscription.
  4. This will create all the resources needed to run the sample:
  • Azure App Service Web App
  • Azure OpenAI Service
  • Azure Managed Redis
  5. AZD will also deploy the code to the App Service instance. Once it has completed, open the default domain (ending in .azurewebsites.net) and the app should be running!

Cleaning Up Resources

To clean up the environment, run azd down.

Guidance

Costs

Pricing varies per region and usage, so it isn't possible to predict exact costs for your usage. All of the Azure resources used in this infrastructure are on usage-based pricing tiers.

You can try the Azure pricing calculator for the resources:

  • Azure AI Services: S0 tier, gpt-4o; pricing is based on token count.
  • Azure Managed Redis: B0 tier.
  • Azure App Service: B2 service plan (Linux).

⚠️ To avoid unnecessary costs, remember to take down your app if it's no longer in use, either by deleting the resource group in the Portal or running azd down.

Security guidelines

This template uses Managed Identities on all the services.

You may want to consider additional security measures appropriate for your environment.

Resources

This template creates everything you need to get started with Azure Managed Redis as LLM memory.
