Support Assistant Starter App

Content Type: Starter App
Categories: Starter Apps, AWS, Artificial Intelligence

Overview

 

Create your own AI-augmented support assistant with this starter app, which comes with all the GenAI components required to seamlessly integrate advanced generative AI capabilities into your app. Kickstart development with a conversational support assistant that is grounded in private knowledge, such as static reference guides, and can take advantage of live in-app data, such as tickets.

 

This starter app demonstrates common generative AI patterns such as Retrieval-Augmented Generation (RAG), function calling, and knowledge base integration, utilizing both static data, such as reference guide documents, and live in-app ticket data. It serves as a practical example of how to leverage the GenAI capabilities of platform-supported modules in a unified use case, providing a solid foundation to explore and kickstart the implementation of your own smart app.
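
To make the RAG pattern concrete, here is a minimal sketch of the retrieve-then-generate loop in Python. The three helper functions are hypothetical stand-ins for operations the platform-supported modules provide (embeddings, knowledge base retrieval, and chat completion); in the app itself, these steps are microflow actions.

    def embed(text: str) -> list[float]:
        ...  # stand-in: call an embeddings model (e.g. via OpenAI or Bedrock)

    def search_knowledge_base(query_vector: list[float], top_k: int) -> list[str]:
        ...  # stand-in: similarity search in the knowledge base, returns text chunks

    def chat(prompt: str) -> str:
        ...  # stand-in: call the chat model

    def answer(question: str) -> str:
        # 1. Embed the user question and retrieve the most relevant chunks.
        chunks = search_knowledge_base(embed(question), top_k=5)
        # 2. Ground the prompt in the retrieved context.
        prompt = (
            "Answer the question using only the context below.\n\n"
            "Context:\n" + "\n\n".join(chunks) + "\n\n"
            "Question: " + question
        )
        # 3. Generate a grounded answer with the LLM.
        return chat(prompt)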

 

Key customization opportunities:

 

  • Conversation Starters: Tailor the suggested conversation starters to fit your specific use cases.
  • UI: Redesign the chat and overview pages to align with your needs and branding.
  • Custom Assistant Logic: Integrate custom pre- or post-processing logic into the action microflow that interacts with the Large Language Model (LLM) to refine and control the interaction between the assistant and the end user.
  • Private Data Integration: Combine in-app and external data sources using Function Calling, RAG, and Knowledge Base interactions to provide contextually rich and more accurate responses.
  • System Prompt Adaptation and Function Calling: Modify the system prompt and available functions to leverage in-app data and perform actions on behalf of the user within the chat.
  • Bring Your Own Knowledge Base: Seamlessly integrate it into the support assistant logic.
  • LLM Provider Flexibility: Explore the LLM capabilities of Azure OpenAI and Amazon Bedrock or add your own LLM connector to utilize the AI model that best suits your requirements.

 

Explore the full potential of generative AI in your Mendix applications with this starter app template and start building today!

 

Documentation

Configuration

 

All modules needed to interact with an LLM from OpenAI or Amazon Bedrock are already installed. The app contains functionality to configure OpenAI or Bedrock out of the box. Feel free to add your own models or remove the existing ones. To experience the full functionality, hook up a PostgreSQL database as a knowledge base using the included PgVectorKnowledgeBase module.

 

The app contains a set of first-time setup screens where an admin user must enter the configuration details.

 

1. Configure the LLM

Make sure the encryption key is set and start the app.

  • To use Amazon Bedrock models, configure your credentials (see AWS Authentication) before starting the application. Only the AWS region and whether to use static credentials can be selected at runtime (to check which models are available in which region, see AWS Model Support).
  • To use OpenAI's models, configure access to OpenAI or Azure OpenAI at runtime (see OpenAI Configuration for details). A quick way to sanity-check your OpenAI credentials beforehand is sketched below.
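
If you want to verify an OpenAI API key before entering it in the app, a minimal check with the official Python SDK could look as follows. This is independent of the Mendix configuration, and the model name is only an example:

    # Optional sanity check for an OpenAI API key (pip install openai).
    from openai import OpenAI

    client = OpenAI(api_key="sk-...")  # or set the OPENAI_API_KEY environment variable
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; use one your account has access to
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)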

 

2. Configure the PgVector Knowledge Base

To experience the full set of GenAI patterns that are currently possible in a Mendix app, a PgVector knowledge base is needed. The connection to your PostgreSQL server must be configured at runtime; for more information, see PgVector Knowledge Base. Under the hood, such a knowledge base is a PostgreSQL database with the pgvector extension enabled; the kind of query it answers is sketched below.
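
For context only, this is roughly the similarity query a pgvector-backed knowledge base runs. The connection details and the table and column names (knowledge_chunks, chunk_text, embedding) are hypothetical; the module handles all of this for you:

    # Illustrative pgvector similarity search (pip install psycopg2-binary).
    import psycopg2

    conn = psycopg2.connect(
        host="your-postgres-host", port=5432,
        dbname="knowledgebase", user="mendix", password="...",
    )
    # A query vector as produced by an embeddings model (shortened here).
    query_vector = "[" + ",".join(map(str, [0.012, -0.034, 0.056])) + "]"
    with conn, conn.cursor() as cur:
        # '<=>' is pgvector's cosine-distance operator; lower = more similar.
        cur.execute(
            "SELECT chunk_text FROM knowledge_chunks "
            "ORDER BY embedding <=> %s::vector LIMIT 5;",
            (query_vector,),
        )
        top_chunks = [row[0] for row in cur.fetchall()]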

 

3. Configure the Support Assistant

Before users can chat with a model, the admin needs to create a Support Assistant Configuration for users to select in the chat interface.

 

  • Display name: for admin reference only.
  • Architecture & Model (chat): OpenAI or Bedrock and the specific model to be used for the chat (from step 1).
  • Architecture & Model (embeddings): required if a knowledge base is used for similarity search and RAG purposes (from step 1).
  • Knowledge base (if applicable): the PgVector configuration that points to the external database to be used as the knowledge base (from step 2).

 

4. Populate the Knowledge Base

This is a one-time manual initialization action that must be executed, if applicable, after the knowledge base has been configured in steps 2 and 3. That way, the knowledge base is in sync with the Mendix app data, and the support assistant can search through historical tickets and reference guides. Conceptually, the initialization embeds each document and stores the text together with its vector, as sketched below.
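
For intuition, a population step of this kind could look as follows in Python; the chunking is deliberately naive, embed is a stand-in for an embeddings model call, and the table matches the hypothetical schema from the earlier sketch:

    # Conceptual sketch of populating a pgvector knowledge base.
    def embed(text: str) -> list[float]:
        ...  # stand-in: call an embeddings model (OpenAI or Bedrock)

    def chunk(document: str, size: int = 1000) -> list[str]:
        # Naive fixed-size chunking; real splitters respect sentence boundaries.
        return [document[i:i + size] for i in range(0, len(document), size)]

    def populate(cur, documents: list[str]) -> None:
        for doc in documents:
            for piece in chunk(doc):
                vector = "[" + ",".join(map(str, embed(piece))) + "]"
                cur.execute(
                    "INSERT INTO knowledge_chunks (chunk_text, embedding) "
                    "VALUES (%s, %s::vector);",
                    (piece, vector),
                )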

 

Additionally, you can create starting points for the chat conversations that the end users of the app will have. These conversation starters can be configured at runtime (or in the after-startup microflow). An example is included: 'I have an IT issue and I need a solution'.

 

Customize Support Assistant

This app serves as a starting point, and there are many ways to customize it:

 

  • Add your custom styling (see Customize Styling for more details).
  • Customize the conversation starters that are suggested in new chats.
  • Redesign the overviews and the chat page to your needs.
  • Add custom pre- or post-processing logic to the action microflow that interacts with the LLM.
  • Leverage in-app and external data by combining function calling, RAG, and knowledge base interactions.
  • Adapt the system prompt and add or change the available functions to leverage in-app data and perform actions on behalf of the user within the chat (see the sketch after this list).
  • Bring your own knowledge base.
  • Add your own LLM provider connector.
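
To illustrate the function-calling point above: in the OpenAI chat completions API, a function is exposed to the model as a tool definition like the one below. The ticket-lookup function and its parameters are hypothetical, and in the starter app the equivalent is wired up via microflows rather than raw JSON:

    # Hypothetical tool definition in the OpenAI chat completions format.
    get_ticket_status_tool = {
        "type": "function",
        "function": {
            "name": "get_ticket_status",
            "description": "Look up the current status of a support ticket.",
            "parameters": {
                "type": "object",
                "properties": {
                    "ticket_id": {
                        "type": "string",
                        "description": "Identifier of the ticket, e.g. 'T-1234'.",
                    },
                },
                "required": ["ticket_id"],
            },
        },
    }
    # Passed to the model via: client.chat.completions.create(..., tools=[get_ticket_status_tool])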

Releases

Version: 1.1.0
Framework Version: 10.12.4
Release Notes: We have added a page for admin users to monitor usage data. Token consumption data will be visible when LLM calls are executed using a compatible connector (storage of token consumption data is currently supported by the OpenAI connector v3.3.0 and up). Furthermore, we have updated the following Marketplace modules:
  • AWS Authentication Connector (3.1.3)
  • ConversationalUI (1.4.0)
  • Data widgets (2.23.0)
  • Feedback Module (3.0.0)
  • OpenAI connector (3.3.0)
  • GenAICommons (1.3.0)
Version: 1.0.0
Framework Version: 10.12.4
Release Notes: Initial release