Support Assistant Starter App
Overview
Create your own AI-augmented support assistant with this starter app, which ships with all the GenAI components required to integrate advanced generative AI capabilities into your app. Use it to kickstart a conversational support assistant that is grounded in private knowledge, such as static reference guides, and can take advantage of live in-app data, such as tickets.
This starter app demonstrates common generative AI patterns such as Retrieval-Augmented Generation (RAG), function calling, and knowledge base integration, using both static data, such as reference guide documents, and live in-app ticket data. It serves as a practical example of how to leverage the GenAI capabilities of platform-supported modules in a unified use case, providing a solid foundation to explore and kickstart the implementation of your own smart app.
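In the starter app, these patterns are implemented with microflows and the platform-supported GenAI modules, but the flow of a single chat turn is easy to picture in code. Below is a minimal, illustrative Python sketch of that flow, not the app's actual implementation; all names such as `retrieve_chunks`, `get_ticket`, and `call_llm` are hypothetical placeholders.

```python
# Conceptual sketch of one assistant turn combining RAG and function calling.
# Every function here is a hypothetical placeholder, not part of the starter app.

def retrieve_chunks(question: str, top_k: int = 3) -> list[str]:
    """RAG step: return the knowledge-base chunks most relevant to the question."""
    return ["Reference guide: reset a password via Settings > Security."][:top_k]

def get_ticket(ticket_id: str) -> str:
    """A 'function' the model may call to fetch live in-app ticket data."""
    return f"Ticket {ticket_id}: status=open, priority=high"

def call_llm(system: str, messages: list[dict], tools: list[dict]) -> dict:
    """Placeholder for the model call (Mendix Cloud, (Azure) OpenAI, or Bedrock)."""
    return {"tool_call": {"name": "get_ticket", "arguments": {"ticket_id": "T-123"}}}

def answer(question: str) -> str:
    # 1. Ground the prompt in private knowledge (RAG).
    context = "\n".join(retrieve_chunks(question))
    system = f"You are a support assistant. Use this context:\n{context}"
    messages = [{"role": "user", "content": question}]
    tools = [{"name": "get_ticket", "description": "Look up a support ticket"}]

    # 2. Let the model decide whether to call a function for live data.
    reply = call_llm(system, messages, tools)
    if "tool_call" in reply:
        call = reply["tool_call"]
        result = get_ticket(**call["arguments"])
        messages.append({"role": "tool", "name": call["name"], "content": result})
        # In reality, a second model call turns the tool result into the answer.
        reply = {"content": f"Based on {result}, here is what I suggest..."}

    # 3. Return the grounded, tool-informed answer.
    return reply["content"]

print(answer("What is the status of my ticket T-123?"))
```

In the app itself, the model call goes to whichever provider you configure in the setup steps below, and the function results are fed back to the model for the final answer.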
Key customization opportunities:
- Conversation Starters: Tailor the suggested conversation starters to fit your specific use cases.
- UI: Redesign the chat and overview pages to align with your needs and branding.
- Custom Assistant Logic: Integrate custom pre- or post-processing logic into the action microflow that interacts with the Large Language Model (LLM) to refine and control the interaction between the assistant and the end user.
- Private Data Integration: Combine in-app and external data sources using Function Calling, RAG, and Knowledge Base interactions to provide contextually rich and more accurate responses.
- System Prompt Adaptation and Function Calling: Modify the system prompt and available functions to leverage in-app data and perform actions on behalf of the user within the chat.
- Bring Your Own Knowledge Base: Seamlessly integrate it into the support assistant logic.
- LLM Provider Flexibility: Explore the LLM capabilities of Azure OpenAI and Amazon Bedrock or add your own LLM connector to utilize the AI model that best suits your requirements.
Explore the full potential of generative AI in your Mendix applications with this starter app template and start building today!
Documentation
Configuration
All modules needed to interact with a language model provided by Mendix Cloud, (Azure) OpenAI, or Amazon Bedrock are already installed. Out of the box, the app contains functionality to configure Mendix Cloud GenAI resources, (Azure) OpenAI, and/or Amazon Bedrock deployed models.
Feel free to add your own models or remove the existing ones.
To experience the full functionality using Mendix Cloud GenAI resources, you need to obtain and import keys for the Mendix Cloud GenAI resources (a Text Generation model and a Knowledge Base).
For (Azure) OpenAI and Amazon Bedrock text generation models, you can hook up a PostgreSQL database as a knowledge base using the included PgVectorKnowledgeBase module.
The app contains a set of first-time setup screens where an admin user must enter the configuration details.
1a. Configure the LLM
Make sure the encryption key is set and start the app.
- To use Mendix Cloud GenAI, you need to obtain and import keys for the Mendix Cloud GenAI resources.
- To use Amazon Bedrock models, configure your credentials (see AWS Authentication) before starting the application. Only the AWS region and whether to use static credentials can be selected at runtime (to check which models are available in which region, see AWS Model Support).
- To use OpenAI's models, configure access to OpenAI or Azure OpenAI at runtime (see OpenAI Configuration for details).
1b. Configure the PgVector Knowledge Base
To experience the full set of GenAI patterns that are currently possible in a Mendix app, you need to connect to a knowledge base. If you use Mendix Cloud GenAI, this is part of step 1a.
For (Azure) OpenAI, the connection to your PostgreSQL server needs to be configured at runtime. For more information, see PgVector Knowledge Base.
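The PgVectorKnowledgeBase module handles the storage and retrieval for you. Purely to illustrate what such a lookup involves, here is a hedged Python sketch of a pgvector similarity query; the connection details and the `chunks(content, embedding)` table are assumptions for illustration, not the module's actual schema.

```python
import psycopg2  # assumes a PostgreSQL server with the pgvector extension enabled

# Hypothetical connection details; the module lets you configure these at runtime.
conn = psycopg2.connect("postgresql://user:password@localhost:5432/knowledgebase")

# An embedding of the user's question, produced by the configured embeddings model.
question_embedding = [0.01, -0.02, 0.03]  # dummy vector for illustration
vector_literal = "[" + ",".join(str(x) for x in question_embedding) + "]"

with conn.cursor() as cur:
    # `<=>` is pgvector's cosine-distance operator; the closest chunks sort first.
    cur.execute(
        "SELECT content FROM chunks ORDER BY embedding <=> %s::vector LIMIT 5",
        (vector_literal,),
    )
    top_chunks = [row[0] for row in cur.fetchall()]
```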
2. Configure the Support Assistant
Before users can chat with a model, the admin needs to create a Support Assistant Configuration for users to select in the chat interface.
- Display name: for admin reference only.
- Model (text): Select the language model you want to use for the assistant chat functionality, from the set you configured in step 1.
- Knowledge base*: Select "Mendix Cloud" if you have a Mendix Cloud Knowledge Base resource; use "Other (pgvector)" in all other cases.
- Model (embeddings)*: Only applicable if you selected "Other (pgvector)". You need to select an embeddings model as configured in step 1.
- Knowledge base*: The external knowledge base to be used by the assistant, as configured in step 1a (Mendix Cloud) or 1b (Other).
*optional, but recommended
3. Populate the Knowledge Base
This is a one-time manual initialization action that must be executed, if applicable, after the knowledge base has been configured in steps 1 and 2. This brings the knowledge base in sync with the Mendix app data, so that the support assistant can search through historical tickets and reference guides.
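Under the hood, populating a knowledge base follows a chunk-embed-store pipeline, sketched below in Python for illustration only; the helper names, chunk size, and vector dimension are assumptions, not the app's actual logic.

```python
# Hypothetical sketch of the chunk -> embed -> store pipeline that the
# initialization action performs for historical tickets and reference guides.

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size pieces for embedding."""
    return [text[i : i + size] for i in range(0, len(text), size)]

def embed(piece: str) -> list[float]:
    """Placeholder for the configured embeddings model."""
    return [0.0] * 1536  # dummy vector; the real dimension depends on the model

def store(piece: str, vector: list[float]) -> None:
    """Placeholder: insert into a pgvector table, or upload to a
    Mendix Cloud knowledge base resource."""
    print(f"stored {len(vector)}-dim vector for: {piece[:40]}...")

for document in ["Reference guide text...", "Historical ticket text..."]:
    for piece in chunk(document):
        store(piece, embed(piece))
```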
Additionally, you can create starting points for the chat conversations the end users of the app will have. These conversation starters can be configured at runtime (or in the After Startup microflow). An example is included: 'I have an IT issue and I need a solution'.
Customize Support Assistant
This app serves as a starting point and there are many ways to customize your app:
- Add your custom styling (see Customize Styling for more details).
- Customize the conversation starters that are suggested in new chats.
- Redesign the overview and chat pages to fit your needs.
- Add custom pre- or post-processing logic to the action microflow that interacts with the LLM.
- Leverage in-app and external data by combining function calling, RAG, and knowledge base interactions.
- Adapt the system prompt and add or change the available functions to leverage in-app data and perform actions on behalf of the user within the chat.
- Bring your own knowledge base.
- Add your own LLM provider connector.