OpenAI Showcase App

Content Type: Sample
Categories: Starter Apps,Artificial Intelligence

Overview


A showcase application that demonstrates how to use the OpenAI Connector, PgVector Knowledge Base, GenAI Commons, and Conversational UI to integrate generative AI - the technology powering ChatGPT - into a Mendix app.

This project contains a variety of example use cases:

  • Interactive chatbot with history
  • Product description generation
  • Text complexity reduction
  • Text to JSON transformation
  • Demo data creation
  • Postcard (image) generation
  • Embedding vector generation
  • Retrieval Augmented Generation in a chatbot scenario
  • Clustering of unstructured text data
  • Semantic search (books, single-column)
  • Semantic search (tickets, single-column, multi-column and multi-language)
  • Function calling with chat completions
  • Vision with chat completions
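Several of these use cases (semantic search, retrieval augmented generation) boil down to ranking stored text chunks by how close their embedding vectors are to the query's embedding. The showcase implements this with microflows and the PgVector Knowledge Base; as a rough conceptual sketch in plain Python (the function names and toy vectors below are illustrative, not part of the app):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, chunks, k=2):
    """Return the texts of the k chunks most similar to the query vector."""
    ranked = sorted(
        chunks,
        key=lambda c: cosine_similarity(query_vec, c["vector"]),
        reverse=True,
    )
    return [c["text"] for c in ranked[:k]]

# Toy 3-dimensional "embeddings" (real embedding models return 1,000+ dimensions).
chunks = [
    {"text": "shipping policy", "vector": [0.9, 0.1, 0.0]},
    {"text": "refund policy",   "vector": [0.1, 0.9, 0.0]},
    {"text": "login help",      "vector": [0.0, 0.1, 0.9]},
]
print(top_k([0.8, 0.2, 0.0], chunks, k=1))  # → ['shipping policy']
```

In the app itself, the embedding call and the similarity ranking are performed by connector operations, not custom code.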


How to get started

  1. Download the app package (.mpk file) from this page
  2. Open the app in Studio Pro (double-click the MPK to import it)
  3. Run the app & view it
  4. Log in
  5. Configure the connection with API credentials from an OpenAI account* or an Azure account.
  6. Try out the example use cases!


You can find technical documentation about the OpenAI Connector, PgVector Knowledge Base, GenAI Commons, and Conversational UI on MxDocs.


* If you have signed up for an OpenAI account and are using free trial credits, note that these are only valid for three months after the account was created (not after the API key was created). For more details, see the OpenAI API reference.

Documentation

Use this app as an example of what you can do with the OpenAI Connector and how to use the connector in your own project.


The majority of the showcase examples in this app follow the principles of the GenAI Commons module, which means these example use cases are automatically compatible with other GenAI connectors that follow the same principles. This app shows how to build vendor-agnostic GenAI use cases that integrate with operations from, for example, both the OpenAI Connector and the Amazon Bedrock Connector at the same time.
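The vendor-agnostic idea can be pictured as programming against a shared interface rather than against one vendor's API. The following is a loose Python analogy, not the GenAI Commons API itself; the class and function names are invented for illustration:

```python
from abc import ABC, abstractmethod

class ChatConnector(ABC):
    """A common chat-completion interface, analogous in spirit to GenAI Commons."""
    @abstractmethod
    def chat(self, prompt: str) -> str: ...

class OpenAIConnector(ChatConnector):
    def chat(self, prompt: str) -> str:
        # A real implementation would call the OpenAI API here.
        return f"[openai] {prompt}"

class BedrockConnector(ChatConnector):
    def chat(self, prompt: str) -> str:
        # A real implementation would call Amazon Bedrock here.
        return f"[bedrock] {prompt}"

def summarize(connector: ChatConnector, text: str) -> str:
    """A vendor-agnostic use case: it depends only on the shared interface."""
    return connector.chat(f"Summarize: {text}")

print(summarize(OpenAIConnector(), "hello"))   # → [openai] Summarize: hello
print(summarize(BedrockConnector(), "hello"))  # → [bedrock] Summarize: hello
```

Because `summarize` only knows the interface, swapping the model vendor does not change the use-case logic, which is the same property the showcase demonstrates with microflows.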


You can find technical documentation about the OpenAI Connector, PgVector Knowledge Base, GenAI Commons, and Conversational UI on MxDocs.

Note: Each version of the Showcase app is only guaranteed to be compatible with the corresponding version of the OpenAI Connector that is included in the app package. If you update the connector without updating the Showcase app, the example implementations may break. We therefore recommend updating to the newest version of the Showcase app rather than updating the OpenAI Connector module on its own.

Releases

Version: 3.1.0
Framework Version: 9.24.2
Release Notes: We have updated the GenAI-related marketplace modules to the following versions:
  • GenAI Commons 1.1.0
  • OpenAI Connector 3.1.0
  • AWS Authentication Connector 3.1.1
  • Amazon Bedrock Connector 4.0.0
  • Conversational UI 1.2.0
  • PgVector Knowledge Base 2.0.0
The embeddings and image generation showcases are now compatible with GenAI Commons. As a result, both OpenAI and Amazon Bedrock models can be selected for the model interactions.
Version: 3.0.0
Framework Version: 9.24.2
Release Notes:
  • The OpenAI Connector has been updated to version 3.0.0, so that all chat showcase examples are based on the GenAI Commons module.
  • The Amazon Bedrock Connector has been imported, and all chat completion examples can now be used with Anthropic Claude and Amazon Titan models.
  • All chat examples are now based on the Conversational UI module.
  • We have improved the UX of the homepage and some existing examples.
Version: 2.7.0
Framework Version: 9.24.0
Release Notes: The Showcase app has been updated to include the OpenAI Connector v2.7.0 and the PgVector Knowledge Base v1.2.0. We added a data manipulation page to the semantic search ticket example, illustrating how changes to Mendix objects influence their corresponding records in a knowledge base; the underlying microflows make use of operations from the PgVector Knowledge Base. Furthermore, we created a new example demonstrating chat completions with vision, where users can either take a picture with their webcam or upload an image from their drive and ask questions about it. Lastly, we removed outdated OpenAI models from the models enumeration.
Version: 2.6.0
Framework Version: 9.24.0
Release Notes: The Showcase app has been updated to include the OpenAI Connector v2.6.0 and the PgVector Knowledge Base v1.1.0. We've added an example to illustrate the new OpenAI Connector function calling capabilities, which enable the LLM to intelligently decide when to call a predefined function microflow to gather additional information to include in the assistant response. In this example, users can ask questions about tickets in the application database. Additionally, we have replaced parts of the implementations of the two similarity search examples with the new operations from the PgVector Knowledge Base. The association between the chunk objects and the Mendix objects for which the chunks were created is now set automatically when the chunks are retrieved.
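The function calling flow described above amounts to: the model returns a structured request naming a function and its arguments, the app executes the matching function microflow, and the result is sent back to the model so it can compose the final response. A minimal Python sketch of the dispatch step follows; the registry and the ticket lookup are hypothetical stand-ins, as the real flow is handled by the connector's microflows:

```python
import json

# Stand-in for a "function microflow" the model may request (hypothetical example).
def get_ticket_status(ticket_id: str) -> str:
    tickets = {"T-1": "open", "T-2": "closed"}
    return tickets.get(ticket_id, "unknown")

# Registry mapping the function names exposed to the model onto implementations.
FUNCTIONS = {"get_ticket_status": get_ticket_status}

def dispatch(tool_call_json: str) -> str:
    """Execute the function the model asked for; its return value would then
    be sent back to the model to ground the final assistant response."""
    call = json.loads(tool_call_json)
    func = FUNCTIONS[call["name"]]
    return func(**json.loads(call["arguments"]))

# A tool call in the shape the model might return it (name + JSON-encoded arguments).
result = dispatch('{"name": "get_ticket_status", "arguments": "{\\"ticket_id\\": \\"T-1\\"}"}')
print(result)  # → open
```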
Version: 2.5.0
Framework Version: 9.24.0
Release Notes: We have replaced all vector database interaction logic for storing and retrieving knowledge base chunks with operations from the new PgVector Knowledge Base module, so developers no longer need to write custom query code. As a result, the example queries have been removed from the pages. It is now also possible to store connection details for multiple vector databases in the same Mendix app.
Version: 2.4.0
Framework Version: 9.24.0
Release Notes: We have enhanced the semantic search example by suggesting search terms to the user and adding tickets in different languages to the dataset. This shows that OpenAI's LLMs are capable of handling non-English language tasks. As of this release, we support the text-embedding-3-large model in all embeddings examples. This model is particularly suitable for tasks involving multiple languages. In addition, selected configurations are now saved in the database for each example and operation, in order to ensure consistency and reduce repeated setup effort.
Version: 2.3.0
Framework Version: 9.24.0
Release Notes: We created an additional example showing semantic search through a dataset of IT helpdesk tickets. It demonstrates how the OpenAI Connector combined with vector database logic can be leveraged to perform semantic search on data records in an application. Two variations are demonstrated: one embeds a single column for the search, while the other uses a multi-column approach. Furthermore, the values of the OpenAI model enumerations have been updated: we added gpt-4-turbo-preview, gpt-4-1106-preview, and text-embedding-3-small, and we removed the legacy models gpt-3.5-turbo-16k and gpt-4-32k. Lastly, the Database Connector module has been updated to the newest released version.
Version: 2.2.0
Framework Version: 9.24.0
Release Notes: We created two additional examples demonstrating new use cases of vector embeddings. The first example shows how you can identify and visualize clusters within a textual dataset. The second use case implements a semantic search based on similarity of vector embeddings. Lastly, we made small improvements to the existing examples.
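Clustering embedding vectors means grouping them by proximity in vector space. The release notes do not state which algorithm the example uses; as a generic illustration only, here is a minimal k-means over toy 2-D points (real embeddings have hundreds of dimensions, but the idea is identical):

```python
import math
import random

def kmeans(points, k, iterations=20, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*cluster)) if cluster else centroids[i]
            for i, cluster in enumerate(clusters)
        ]
    return clusters

# Two well-separated groups stand in for clusters of similar texts.
points = [(0.1, 0.1), (0.2, 0.0), (0.0, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```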
Version: 2.1.0
Framework Version: 9.24.0
Release Notes: We made a number of small improvements to clarify how the underlying interactions with the large language models work. In the example implementations “Product description” and “Complexity reduction”, the user prompt and system prompt are now shown in the UI before the model is invoked. These prompts were also improved to protect against prompt injection. Furthermore, we have extended the chat implementation with an optional JSON mode: it is now possible to instruct the model to respond with JSON in the chatbot-like setting, while also taking the chat history into account.
Version: 2.0.0
Framework Version: 9.24.0
Release Notes: We added two example implementations based on the Embeddings API. In a smaller technical example, the Embeddings API is invoked for a list of String inputs, and their embedding vectors are retrieved and shown in the UI. In a second, more complex example, an implementation of retrieval augmented generation (RAG) is demonstrated. For this use case to work, you will need to provide an external PostgreSQL database that has the ‘pgvector’ extension installed. You’ll find detailed instructions on this in the UI of the running app.
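The final step of a RAG implementation is assembling a prompt that grounds the model in the retrieved chunks. A minimal sketch of that assembly in plain Python (the prompt wording is illustrative, not the app's actual template):

```python
def build_rag_prompt(question, retrieved_chunks):
    """Combine retrieved knowledge-base chunks with the user question so the
    model answers from the supplied context rather than from memory alone."""
    context = "\n".join(f"- {c}" for c in retrieved_chunks)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

prompt = build_rag_prompt(
    "What is the refund window?",
    ["Refunds are accepted within 30 days.", "Shipping takes 3-5 days."],
)
print(prompt)
```

In the showcase, the chunks come from the pgvector similarity search and the assembled prompt is sent to the model via the connector.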