OpenAI Stream Chat
Overview
OpenAI Stream Chat Widget
A powerful custom widget that brings advanced OpenAI capabilities directly into your Mendix applications. This widget goes beyond basic LLM integrations by supporting both the Chat Completions and Responses APIs with real-time streaming.
Key Features:
- Dual API Support: Choose between OpenAI's Chat Completions API and the Responses API based on your needs
- Real-time Streaming: Watch AI responses appear incrementally as they stream in, with built-in Server-Sent Events (SSE) support
- Flexible Configuration: Pass a custom JSON request body to adjust parameters such as reasoning effort and response format, without having to set up a domain model or import/export mappings (see the example after this list)
- Developer-Friendly: Simple configuration through Mendix Studio Pro with clear parameter inputs for the API key, system prompt, and request payload
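For example, a reasoning-effort setting can be passed straight through in the request body; the exact shape depends on which API you select (model names below are illustrative only):

Chat Completions API: { "model": "o4-mini", "stream": true, "reasoning_effort": "high" }

Responses API: { "model": "o4-mini", "stream": true, "reasoning": { "effort": "high" } }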
Perfect for:
- Building conversational AI interfaces in Mendix apps
- Creating intelligent assistants with streaming responses
- Implementing custom AI workflows that require advanced OpenAI parameters
- Developers who need flexibility beyond standard Mendix AI connectors
This widget removes the complexity of handling streaming responses and advanced API parameters in Mendix, providing a seamless bridge between your low-code platform and cutting-edge AI capabilities.
Documentation
Typical usage scenario
Add a chat interface to your Mendix application that connects directly to OpenAI's API with real-time streaming responses. The widget handles the complexity of Server-Sent Events and provides a ready-to-use chat UI.
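For reference, streamed Chat Completions responses arrive as Server-Sent Events: each event carries a JSON chunk with a small content delta, roughly like the abbreviated example below, and the stream ends with a [DONE] marker. The widget parses these chunks and appends each delta to the current message as it arrives.

data: {"choices":[{"delta":{"content":"Hello"}}]}

data: {"choices":[{"delta":{"content":" there"}}]}

data: [DONE]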
Features
✅ Complete chat interface with message history
✅ Real-time streaming responses
✅ Support for Chat Completions and Responses APIs
✅ Configurable through JSON for API parameters
✅ Debug mode for troubleshooting
✅ Trigger Mendix actions on message events
Limitations
❌ Requires OpenAI API key (not included)
❌ No built-in conversation persistence (session-based only)
❌ No file upload/attachment support
❌ Single conversation thread per widget instance
Dependencies
- Mendix Studio Pro 10.x or higher
- OpenAI API key
- Internet connection
Installation
1. Import the .mpk file into your project
2. Find "OpenAI Chat" in the widget toolbox
3. Drag it onto your page
4. Configure the required properties
Configuration
Required settings:
- API Key: Your OpenAI API key
- API Type: Choose "Chat Completions" or "Responses"
Optional settings:
- Request Configuration: JSON string with model and parameters (default: {"model": "gpt-4", "stream": true})
- System Prompt: Set the assistant's behavior
- Max Messages: Limit displayed messages (default: 50)
- Max/Min Height: Control widget size (e.g., "500px")
- Debug Mode: Log API calls to console
Example request configuration (Responses API):
{ "model": "o4-mini", "reasoning": { "effort": "high" } }
For more information about OpenAI's API, see the documentation: https://platform.openai.com/docs/