Amazon Bedrock Connector

Content Type: Module
Categories: Connectors, AWS, Artificial Intelligence

Overview

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, Mistral AI and Amazon via a single API, along with a broad set of capabilities you need to build generative AI applications, simplifying development while maintaining privacy and security.

The Amazon Bedrock Connector allows you to integrate with Amazon Bedrock.

As of version 3.0.0, the connector depends on the GenAICommons module. Since version 4.0.0, it provides the following out-of-the-box implementations:

  • ChatCompletions (without history) (Converse API)
  • ChatCompletions (with history) (Converse API)
  • Retrieve and Generate
  • Retrieve
  • Image generation (Amazon Titan)
  • Embeddings (Single String input) (Cohere Embed + Amazon Titan Embeddings v2)
  • Embeddings (Chunk collection input) (Cohere Embed + Amazon Titan Embeddings v2)

 

Thanks to these implementations, the large language models available via Amazon Bedrock can be used out of the box in both the GenAI Showcase App and the AI Bot Starter App. See the GenAICommons documentation for more information on how to build the request structure required to call the operations.
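
The ChatCompletions operations are built on the Amazon Bedrock Converse API. Purely as an illustration of what happens underneath, the sketch below shows a direct Converse call with the AWS SDK for Java 2.x (the SDK the connector uses internally); the region and model ID are placeholders, and in a Mendix app you would call the GenAI Commons operations from a microflow rather than writing this code.

    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
    import software.amazon.awssdk.services.bedrockruntime.model.ContentBlock;
    import software.amazon.awssdk.services.bedrockruntime.model.ConversationRole;
    import software.amazon.awssdk.services.bedrockruntime.model.ConverseResponse;
    import software.amazon.awssdk.services.bedrockruntime.model.Message;

    public class ConverseSketch {
        public static void main(String[] args) {
            // Build a Bedrock runtime client; credentials come from the default AWS provider chain.
            try (BedrockRuntimeClient client = BedrockRuntimeClient.builder()
                    .region(Region.US_EAST_1) // placeholder region
                    .build()) {

                // A single user message, comparable to a "ChatCompletions (without history)" call.
                Message userMessage = Message.builder()
                        .role(ConversationRole.USER)
                        .content(ContentBlock.fromText("Summarize Amazon Bedrock in one sentence."))
                        .build();

                // Call the Converse API with a placeholder model ID.
                ConverseResponse response = client.converse(request -> request
                        .modelId("anthropic.claude-3-haiku-20240307-v1:0") // placeholder model ID
                        .messages(userMessage));

                // Print the first text block of the assistant's reply.
                System.out.println(response.output().message().content().get(0).text());
            }
        }
    }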

 

In addition, the connector provides the following actions:

  • List Foundation Models
  • Invoke Model (generic)
  • StartIngestionJob and GetIngestionJob (to sync changes to Knowledge Bases; see the sketch after this list)
  • List Knowledge Bases
  • Invoke Agent
  • Get Agent
  • List Agents
  • Create / Delete / Get DataSource
  • List DataSources
  • List Ingestion Jobs
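
As a rough illustration of what the ingestion job actions correspond to on the AWS side, the sketch below starts an ingestion job and then checks its status with the AWS SDK for Java 2.x; the knowledge base ID, data source ID, and region are placeholders, and a real implementation would poll until the job finishes.

    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.bedrockagent.BedrockAgentClient;
    import software.amazon.awssdk.services.bedrockagent.model.GetIngestionJobResponse;
    import software.amazon.awssdk.services.bedrockagent.model.StartIngestionJobResponse;

    public class IngestionJobSketch {
        public static void main(String[] args) {
            try (BedrockAgentClient client = BedrockAgentClient.builder()
                    .region(Region.US_EAST_1) // placeholder region
                    .build()) {

                // Start an ingestion job to sync new or changed documents into the knowledge base.
                StartIngestionJobResponse started = client.startIngestionJob(request -> request
                        .knowledgeBaseId("KNOWLEDGE_BASE_ID") // placeholder
                        .dataSourceId("DATA_SOURCE_ID"));     // placeholder

                String jobId = started.ingestionJob().ingestionJobId();

                // Check the job once; poll until it completes in a real implementation.
                GetIngestionJobResponse current = client.getIngestionJob(request -> request
                        .knowledgeBaseId("KNOWLEDGE_BASE_ID")
                        .dataSourceId("DATA_SOURCE_ID")
                        .ingestionJobId(jobId));

                System.out.println("Ingestion job status: " + current.ingestionJob().statusAsString());
            }
        }
    }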

 

For more information about our AWS Connector strategy, please see the Mendix Evaluation Guide.

 

Attention:

For Studio Pro (modeler) versions from 9.24.2 up to 9.24.29, remove the older versions of the libraries manually from the userlib folder to avoid Java compilation errors.

 

Contact Us

For support and questions, feel free to reach out via email at genai-components-feedback@mendix.com or on the Mendix Community Slack.

Documentation

Please see the documentation on Mendix Docs.

Releases

Version: 6.3.0
Framework Version: 9.24.2
Release Notes: We added operations to manage Bedrock data sources from the connector, including CreateDataSource (S3 and Confluence), DeleteDataSource, GetDataSource, ListDataSources, and ListIngestionJobs.
Version: 6.2.1
Framework Version: 9.24.2
Release Notes: Bug fix - we fixed an issue that showed consistency errors in the UI when importing the connector.
Version: 6.2.0
Framework Version: 9.24.2
Release Notes: We added admin capabilities to manage deployed models at runtime, so that admins can change existing models or add custom models. This makes it possible to support models that are available via cross-region inference or provisioned throughput.
Version: 6.1.0
Framework Version: 9.24.2
Release Notes: - Function microflows have been updated to accept “Tool” and/or “Request” as optional input parameters. - The function calling bug for microflows with no input parameters has been fixed. This ensures that microflows without input parameters now execute correctly, improving overall reliability and performance.
Version: 6.0.0
Framework Version: 9.24.2
Release Notes: The connector was made compatible with the simplified toolbox of the latest GenAI Commons release. Therefore, the operations for Chat Completions, Embeddings, and Image Generation were moved to GenAI Commons operations; to use them, pass a DeployedModel instead of a Connection object. For existing implementations, the following needs to be done: 1. Sync the new Deployed Models in the UI (use SNIP_Settings_Admin_MetaDataOverview on your own page). 2. In custom microflows, replace the Amazon Bedrock operations for chat completions, image generation, and embeddings with the equivalent operations from GenAI Commons; these can be found in the Studio Pro Toolbox under the category “GenAI (Generate)”. 3. The new operations require a DeployedModel as input; for Bedrock this needs to be of type BedrockDeployedModel. For more information, see the example microflows or the GenAI Showcase App.
Version: 5.4.0
Framework Version: 9.24.2
Release Notes: Token monitoring is now available for Chat Completions actions and Embeddings.
Version: 5.3.2
Framework Version: 9.24.2
Release Notes: Bug fix: resolved the empty "Prompt Template" error.
Version: 5.3.1
Framework Version: 9.24.2
Release Notes: -"Retrieve and Generate" action now supports ''Prompt Templates" - All the widgets have been upgraded
Version: 5.3.0
Framework Version: 9.24.2
Release Notes: - The Amazon Bedrock Connector has been updated to Mendix version 9.24.2. - The Bedrock SDK was updated to version 2.27.17 to be in line with the SDK version of the AWS Authentication 3.2.0 module. - Some jar files were centralized in the AWS Authentication connector. - Improved handling of enumerations was introduced.
Version: 5.2.1
Framework Version: 9.24.2
Release Notes: - An issue was fixed for ChatCompletions_withoutHistory_AmazonBedrock where the wrong $Request input parameter was used.