Snowflake Mendix Data Loader

Content Type: Solution
Categories: Snowflake,Connectors


The Snowflake Mendix Data Loader is an application deployed in an organization's Snowflake environment. It helps ingest entity data exposed through OData into Snowflake.

You must be logged in to your Snowflake account to view or download the listing.

To install the Mendix Data Loader, you need the privileges of the Snowflake `ACCOUNTADMIN` role.



The Mendix Data Loader is a technical component designed to facilitate the ingestion of data from operational Mendix applications via an exposed OData service into your Snowflake account. This enables organizations to leverage operational data for broader analytical and reporting purposes, enhancing business intelligence capabilities.


Usage Instructions


Once the Mendix Data Loader is deployed, follow these steps to get started:


1. Access the README: Upon starting, the application will display this README file included in the application package.

2. Open the Application: Click the `MENDIX_DATA_LOADER` tab in the header to open the application interface.

3. Grant Privileges: In the application interface, a pop-up will appear requesting the `CREATE DATABASE` privilege, which is necessary for operating the application.

4. Configure the Mendix Data Loader:

   - This step is only necessary the first time you ingest from a given endpoint. If you have ingested from this endpoint before, proceed to step 6.

    - Fill in the form fields with the required information:

        1. Endpoint: The base endpoint for the OData resource in your Mendix application.

        2. Username: The username for the basic authentication into the OData resource in your Mendix application.

        3. Password: The password for the basic authentication into the OData resource in your Mendix application.

        4. Target database name: The name of the database into which the data will be ingested.

        5. Target schema name: The schema into which all the data will be ingested. Every time an ingestion is performed, any data already present in the target schema is removed and replaced.

5. Generate and Execute the Script:

    - Click the `Generate Script` button to produce the SQL script.

    - The script must be executed by a user with the following privileges: `CREATE SECRET`, `CREATE NETWORK RULE`, and `CREATE EXTERNAL ACCESS INTEGRATION`.

6. Ingest Data: After setting up, use the `Ingest Data` button to start the data transfer. All ingested data is stored in transient tables.

7. View Ingested Data: The ingested data will be available in the schema that was specified inside the specified target database.
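Under the hood, each ingestion boils down to authenticated OData requests against the configured endpoint. As a rough sketch of the basic-authentication setup from step 4 (the username and password below are hypothetical, and the loader builds this internally), the required request header can be constructed like this:

```python
import base64

def odata_request_headers(username: str, password: str) -> dict:
    # Basic authentication: base64-encode "username:password" and send it
    # in the Authorization header, as the exposed OData service expects.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}", "Accept": "application/json"}

# Hypothetical credentials, matching the form fields in step 4.
headers = odata_request_headers("ingest_user", "s3cret")
```

The same username/password pair you enter in the form is what ends up encoded in this header on every request.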


Note that the ingested data is stored in the target schema of the target database specified by the user and created by the Mendix Data Loader. This target schema serves as a staging area, so after every ingestion you should copy its tables into the database and schema where you want to keep the ingested data.
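Because the staging schema is overwritten on every ingestion, the copy step above is typically one CTAS statement per table. A minimal sketch that generates such a statement (all database, schema, and table names are hypothetical):

```python
def copy_table_sql(staging_db: str, staging_schema: str, table: str,
                   target_db: str, target_schema: str) -> str:
    # CREATE OR REPLACE TABLE ... AS SELECT makes a permanent copy of a
    # transient staging table; run one such statement per ingested table.
    return (
        f"CREATE OR REPLACE TABLE {target_db}.{target_schema}.{table} "
        f"AS SELECT * FROM {staging_db}.{staging_schema}.{table};"
    )

print(copy_table_sql("MENDIX_STAGING", "INGESTED", "CUSTOMER",
                     "ANALYTICS", "MENDIX"))
```

Running the generated statements in a worksheet after each ingestion preserves a durable copy before the next run replaces the staging tables.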

At present, the Mendix Data Loader ingests all the data exposed by the OData service published by the Mendix application. If you do not want to ingest all of the data, you must filter it on the Mendix/OData side.




Features


- Data Ingestion: Enables seamless data transfer from Mendix applications to Snowflake.


Current Limitations


1. At present, the Mendix Data Loader supports only username/password authentication, so make sure to select this setting when setting up your OData service.

2. The way recommended by Mendix to expose an association in an OData service is as a link. This is not yet supported by the Mendix Data Loader. Please choose the "As an associated object id" option in your OData settings. This option stores the associated object ID in the table, but not explicitly as a foreign key.

3. Only single-endpoint (OData) ingestion is supported. To ingest data from multiple endpoints, ingest them one by one, and make sure to assign a different staging schema to every ingestion, or previous ingestions will be overwritten. Functionality to ingest multiple endpoints in one go is on the roadmap.

4. Scheduling of ingestion jobs is not yet supported as a feature of the Mendix Data Loader. It can already be done, however, using a Snowflake worksheet. We plan to make this possible from within the application in the future.
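For the worksheet-based approach, a Snowflake TASK can re-run an ingestion on a CRON schedule. The sketch below only assembles the SQL text; the task name, warehouse, and the statement that actually triggers your ingestion are all hypothetical and depend on your setup:

```python
def ingestion_task_sql(task_name: str, warehouse: str, cron: str,
                       statement: str) -> str:
    # Builds a CREATE TASK statement that runs `statement` on the given
    # CRON schedule (UTC). `statement` is whatever SQL or stored-procedure
    # call triggers the re-ingestion in your environment (hypothetical here).
    return (
        f"CREATE OR REPLACE TASK {task_name}\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"  SCHEDULE = 'USING CRON {cron} UTC'\n"
        f"AS\n"
        f"  {statement};"
    )

print(ingestion_task_sql("NIGHTLY_INGEST", "COMPUTE_WH", "0 2 * * *",
                         "CALL MY_DB.PUBLIC.RUN_INGESTION()"))
```

Remember that a newly created task is suspended until you run `ALTER TASK ... RESUME`.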



- For the best performance, configure the exposed OData entities with as little pagination as possible. With large enough amounts of data, the Mendix Data Loader will run into an error because the JSON response becomes too large to parse. When you run into this problem, introduce paging on the exposed OData entities. For the best performance, make the pages as large as possible while still keeping the JSON small enough to parse.

Note that in Mendix version 10.10.0, pagination of exposed entities via OData does not work. This is a known issue and will hopefully be resolved soon.
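To make the paging trade-off concrete: when an OData service paginates, each response carries an `@odata.nextLink` pointing at the next page, and a client keeps following that link until it is absent. Smaller pages mean smaller JSON documents to parse, at the cost of more round trips. A minimal sketch with a simulated two-page response (no real HTTP; `fetch` stands in for an authenticated GET that returns parsed JSON):

```python
def collect_pages(first_page: dict, fetch):
    # Follow OData server-driven paging: accumulate the "value" rows of
    # each page and chase "@odata.nextLink" until no link remains.
    rows = list(first_page.get("value", []))
    next_link = first_page.get("@odata.nextLink")
    while next_link:
        page = fetch(next_link)
        rows.extend(page.get("value", []))
        next_link = page.get("@odata.nextLink")
    return rows

# Simulated pages for illustration; real links would be full URLs.
pages = {"page2": {"value": [{"id": 3}]}}
first = {"value": [{"id": 1}, {"id": 2}], "@odata.nextLink": "page2"}
rows = collect_pages(first, pages.__getitem__)
```

Each page's JSON must be parsed in full, which is why oversized pages cause the parse error described above.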


For additional troubleshooting and support, contact our development team at




Interested in contributing to the Mendix Data Loader? Please reach out to our development team for more information on how you can contribute.




The licensing details for the Mendix Data Loader are documented under the Mendix EULA.


Contact Information


For support or any queries regarding the Mendix Data Loader, please contact our development team:


Version: 1.0.0
Release Notes: Initial Release