
Introduction

The Dataset Module is a versatile tool within the FrameworX platform, specifically designed for data exchange with SQL databases and text files from a diverse set of sources. This module shapes the foundational structure for data interaction, enabling other real-time modules on the platform to function effectively. It offers seamless compatibility with a wide array of database technologies, including ADO.NET, ODBC, OleDB, and native interfaces, providing straightforward configuration with prominent databases such as SQL Server, Oracle, SQLite, and PostgreSQL.

Designed with a focus on real-time applications, the Dataset Module incorporates a suite of essential features. These include multi-threaded concurrent connections for efficient data handling, a dedicated SQLite editor for streamlined database management, and an intuitive visual query builder that simplifies the process of query creation and manipulation. Moreover, the Dataset Module is equipped to integrate real-time tags within query strings, enhancing the dynamism of data interaction. It can also manage files and recipes in ASCII, Unicode, or XML formats, broadening its applicability across different data types. With these features, the Dataset Module offers an efficient, user-friendly interface for data exchange, thereby amplifying its functionality and enhancing the user experience.

On this page:

Table of Contents
maxLevel3


 

Purpose and Key Concepts

The purpose of the Dataset Module in FrameworX is to facilitate efficient data exchange between the platform and various external databases and text files, thereby creating a robust data infrastructure that empowers other real-time modules to function effectively.

Dataset DBs

In order for the Dataset Module to communicate with an external database, a connection must be created with certain parameters. These connections, which are created within the Datasets → DBs section of the module, are referred to as Dataset DBs.

Dataset Queries

In the context of the Dataset Module, a Dataset Query refers not only to an SQL query string, but also to a Project object that has a logical name, an SQL query related to that logical name, and other parameters defined within the Datasets → Queries section. There are many ways to automatically map the results of a query execution with Tags.

Dataset Tables

A Dataset Table is a logical name that is created within a project to set up access to a specific table in a connected database. These tables are listed within the Datasets → Tables section of the module. The Tags in the real-time database can easily be mapped to columns in the tables to perform insert, update, or read operations.

Dataset Files

A Dataset File is a logical name that defines parameters for reading and writing files in ASCII, Unicode, or XML formats.

Using Tags in the Dataset Module

In the Dataset Module, you can utilize Tags from your project as dynamic placeholders within SQL queries or file names. This capability facilitates real-time data manipulation within the query strings. Moreover, you can map the results of the queries to Tags, or employ Tags to populate values when inserting or modifying existing records. 


Understanding the Datasets Module

Overview

The Dataset Module serves as a robust and flexible framework for data interaction. It provides seamless connectivity with a variety of databases, facilitating the execution of SQL queries and manipulation of data to suit your application's requirements. The module executes data exchange through a combination of queries and/or stored procedures, which can be performed on-demand, scheduled, or triggered by specific project events.

Features Highlights

  • SQL Query Support: Enables efficient data extraction, manipulation, and transformation from a diverse set of sources.
  • Integration with External Data Sources: Provides seamless data collection and analysis from a wide range of external sources, including SQL databases and CSV files.
  • Visual Query Editor: Facilitates the creation and editing of advanced SQL queries, eliminating the need for extensive SQL knowledge.
  • Distributed Architecture: Supports concurrent requests and offers flexibility in processing data either on the client or server side. 
  • Stored Procedures Execution: Encourages advanced data manipulation and analysis within the platform.
  • Real-Time Execution:  Enables execution of SQL queries with real-time data, offering customization of parameters and timely analysis of generated data.

Data Utilization

The data retrieved from databases can be utilized in various ways throughout your project:

  • In the Displays Module: Visualization tools like DataGrids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
  • In the Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
  • In the Alarm Module: Data can be used to configure or update alarm conditions, triggers, and actions, enabling the creation of dynamic alarm systems that respond to data changes.

In a similar fashion, various modules can add data to your SQL databases. For instance, the Displays Module can log operator inputs and commands, the Scripting Module can calculate analytics, and the Alarm and Historian Modules can retain long-term records and audit trails. Essentially, the Dataset Module drives bi-directional real-time communication between all Project Modules and the SQL databases.

Processing Data Requests

Datasets Data Server Service

The Datasets Data Server Service forms an integral part of the Datasets Module, ensuring high performance and seamless integration while exchanging structured collections of data in real-time. It efficiently processes data requests from other modules, enabling connectivity with various components such as HMI/SCADA screens, scripting, and reporting tools.

Default Databases Used when Running the Project

The Dataset Module also serves as a data storage configuration hub for other modules. The Alarm and Historian modules, which generate events and records for long-term retention, utilize the Dataset Module for defining data storage. Similarly, the Security Module can utilize user definitions from an external database defined within the Dataset Module.

Data Source Virtualization

Data Source Virtualization is an advanced feature that simplifies data management across multiple data sources. It provides a unified interface for accessing, querying, and manipulating data, regardless of the underlying data storage technology. This feature allows modifications to the underlying data sources repositories without impacting the rest of the application.

Agnostic, Standards, Centralized Management

Adhering to industry standards, the module is designed to work seamlessly with various data storage technologies. It enables centralized management of data connections and configurations, streamlining the process of integrating different data sources into your projects.

For a more detailed explanation of how the Dataset Module works, please refer to the page Advanced Dataset Operation.


Configuring the Dataset Module

Learn how to connect to data sources, create queries, and optimize performance for efficient data management.

This section provides essential guidance for setting up and customizing the Dataset Module, including:


Anchor
Workflow
Workflow
Configuration Workflow

The typical configuration workflow for the Dataset Module has the following sequence:

Create the required database connections (DBs)

  1. Go to Datasets → DBs
  2. Collect the information to connect with the databases required for your project.
  3. Use the built-in SQLite database as a temporary development tool if one of your connected databases is not available yet.
  4. Thanks to the virtualization model, queries and tables use logical names, so your project can later switch to the production database without changing anything in the Project Configuration other than the database connection itself.

Prepare the Queries the Project uses

  1. Go to Datasets → Queries
  2. Either use the Visual Query Editor or get the query string from your IT department or plant facilitator.
  3. Create the logical names (Dataset.Query objects) that will identify those queries.

Modify the Query to add real-time tags

  1. Go to Datasets → Queries
  2. Modify the query so that its parameters are connected with real-time values.
  3. For instance, a query that has the text "WHERE col1 = 5" can be modified to "WHERE col1 = {{tag.Test}}". The value of the Tag will be added to the proper position when the query is executed.
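Since the SqlStatement is also exposed as a property, the steps above can be sketched in a script as well (the query and tag names below are hypothetical placeholders, not part of any real project):

```csharp
// Sketch only: rebuild a query string with a Tag placeholder from a script.
// "Query1" and "tag.Test" are hypothetical names.
@Dataset.Query.Query1.SqlStatement =
    "SELECT * FROM MyTable WHERE col1 = {tag.Test}";
// When the query executes, the placeholder is replaced by the Tag's current value.
```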

Prepare the Tables the Project uses

  1. Go to Datasets → Tables
  2. When you need to Insert or Modify data, you need to access the Database Table directly.
  3. In some cases, all the information you need is in one table, so there is no need to create a Query.
  4. You can easily connect the contents that are inserted in the table with Tags in the Project.

Configure the Stored Procedures

  1. Go to Datasets → Queries
  2. The Dataset Module can execute Stored Procedures; define them using the same interface used for queries.

Configure data exchange with Files

  1. Go to Datasets → Files
  2. If you need to exchange Tag values with plain text or XML files, set up that configuration here.

Use your Dataset logical objects

  1. The logical object names created for Queries, Tables, and Files can be used in any part of the project.
  2. Examples: Script calculation, Display visualization, and others.



Anchor
Connections
Connections
Managing DB Connections

When using SQLite databases, the Dataset Module can automatically create the database if necessary; for other database types, the database itself must already exist before you set up your connection. Users in any Permission group can create new connections in the Project, but only an Administrator can configure database password logins.

To create a new Database connection:

  1. Go to DatasetsDBs.

  2. Click Create New. The Create New Database Connection window displays.

  3. Enter or select information, as needed.

  4. Click OK. The database is added as a new row in the table.

  5. Edit the row fields to modify the required settings.

Dataset DB Configuration Properties 

Column

Description

Name

Enter a name for the database configuration. The system lets you know if the name is not valid.

Provider

Identifies the Provider technology used in this connection.

Database

Identifies which type of database this connection targets.

ConnectionString

Enter the information needed to connect with the database. The syntax is defined by the Database Provider, which is a third-party component; any additional parameter supported by the provider can be used normally.

You can also use macros in the connection string.

Example: for the filename in a SQLite connection string, use <ProjectName>, which is replaced by the name of the project.

LogonName

Enter a valid login name for the database.

LogonPassword

Enter the password that corresponds to the database login. (Only accessible by Administrators)

ServerIP

Optionally, enter an IP address or DNS name of a computer to be used as a Secure Gateway.

Description

Enter a description for the database connection.


Please check the Connecting to SQL Server and Connecting to Excel pages for additional information.


There are four database connections already created in any new Project:

Datasets DB - Pre-defined database connections

DB

Database

Path Location

Usage

Retentive

SQLite

<ProjectNameAndPath>.dbRetentive

Stores values for Retentive Tags.

RuntimeUsers

SQLite

 <ProjectNameAndPath>.dbRuntimeUsers

Stores dynamically created Users.

AlarmHistorian

SQLite

 <ProjectNameAndPath>.dbAlarmHistorian

Stores Alarm and AuditTrail records.

TagHistorian

SQLite

<ProjectNameAndPath>.dbTagHistorian

Stores Tag Historian and Annotations.


Any of these connections can be customized to use any type of database.

The best storage location depends on many factors, from internal company procedures to the volume of data and how the data will be used. Therefore, that decision belongs to each Project according to its requirements.

If needed to use another database for the pre-defined connections, execute the following steps:

  1. Rename or delete the previous DB. This step is necessary because the system does not allow two objects with the same name.

  2. Create a new DB with the same name as the previous DB, with the required Database type and connection string.

ConnectionString example for SQL Express 

  • Data Source: The server path and instance that will have the databases.
  • Initial Catalog: The name of the database that will be used.
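Assuming a local SQL Express instance named SQLEXPRESS and a database named ProjectData (both hypothetical names), a resulting connection string might look like this sketch:

```
Data Source=.\SQLEXPRESS;Initial Catalog=ProjectData;Integrated Security=True;
```

Any additional parameters supported by the SQL Server provider can be appended in the same key=value form.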

Project Test Databases and Example

Assume you need to transition the AlarmHistorian configuration to a Microsoft SQL production database within your organization.

During the development and testing phases of the application, you may prefer not to publish alarm events to that database yet.

In other platforms, it would be necessary to manually switch connections between test and production environments or devise workarounds to handle this situation.

In our framework, we provide a built-in, optional feature: configure the AlarmHistorian DB to target the production database, regardless of its current availability or intended use. Subsequently, run the project in Test Mode, storing data in the local SQLite file <projectName>.dbAlarmHistorianTest.


When running the Project in Validation Mode, there is a configuration option, enabled by default, that overrides the connections of the pre-defined DBs, using test databases instead.

These are the database files used when running in Validation Mode:

Database files used when running in Validation Mode

DB

Database

Path Location

Usage

Retentive

SQLite

<ProjectNameAndPath>.dbRetentiveTest

Stores values for Retentive Tags.

RuntimeUsers

SQLite

 <ProjectNameAndPath>.dbRuntimeUsersTest

Stores dynamically created Users.

AlarmHistorian

SQLite

 <ProjectNameAndPath>.dbAlarmHistorianTest

Stores Alarm and AuditTrail records.

TagHistorian

SQLite

<ProjectNameAndPath>.dbTagHistorianTest

Stores Tag Historian and Annotations.



Anchor
Datasets Queries Configuration
Datasets Queries Configuration
Dataset Queries Configuration

You can configure queries to perform more advanced functions with SQL statements to work with data from external databases.

To configure Dataset queries:

  1. Go to DatasetsQueries.
  2. Enter the field values as needed.

Dataset Query Configuration Properties  

Column

Description

Name

Enter a name for the query. The system lets you know if the name is not valid.

DB

Select the database configuration.

SqlStatement

Enter the query using SQL syntax. You can embed Tags directly in the SQL statement by enclosing the tag in curly brackets. E.g. "ColumnName = {tag.tagInt}"

Alternatively, you can update the SqlStatement from the Script Module by accessing the property Dataset.Query.QueryName.SqlStatement. If you access that property from multiple tasks, be aware of potential concurrency issues; for advanced information, refer to the page Advanced Dataset Operation.

Mapping

Click "..." to select the tags that you want to populate with data from specific columns returned by the query.

MappingDateTime

Select the time reference (UTC or Local).

Description

Enter a description for the table configuration.


With the Visual Query Editor, users can drag and drop tables, define relationships, and add filters and conditions using a simple graphical interface. Once the query is created, it can be saved and executed like any other query within the Dataset Module.

Check the Visual SQL Query Builder page for complete information.


Anchor
Datasets Tables Configuration
Datasets Tables Configuration
Dataset Tables Configuration

To configure dataset tables:

  1. Go to DatasetsTables.
  2. Enter the field values as needed.

Dataset Table Configuration Properties  

Field / Column

Description

Name

Enter a name for the table configuration. The system lets you know if the name is not valid.

DB

Select the database connection.

TableName

Select or type the name of the table in the database that you want to access.

WhereCondition

Specify the parameters that will filter the data using SQL syntax. E.g. "ColumnName = {tag.tagInt}"

Access

Select the access permissions for the table.

Mapping

Click "..." to select the tags that you want to populate with data in the first row of the table with data from specific columns.

MappingDateTime

Select the time reference (UTC or Local).

Description

Enter a description for the table configuration.



Anchor
Datasets Files Configuration
Datasets Files Configuration
Dataset Files Configuration

To configure dataset files:

  1. Go to DatasetsFiles.

  2. Enter the field values as needed.

Dataset File Configuration Properties

Column

Description

Name

Enter a name for the file configuration. The system lets you know if the name is not valid.

FileName

Enter the full path to the file. The file path can have Tag values embedded using curly brackets syntax. E.g.: ExampleFile{{tag.Test}}.txt. 

When executing, the area in curly brackets is replaced by the value of the Tag.

FileType

Select the type of file.

Objects

Click "..." to select the tags that you want to populate with data from the file with data from specific columns.

Description

Enter a description for the file configuration.

XMLSchemaType

Defines the schema type of an XML file, which can be: TagList, an XML file that contains a tag list with tag names and tag values; or TagObject, an XML file that contains the entire tag tree and its children.




Working with the Dataset Module

Runtime Execution

One of the key features of the Dataset Module is the ability to execute SQL queries and retrieve data in real-time. Here are some ways to leverage the runtime execution features of the Dataset Module:

  • Create SQL queries to retrieve data from external databases.
  • Use query results to trigger events and actions within the platform environment.
  • Configure event triggers based on specific query criteria, such as changes to a specific data point or a threshold value being exceeded.


The Dataset Module can be easily integrated with other modules within the software environment. Here are some examples of how the Dataset Module can be used in conjunction with other modules:

  • Alarm Manager: Configure alarms based on query results to trigger notifications and actions.
  • Visualization: Display query results on screens and dashboards using DataGrids and other visualization tools.
  • Scripting: Use query results to trigger custom scripts and perform complex data processing and analysis.


By leveraging these integration options, users can gain greater insight and control over their data sources within the platform. With the ability to execute SQL queries and trigger actions based on query results, the Dataset Module provides a powerful set of tools for working with data.

Monitoring Databases Connections

Monitoring Database Connections is an essential aspect of maintaining a reliable and efficient system within the platform. By keeping track of database connections, you can ensure that your data is being accessed and updated correctly. Here are some ways to monitor database connections:


1. IsStarted: This property indicates if the dataset is started, meaning it has been initialized and connected to the database. You can use it to check if the dataset is currently running. 

Code Block
languagec#
titleExample
bool isStarted = @Dataset.DB.YourDatabaseName.IsStarted;

2. Query Execution Time: You can measure the execution time of a query by checking the time before and after executing the query. 

Code Block
languagec#
titleExample
DateTime startTime = DateTime.Now;
@Dataset.Query.YourQueryName.SelectCommand();
DateTime endTime = DateTime.Now;
TimeSpan executionTime = endTime - startTime;


3. ConnectionString: You can check the connection string used for the database to ensure it is configured correctly.

Code Block
languagec#
titleExample
string connectionString = @Dataset.DB.YourDatabaseName.ConnectionString;


Note

Please note that these are just examples, and you will need to replace YourDatabaseName and YourQueryName with the actual names of the database and query you are working with in the platform.

Showing DataGrids Tables on Displays

One of the key features of the Dataset Module is the ability to display query results on screens and dashboards using visualization tools like DataGrids. Here are some steps for using DataGrids to display query results:

  1. Create a query in the Dataset Module to retrieve the desired data.
  2. In the Displays Module, add a DataGrid control to the screen or dashboard.
  3. Configure the DataGrid to display the query results by selecting the data source and column mappings.
  4. Save and preview the screen or dashboard to display the query results on the DataGrid.

Using Query Results on Scripts and Tags

Users can use query results to trigger actions in custom scripts and tags. Here are some steps for using query results in scripts and tags:

  1. Create a query in the Dataset Module to retrieve the desired data.
  2. In the Scripting Module, create a custom script that references the query results.
  3. Use the query results to trigger specific actions within the script, such as sending notifications or updating tags.
  4. Save and execute the script to perform the desired actions.


Check the Using Stored Procedures page for additional information.



Troubleshooting and Best Practices


Table of Contents
maxLevel4
minLevel3

Common Issues and Solutions

Database Connection Problem

In the DB configuration, there is a "Test" button to verify that the connection works correctly. When there is a problem, this button returns an error message, usually produced by the database provider itself. The most common errors are: invalid user, invalid password, computer without access to the database, and incorrect authentication method.

Error accessing the Database Table

Once the connection is established, each Table configuration targets a specific table. The "Table" combobox automatically lists the available tables. It is possible, via script, to change which table will be accessed; however, care must be taken that the table exists and that the configuration uses the correct name. The same care applies when Queries are used, as it is the user's responsibility to type the correct table name and the correct separator syntax.

Error in the syntax of the Query

When typing the SqlStatement of a query manually, instead of using the QueryBuilder, it is the user's responsibility to get it right: table names, columns, and values can all generate errors if used incorrectly. For example, comparing different types may not return the expected result, and strings in general should be in single quotes. The separators and clauses available also vary between databases, for example:

SQLServer
SELECT TOP 10 * FROM table WHERE column = value

SQLite
SELECT * FROM table WHERE column = value LIMIT 10;

Oracle
SELECT * FROM table WHERE column = value AND ROWNUM <= 10;

or new Oracle version
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

IBM DB2
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

ServerIP without TWebServer running on the remote machine

In some cases, the computer running the project may not have access to the database. In this case, it is possible to create a gateway, routing the commands to be executed on a computer that does have access to the database. The ServerIP field should be configured with the IP and port (<IP>:<port>) of that computer, which must have the software installed and the TWebServer running. It will automatically perform the gateway service and forward the commands to the database.

DataTable returned NULL

When a select or query returns null, an error has occurred. Common errors include: connection failure with the database, table not found, Dataset Module not running, and incorrect query syntax. Check the return of the method using WithStatus when using a synchronous method, or use the LastStatus and LastStatusMessage properties when using asynchronous mode.
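A minimal sketch of this diagnosis, assuming the LastStatusMessage property is exposed on the query object (the query name is a placeholder, and the exact call pattern may differ between synchronous and asynchronous modes):

```csharp
// Sketch only: diagnose a null DataTable after a query call.
System.Data.DataTable table = @Dataset.Query.MyQuery.SelectCommand();
if (table == null)
{
    // Read the status of the last execution to find out what went wrong.
    string message = @Dataset.Query.MyQuery.LastStatusMessage;
    // ... log or display the message ...
}
```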

DataTable returned with 0 rows

When this happens, in general there is a working connection with the database and the table name is correct, but the search condition is wrong or the table is really empty. Check that the column names are correct and that the separators and clauses are valid.

Dataset Module is down

Although the TServer is responsible for forwarding requests to the database, the management of and communication with the TServer, as well as the handling of responses, are done by the Dataset Module. Therefore, if you are having basic problems with database access and execution, the first thing to check is whether the module is set up to run and is actually running.

Very high response time

Sometimes it may seem that the database is not being accessed, when what is actually happening is that some requests return a very large amount of data, or the database is overloaded or badly configured and therefore performing poorly. The network itself can also be overloaded and slow; all of these factors impact the response time. In these cases, it is important to execute the Query directly in the database environment to confirm that the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to understand the related side effects.

Update of a table with the wrong schema (select before update)

The Dataset Module uses ADO.NET technology, and many things are resolved at the level of that API. When performing an Update on a table, the schema of the table and controls in the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or a .NET DataTable object as a parameter, that object must respect the schema of the destination table in the database. Normally, a Select command must have been executed at some point to obtain the correct schema used by the database. After this, it is easy to add, remove, and modify values in the DataTable and update it back to the physical table in the database.
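The select-before-update pattern described above might be sketched as follows; the table name and column are hypothetical, and UpdateCommand is assumed here to be the write counterpart of SelectCommand:

```csharp
// Sketch only: obtain the schema with a Select, modify, then write back.
System.Data.DataTable table = @Dataset.Table.MyTable.SelectCommand();
if (table != null && table.Rows.Count > 0)
{
    table.Rows[0]["ColumnName"] = 42;            // modify values in the DataTable
    @Dataset.Table.MyTable.UpdateCommand(table); // assumed call, passing the modified DataTable
}
```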

Where condition CaseSensitive

Case sensitivity in a WHERE clause depends on the database and the configuration of the database you are using. For example, in MySQL, queries are case-insensitive by default, which means 'abc' and 'ABC' would be considered equal. However, this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, queries are case-sensitive by default, so 'abc' and 'ABC' would be considered different. Therefore, it really depends on the specific database and the settings of that database. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert all values to upper or lower case before comparison. For example:

SELECT * FROM table WHERE LOWER(column) = LOWER(value);


This query will return records where the column matches the value, regardless of capitalization.

Best Practices and Recommendations



Dataset Runtime Attributes

The Dataset namespace exposes properties and methods of the .NET objects used by the Dataset Module execution.

For more information on namespaces and objects, go to Objects and Attributes.

Dataset Module Properties examples

Property

Type

Description

Example

CursorIndex

Integer

Represents the current position of a cursor within the result set of a query. It is an integer value that indicates the row number in the result set that the cursor points to, with the first row having an index of 0. The cursor can be used to navigate through the result set, allowing you to read, update, or delete rows of data.



Dataset.Query.QueryName.CursorIndex + 1

OpenStatusMessage

String

Provides a human-readable message describing the current state of the database connection. This can be useful for monitoring the connection status, diagnosing issues, or displaying the connection status to users in a user-friendly format.


Dataset.DB.DatabaseName.OpenStatusMessage

ConnectionString

String

Used to store the necessary information for establishing a connection to a database. It is a string value containing details such as server name, database name, authentication credentials, and other relevant parameters.


Dataset.DB.ProviderName.ConnectionString

This section describes only some commonly used properties; for the full list of properties and methods, go to the Namespaces Reference.



In this section...

Page Tree
root@self
spacesV10