
HTML
<style>
.text-span-6 {
    background-image: linear-gradient(99deg, rgba(170, 163, 239, .5), rgba(125, 203, 207, .5));
    border-radius: 50px;
    padding-left: 15px;
    padding-right: 15px;
}

#title-text {
display: none;
}

.panelgradient {
    background-image: linear-gradient(180deg, #d5def0, whitesmoke);
    border-radius: 8px;
    flex-direction: column;
    justify-content: center;
    align-items: center;
    padding: 4rem;
    display: flex;
    position: relative;
}

</style>


<div class="panelgradient">

<h1 style="text-align: center;">Datasets <br> (Databases and SQL Queries)</h1>

</div>



Introduction to the Datasets Module

The Datasets Module is designed for data exchange with SQL databases and text files from a diverse set of sources. Essentially, the Dataset module drives bi-directional real-time communication between all Modules and the SQL databases. 

This module offers compatibility with various database technologies, including ADO.NET, ODBC, OleDB, and native interfaces, providing straightforward configuration with prominent databases such as SQL Server, Oracle, SQLite, and PostgreSQL. Features include:

  • Multi-threaded concurrent connections with multiple databases for efficient data handling.
  • SQL Query editor, SQLite admin tool, and a Visual QueryBuilder, streamlining the configuration experience.
  • Customization of SQL statements in real time with Tags and system events.

On this page:

Table of Contents
maxLevel3
minLevel2
stylenone




Key Concepts and Terms


Panel
bgColor#ffffff

DatasetDBs

For the Dataset Module to communicate with an external database, a connection must be created with specific parameters. These connections, created within the Datasets → DBs section of the module, are referred to as DatasetDBs.


Panel
bgColor#ffffff

DatasetQueries

In the context of the Dataset Module, a DatasetQuery refers not only to an SQL query string, but to a solution object that has a logical name, the SQL query associated with that name, and other parameters defined within the Datasets → Queries section. The DatasetQuery allows you to customize SQL statements with Tags evaluated in real time, as well as to use the query results in the real-time application.


Panel
bgColor#ffffff

DatasetTables

A DatasetTable is a logical name created within a solution to set up access to a specific table in a connected database. These tables are listed within the Datasets → Tables section of the module. The tags in the real-time database can be easily mapped to columns in the tables to perform insert, update, or read operations.


Panel
bgColor#ffffff

DatasetFiles

A Dataset File is a logical name that defines parameters for reading and writing files in ASCII, Unicode, or XML formats.





Understanding the Datasets Module

The Datasets Module enables users to interact with SQL databases seamlessly, allowing them to retrieve and manipulate data within their applications to:

  • Generate custom visualizations,
  • Implement custom logic,
  • Set up dynamic data processing, alarming, or storage operations.

    The module supports real-time Tags within SQL statements, and manages files and recipes in ASCII, Unicode, or XML formats.

    Data Utilization

    The data retrieved from databases can be utilized in various ways throughout your solution, for example:

    • In the Displays Module: Visualization tools like DataGrids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
    • In the Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
    • In the Alarm Module: Data can be used to configure or update alarm conditions, triggers, and actions, enabling the creation of dynamic alarm systems that respond to data changes.

    Various modules can also add data to your SQL databases. For instance, the Displays Module can log operation inputs and commands, the Scripting Module can calculate analytics, and the Alarm and Historian Modules can retain long-term records and audit trails. 

    Processing Data Requests

    Datasets Data Server Service

    The Datasets Data Server Service ensures high performance and seamless integration while exchanging structured collections of data in real-time. It efficiently processes data requests from other modules, enabling connectivity with various components such as HMI/SCADA screens, scripting, and reporting tools.

    Default Databases Used When Running the Solution

    The Dataset module also serves as a data storage configuration hub for other modules. The Alarm and Historian modules, which generate events and records for long-term retention, utilize the Dataset module for defining data storage. Similarly, the Security Module can utilize user definitions from an external database defined within the Dataset module.

    Data Source Virtualization

    Data Source Virtualization is an advanced feature that simplifies data management across multiple data sources. It provides a unified interface for accessing, querying, and manipulating data, regardless of the underlying data storage technology.

    • Devices: Sending data from field equipment to a SQL database, or applying settings from the database to the field equipment.

    Pre-defined Databases

    The following databases are pre-defined by the Dataset Module:

    • AlarmHistorian: events and records for long-term retention
    • TagHistorian: time-series storage for process variables 
    • RuntimeUsers: dynamic users and credentials created when running the solution
    • Retentive: persistent records for tags and properties that need to be kept across multiple starts of the solution (typically configuration settings and setpoints)

    Processing Data Requests

    The Datasets Module implementation runs as a service, ensuring high performance and real-time replies to requests from multiple clients.

    That architecture also provides enhanced protection and security for the databases: client displays and scripts do not access the databases directly, but through the Datasets Service.

    Another benefit is Data Source Virtualization: when the solution uses Dataset.Query.Query1 in its displays or scripts, the database running that query, or even the query itself, can be maintained or replaced without affecting the overall solution configuration. This allows solutions to work with the data regardless of the underlying data storage technology.

    This feature allows modifications to the underlying data source repositories without impacting the rest of the application. Keywords: agnostic, flexible, easy maintenance, centralized management.

    → For a deeper understanding of the Datasets Services, see Dataset Advanced Topics.




    Configuring the Datasets Module

    Configuration Workflow

    The typical configuration workflow for the Dataset module has the following sequence:

    Datasets Module Configuration Workflow

    Action: Define database connections
    Where: Datasets → DBs
    Comments: Gather connection details for your application's databases and create DB objects as needed. Leverage the built-in SQLite admin tool for temporary development purposes.

    Action: Prepare queries
    Where: Datasets → Queries, DataExplorer → SQL, VisualQueryBuilder
    Comments: Craft queries using the built-in SQL Language Editor, the VisualQueryBuilder, or SQL statements provided from other sources. Fine-tune queries by adding real-time parameters, e.g., transform "WHERE col1 = 5" into "WHERE col1 = {{tag.Test}}".

    Action: Map database tables
    Where: Datasets → Tables
    Comments: Optionally, establish a direct mapping to tables within the database.

    Action: Map recipes and text files
    Where: Datasets → Files
    Comments: Optionally, your solution may need to save or load recipes, or other information, from ASCII, Unicode, or XML files.

    Managing DB Connections

    There are four database connections pre-defined in any new solution.

    Datasets DB - Pre-defined database connections

    DB: Retentive
    Database: SQLite
    Path Location: <ProjectNameAndPath>.dbRetentive
    Usage: Stores values for the Tags with the Retentive property set.

    DB: RuntimeUsers
    Database: SQLite
    Path Location: <ProjectNameAndPath>.dbRuntimeUsers
    Usage: Stores dynamically created Solution SecurityUsers.

    DB: AlarmHistorian
    Database: SQLite
    Path Location: <ProjectNameAndPath>.dbAlarmHistorian
    Usage: Stores Alarm and AuditTrail records.

    DB: TagHistorian
    Database: SQLite
    Path Location: <ProjectNameAndPath>.dbTagHistorian
    Usage: Stores Tag Historian and Annotations.

    When using SQLite databases, the Dataset Module can automatically create the database locally if it doesn't already exist. For other database types, the database itself must already exist before you set your connection.

    → Read more about Dataset DBs.

    DatasetQueries Configuration

    Use the DatasetQueries to define SQL statements, for queries and stored procedures, to be executed against the connections of the created DatasetDB databases.

    → Read more about DatasetQueries.

    DatasetTables Configuration

    Use the DatasetTables to access or exchange data with database tables using a simplified query syntax. It also allows inserting new rows directly into database tables.

    → Read more about DatasetTables.

    DatasetFiles Configuration

    The DatasetFiles are used to customize file interactions in the Dataset Module. With this feature, you can read or write real-time tags to ASCII, Unicode, and XML files.

    → Read more about DatasetFiles.



    Working with the Datasets module

    Runtime Execution

    When executing the solution, there is an infrastructure of services that manages access to the database and transports the information to where it was requested. For instance, to show a DatasetQuery result on an HTML5 page, the request first goes to the server, which queries the database (which can be on another computer), and the information flows back to the requester.

    As database operations can take some time to execute, it is very important to understand some aspects of the Dataset Module execution, including the concept of synchronous vs. asynchronous requests.

    The page Datasets Module Execution details these important concepts and describes the module's internal operations.

    Showing DataGrid Tables on Displays

    One typical use of the Dataset Module is to display query results on displays and dashboards.

    To do so, create a DatasetQuery or a DatasetTable, then use the DataGrid control on your displays.

    Using Query Results on Scripts and Tags

    It is possible to define the SqlStatements with code (using either the Scripts Module or Display CodeBehind), as well as to connect the results with tags.

    The property Dataset.Query.Query1.SqlStatement holds the query that will be executed; just modify that property within your scripts.

    The Tag type DATATABLE was created to be compatible with the results of Select() statements; just assign the query results to the tag and use Tags to manage the information.

    The TK (Toolkit extension for Scripts) has methods that allow easy copying between DataTables (query results) and Template Tags, such as TK.CopyTagToDataTable().
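
    The sketch below illustrates the pattern described in the paragraphs above. It is a minimal sketch, not a definitive implementation: it assumes a DatasetQuery named Query1 and a DATATABLE tag named Results already exist in the solution; the table, column, and tag names in the statement are illustrative only, and the member names (SqlStatement, SelectCommand, LastStatusMessage) follow the references on this page, so the exact syntax for referencing solution objects in scripts may differ in your version.

    Code Block
    languagec#
    titleScript sketch (assumed names)
    // Minimal sketch for a script task or display CodeBehind (C#).
    // Assumptions: a DatasetQuery named "Query1" and a DATATABLE tag named "Results" exist in the solution;
    // "Batches", "LineId", "EndTime" and "Tag.SelectedLine" are illustrative names only.

    // Customize the SQL statement at run time; the {{...}} token is evaluated by the Dataset Module.
    Dataset.Query.Query1.SqlStatement =
        "SELECT * FROM Batches WHERE LineId = {{Tag.SelectedLine}} ORDER BY EndTime DESC";

    // Execute the query; the result is a .NET System.Data.DataTable (null when an error occurs).
    System.Data.DataTable result = Dataset.Query.Query1.SelectCommand();
    if (result != null)
    {
        // Assign the result to the DATATABLE tag so a DataGrid on a display can bind to it,
        // or copy values to template tags with the TK toolkit (e.g., TK.CopyTagToDataTable()).
        Tag.Results = result;
    }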

    Monitoring Database Connection Status

    Monitoring Database Connections is an essential aspect of maintaining a reliable operation of the solution. 

    That can be accomplished using the Dataset namespace properties, which expose the status of DatasetTables and DatasetQueries operations.

    For more information, go to Datasets Runtime Attributes.

    During the development phase, when the Designer tool is connected to a Runtime (a solution in execution), there is a monitoring page with the main status conditions.

    → See the page Datasets Monitor.




    Datasets Advanced Topics

    Datasets Module Execution

    The Dataset Module facilitates efficient database interactions by utilizing TServer services, managing synchronous and asynchronous executions for optimal performance.

    → Read more about Databases Connection And Interactions.

    Data Management

    The Dataset Module offers versatile methods for managing data and concurrency within solutions, including Data Table tags and Async Contents.

    → Read more about Data Management.

    Datasets Runtime Attributes

    The Datasets Namespace exposes properties and methods from the .NET objects used by the Dataset Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms.

    → Read more about Dataset Runtime Attributes.

    Preventing SQL Injections

    → See page Datasets Advanced Topics

    Network Gateway Access And Time Zone Handling

    → See page Datasets Advanced Topics

    Backup of the Solution's SQLite Databases

    → See page Datasets Advanced Topics




    Best Practices and Troubleshooting

    Common Issues and Solutions

    Connection loss between project and database

    Database Timeout Configuration: The database may have a timeout setting that automatically disconnects idle connections after a certain period of time. It's recommended to check the database's timeout setting and adjust it, if necessary, to ensure that the connection remains active overnight.

    Power Settings: It is also suggested to check the computer's power settings to ensure that it is not entering sleep or hibernation mode during idle periods, which could cause loss of connection to the database. Adjusting these settings to keep the computer active during idle periods may resolve the issue.

    Database Connection Problem

    In the DB configuration, there is always a "Test" button to ensure that the connection is happening correctly. When there is a problem, the return of this button is an error message, usually returned by the database provider itself. The most common errors are: invalid user, invalid password, computer without access to the database, incorrect form of authentication.

    Issue: Error accessing the Database Table

    Once the connection is established, the Table configuration is specific to a table. In the "Table" combobox, the list of available tables automatically appears. It is possible, via script, to change which table will be accessed. However, care must be taken that the table exists and that the configuration is done using the correct name. The same care must be taken when Queries are used, as it is the user's responsibility to type the correct table name, as well as the syntax of the separators.

    Error in the Syntax of the Query

    It is the user's responsibility to type the correct SQLStatement of a query using the QueryBuilder. Table names, columns, and values can all generate errors if used incorrectly. For example, comparing different types may not return the expected result, and strings should generally be enclosed in single quotes. The separators and clauses available can vary between databases, for example:

    SQLServer

    Code Block
    languagesql
    titleQuery syntax
    SELECT TOP 10 * FROM table WHERE column = value

    SQLite

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value LIMIT 10;

    Oracle

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value AND ROWNUM <= 10;

    or, for newer Oracle versions:

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

    IBM DB2

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;


    ServerIP without TWebServer Running on the Remote Machine

    In some cases, the computer running the solution may not have access to the database. In this case, it is possible to create a gateway, routing the commands to be executed on a computer that does have access to the database. The ServerIP field should be configured with the IP and port (<IP>:<port>) of that computer. That computer must have the software installed, with the TWebServer running; it will automatically perform this gateway service and forward the commands to the database.

    DataTable Returned NULL

    When a query returns null, an error has occurred. Common errors include: connection failure with the database, table not found, the Dataset Module not running, or incorrect query syntax. Check the return of the method variants ending in WithStatus when using a synchronous method, or check the LastStatus and LastStatusMessage properties when using asynchronous mode.
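
    A minimal sketch of that check follows, assuming a DatasetQuery named Query1; the member names (SelectCommandWithStatus, LastStatus, LastStatusMessage) follow the references on this page, and the status is assumed here to be a string, so verify the exact signatures in your version.

    Code Block
    languagec#
    titleStatus check sketch (assumed names)
    // Synchronous call with status: the execution status is returned as an output parameter.
    string status;
    System.Data.DataTable table = Dataset.Query.Query1.SelectCommandWithStatus(out status);

    if (table == null)
    {
        // Null means the command itself failed: connection problem, table not found,
        // incorrect query syntax, or the Dataset Module is not running. Inspect "status" for details.
    }
    else if (table.Rows.Count == 0)
    {
        // Connection and table are fine; the WHERE condition simply matched no rows (or the table is empty).
    }

    // In asynchronous mode, check Dataset.Query.Query1.LastStatus and LastStatusMessage instead.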

    DataTable Returned with 0 Rows

    When this happens, in general, there is a connection with the database and the table name is correct. In this case, the search condition is usually wrong, or the table is really empty. Check if the column names are correct and if the separators and clauses are valid.

    Dataset Module is Down

    Although the TServer is responsible for forwarding requests to the database, the management and communication with the TServer is done by the Dataset module, as well as the treatment of responses. Therefore, if you are having basic problems with access and execution of database access, the first thing to check is whether the module is set up to run and is actually running.

    Very High Response Time

    Sometimes it may seem that access to the database is not being made, but what might actually be happening is that some accesses return a very large amount of data, or the database is overloaded or badly configured, causing low performance. The network itself can also be overloaded and slow, and all these factors can impact the response time. In these cases, it is important to execute the query directly in the database environment to be sure that the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to have an idea of the related side effects.


    Update of a Table with the Wrong Schema (Select Before Update)

    The Dataset Module uses ADO.NET technology, and many things are resolved at the level of this API. When performing an Update on a table, the schema of the table and the controls in the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or a .NET DataTable object as a parameter, it is important that this object respects the schema of the destination table in the database. Normally, a Select command must have been executed at some point to obtain the correct schema used by the database. After this, it is easy to add, remove, and modify values in this DataTable and update it back to the physical table in the database.
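
    A minimal sketch of the select-before-update pattern described above. It assumes a DatasetTable named Table1 mapped in Datasets → Tables; the object path Dataset.Table.Table1, the UpdateCommand call, and the "Setpoint" column are assumptions for illustration, not confirmed API, so check the actual member names in your version.

    Code Block
    languagec#
    titleSelect-before-update sketch (assumed names)
    // 1) Select first, so the returned DataTable carries the schema of the physical table.
    System.Data.DataTable rows = Dataset.Table.Table1.SelectCommand();
    if (rows != null && rows.Rows.Count > 0)
    {
        // 2) Modify the DataTable in memory; column names and types stay consistent with the database schema.
        rows.Rows[0]["Setpoint"] = 75.5;   // "Setpoint" is an illustrative column name

        // 3) Write the modified DataTable back; because the schema matches, ADO.NET can build the UPDATE.
        Dataset.Table.Table1.UpdateCommand(rows);   // assumed update method on the DatasetTable object
    }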

    WHERE Condition Case Sensitivity

    Case sensitivity in a WHERE clause depends on the database and the configuration of the database you are using. For example, in MySQL, queries are case-insensitive by default, which means 'abc' and 'ABC' would be considered equal. However, this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, queries are case-sensitive by default, so 'abc' and 'ABC' would be considered different. Therefore, it really depends on the specific database and the settings of that database. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert all values to upper or lower case before comparison. For example:

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE LOWER(column) = LOWER(value);

    This query will return records where the column matches the value, regardless of capitalization.

    Performance

    The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.

    Best Practices and Recommendations

    Error Handling

    Error handling in the Dataset Module is straightforward. If an error occurs during the execution of a command, the error message updates the module's Error property (LastStatus). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the method returns an empty DataTable and updates the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status is an output parameter of the method.

    These are some of the key points you should keep in mind while using the Dataset module. It's a powerful tool that provides much flexibility. However, it also requires careful use and understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions. Feel free to seek expert advice if you need clarification on anything.





    In this section:

    Page Tree
    root@self
    spacesV10

    ...