Overview

The Dataset Module is built on the Datasets Data Server Service, which receives requests from the various modules, accesses the data sources, and replies to those requests.

To fully leverage the Datasets Module, it is essential to understand several advanced aspects, such as:

  • Managing parallel requisitions.
  • Executing synchronous or asynchronous requests.
  • Mapping data to server-side or client-side tags.
  • Application design, performance, and other items.

If you are a new user, or working on a simple project, it is not necessary to review all of these concepts before using the Dataset Module. However, if you are deploying production projects that require exchanging data with SQL databases, we recommend reviewing these topics and how they apply to your specific solution architecture. This page covers advanced topics to help you better understand these aspects of the Datasets Module.

On this page:

Table of Contents


Datasets Module Execution

Process

The Dataset module, specifically the "TRunModule.exe (Dataset)" process, reads all project configurations relevant to this module but does not interface directly with the databases. Database access services are provided by TServer, with the Dataset module consuming these services.

Connections with the Database

For each configured database (DB), there is only one access connection, which means a single execution thread per DB. However, for parallel execution of commands against the same database, additional DBs can be configured, all pointing to the same physical database.

Synchronous and Asynchronous Methods in Tables and Queries

The properties in Tables and Queries can trigger execution; for example, triggering the "Select" property makes the execution asynchronous. This means that after the property is triggered, execution continues normally. In parallel, the trigger propagates from its source (screen or script) to the server and synchronizes with the Dataset module. The module then makes the necessary calls to TServer's database access service. Once TServer returns the result of the Select execution to the Dataset module, the result is updated in the module's properties and attributes, processed for tag mapping, and finally made available to the entire project.

In contrast, the available methods execute synchronously. For instance, calling the SelectCommand method blocks execution at that method while the Dataset module calls the TServer service, TServer interfaces with the database, and the result is returned to the Dataset module. The Dataset module then processes the tag mapping and provides the result as the method's return value, allowing execution to continue.
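As a rough illustration, the difference looks like this in a script. This is a minimal sketch only: the query name Query1, the @Dataset access path, and the way the Select trigger is written are assumptions, and the exact syntax may differ in your project.

Code Block
// Asynchronous: triggering the Select property only requests the execution.
// The script continues immediately; the result arrives later through the
// module's properties and the configured tag Mapping.
@Dataset.Query.Query1.Select = 1;   // assumed trigger syntax

// Synchronous: SelectCommand blocks here until TServer has executed the
// statement and the Dataset module has finished the tag mapping.
System.Data.DataTable result = @Dataset.Query.Query1.SelectCommand();
int rowCount = (result != null) ? result.Rows.Count : 0;   // result is available right away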

Special caution is required with synchronous calls within screen code-behind, which runs on a single thread (WPF or HTML5 screens). Mechanisms such as a Script Task or an Action Server Request can help create an asynchronous execution without needing to use attributes, when there is a need to handle the result even after an asynchronous execution.

Usage of Client-Type Tags in the Dataset Module's Tag Mapping

Client-type tags can be used in the Dataset module's tag mapping. During the execution of a synchronous or asynchronous command, the assignment to the tags is executed by the Dataset module (not by TServer) in the original domain of the call. Hence, if the call was executed on a client (screen or client script), the mapping between the received values and the tags is executed after the result returns from TServer to the Dataset module. Client-domain tags can be used in the mapping, which allows specific control over the scope in which the value is available: either the entire project (using server-domain tags) or only that client (using client-domain tags).

Attributes and Server Domain Properties (Concurrency Management)

All module attributes are server domain. Therefore, if multiple Tasks or clients use the same attributes, there can be concurrency conflicts during execution. For example, if a client and a Task both set the SQLStatement of the same query simultaneously with different values, the last value written to SQLStatement is the one used in the operation.
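For instance, the race looks like this in a hedged sketch (the two assignments would normally live in different scripts; the access path and query name are assumed):

Code Block
// A client script prepares the statement...
@Dataset.Query.Query1.SQLStatement = "SELECT * FROM Orders";

// ...but a server Task overwrites the same server-domain attribute
// before the client triggers the execution.
@Dataset.Query.Query1.SQLStatement = "SELECT * FROM Customers";

// Whichever assignment happened last is the statement that actually runs.
System.Data.DataTable table = @Dataset.Query.Query1.SelectCommand();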

Use of Tag DataTable

The return of the Select methods is a .NET DataTable object. It can be manipulated directly in a script or assigned to a tag of the DataTable type, allowing the result to propagate to other modules, such as clients or reports.
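A minimal script sketch follows. The query access path and the OrdersTable tag are assumptions for illustration; the DataTable handling itself is standard .NET.

Code Block
using System.Data;

// Synchronous select returning a .NET DataTable (access path assumed).
DataTable orders = @Dataset.Query.Query1.SelectCommand();

// The result can be inspected and manipulated directly in the script...
foreach (DataRow row in orders.Rows)
{
    if (System.Convert.ToDouble(row["Total"]) > 1000.0)
        row["Priority"] = "High";
}

// ...or assigned to a DataTable-type tag so other modules (displays, reports)
// can consume the same result (tag name is an example).
@Tag.OrdersTable = orders;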

Dataset Module Properties

There are several server-domain properties within the Dataset module, including a .NET DataTable property called AsyncContents. When the Select property is triggered, initiating an asynchronous execution, the column values of the first returned row are assigned to the tags configured in the Mapping, and the full result remains available in the AsyncContents property.

The Dataset Module facilitates efficient database interactions by utilizing TServer services. While it does not interact directly with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading more on the Datasets Module Execution page.


Data Management

The Dataset Module offers versatile methods for managing data and concurrency within solutions, including DataTable tags and the AsyncContents property. Understand how to avoid conflicts and optimize performance by reading more on the Data Management page.


Datasets Runtime Attributes

The Datasets Namespace exposes properties and methods from the .NET objects used by the Dataset Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms. The Datasets Runtime Attributes page lists all available options.


Preventing SQL Injections

When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially when passing parameters as plain text in SQL statements. To mitigate this, we utilize the .NET API, where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:

Code Block
execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE

This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
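To illustrate what "parameters are added to a list" means in plain .NET terms, here is a hedged ADO.NET sketch. This is not the platform's internal code; the connection string, procedure name, and parameter are examples only.

Code Block
using System.Data;
using System.Data.SqlClient;

static class StoredProcedureExample
{
    // Calls a stored procedure with a typed parameter instead of concatenating
    // the value into the SQL text, so the value can never alter the command.
    public static DataTable GetOrderById(string connectionString, int orderId)
    {
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("GetOrderById", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;

            connection.Open();
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(table);
            }
        }
        return table;
    }
}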


Network Gateway Access And Time Zone Handling

In some scenarios, specific databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided you install our platform on that machine. Only TWebServer would be active in such cases, handling database interaction, receiving commands, and establishing connections. This method offers a workaround for database access restricted to specific machines.

Handling Time Zones

The platform treats all Date and Time type tags as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If it is necessary to read and write dates in the local timezone, account for the time difference when preparing the SQL statement or using the WhereCondition property. Other external data tables, from third parties or the client, can use any timezone. To assist in handling this data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading and writing to tags.
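As a small .NET sketch of the conversions involved (standard framework calls; the helper names are illustrative, not platform APIs):

Code Block
using System;

static class TimeZoneHelpers
{
    // Convert a local-time value to UTC before it is used in a SQL statement
    // or in the WhereCondition property.
    public static DateTime ToDatabaseTime(DateTime localValue)
    {
        return localValue.ToUniversalTime();
    }

    // Convert a UTC value read from the database back to local time for display.
    public static DateTime ToDisplayTime(DateTime utcValue)
    {
        return TimeZoneInfo.ConvertTimeFromUtc(
            DateTime.SpecifyKind(utcValue, DateTimeKind.Utc),
            TimeZoneInfo.Local);
    }
}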


Dataset Module Performance

The Dataset Module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.


Error Handling

Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the error message will update the module's Error property (Last Status). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the process will return an empty Data Table and update the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status will be an output parameter in the method.
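A hedged sketch of that pattern follows. The access path and the error/status property name are assumptions based on the description above; check the Datasets Runtime Attributes page for the exact members.

Code Block
// Synchronous call; on failure an empty DataTable is returned (path assumed).
System.Data.DataTable result = @Dataset.Query.Query1.SelectCommand();

if (result == null || result.Rows.Count == 0)
{
    // The module also updates its error (last status) property; the member
    // name used here is an assumption for illustration.
    string lastError = @Dataset.Query.Query1.LastStatus;
    // ... handle the error: log it, notify the operator, or retry.
}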

These are some key points you should keep in mind while using the Dataset module. It's a powerful tool that provides a lot of flexibility. However, it also requires careful use and understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions. Feel free to seek expert advice if you need clarification on anything.


Backup of SQLite Databases

There are several ways to back up SQLite databases in a solution. One simple method is to make a copy of the database file itself, but this method may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your solution:

  • SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:

    Code Block
    sqlite3 source_database.sqlite ".backup backup_file.sqlite"

    This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

  • SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API offers several benefits over the backup command, such as the ability to perform incremental backups and monitor the progress of the backup process. Read more in the SQLite Backup API documentation.
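As a minimal .NET sketch of driving the Online Backup API, the Microsoft.Data.Sqlite package exposes it through SqliteConnection.BackupDatabase; the file paths below are examples, and this code is independent of the platform itself.

Code Block
using Microsoft.Data.Sqlite;

static class SqliteBackupExample
{
    // Copies a live SQLite database to a backup file while other connections
    // may still be using the source.
    public static void Backup(string sourcePath, string backupPath)
    {
        using (var source = new SqliteConnection($"Data Source={sourcePath}"))
        using (var destination = new SqliteConnection($"Data Source={backupPath}"))
        {
            source.Open();
            destination.Open();
            // BackupDatabase wraps sqlite3_backup_init/step/finish internally.
            source.BackupDatabase(destination);
        }
    }
}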

Regardless of your chosen method, it's crucial to store your backup files in a secure location and test them regularly to ensure that you can use them to restore your database if needed.


In this section:


Page Tree