Overview

The Datasets Module is built on the Data Server Service, which receives requests from the various modules, accesses the data sources, and replies to the requests.

To fully leverage the Datasets Module, it is essential to understand several advanced aspects, such as:

  • Managing parallel requisitions.
  • Doing synchronous or asynchronous requests.
  • Mapping data to server-side or client-side tags.
  • Application design, performance, and other considerations.

If you are a new user, or working on a simple project, it is not necessary to review all of these concepts before using the Dataset Module. However, if you are deploying production projects that require exchanging data with SQL databases, we recommend that you review these topics and how they apply to your specific solution architecture. This page presents advanced topics to help you better understand these aspects of the Datasets Module.

On this page:

Table of Contents


Datasets Module Execution

The Dataset module facilitates efficient database interactions by utilizing TServer services. While it doesn't directly interact with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading more on the Datasets Module Execution page.


Data Management

The Dataset Module offers versatile methods for managing data and concurrency within solutions, including Data Table tags and Async Contents. Understand how to avoid conflicts and optimize performance by reading more on the Data Management page.


Datasets Runtime Attributes

The Datasets Namespace exposes properties and methods from the .NET objects used by the Datasets Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms. The Dataset Runtime Attributes page lists all the available options.


Preventing SQL Injections

When calling Stored Procedures in the database, there is a significant concern about SQL injection, especially when passing parameters as plain text in SQL statements. To mitigate this, the .NET API is used, in which parameters are added to a parameter list instead of being concatenated into the SQL text. The Protection Against SQL Injection topic below shows the recommended call syntax.
Use of DataTable Tags
The return of the Select methods is a .NET DataTable object. If you are not using mapping to assign the values from a specific row's columns to Tags, you can manipulate the returned DataTable directly in a script, or you can assign it to a Tag of the DataTable type. This allows the result to propagate to other modules, such as clients or reports, and to be easily presented in a DataGrid, a PieChart, or even in an XY chart using the GetSamples callback.
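For illustration, the sketch below handles such a DataTable in a .NET script. The table is built by hand so the example is self-contained, and the column names are placeholders; in practice, the DataTable would come from one of the Select methods.

Code Block
using System;
using System.Data;

// Sketch only: "result" stands for the DataTable returned by the Select methods;
// it is created manually here so the snippet runs on its own. Column names are placeholders.
DataTable result = new DataTable();
result.Columns.Add("Timestamp", typeof(DateTime));
result.Columns.Add("Value", typeof(double));
result.Rows.Add(DateTime.UtcNow, 21.5);
result.Rows.Add(DateTime.UtcNow.AddMinutes(1), 22.0);

double total = 0;
foreach (DataRow row in result.Rows)
    total += (double)row["Value"];   // each returned row and column can be read directly

Console.WriteLine("Sum of Value column: " + total);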
DataTable Properties Available in the Dataset Module
As mentioned, there are several server domain properties within the Dataset Module. One of them, AsyncContents, is a .NET DataTable. When the Select property is triggered, initiating an asynchronous execution, the values of each column of the first row are assigned, after the return, to the tags configured in the Mapping. When the Next property is triggered, the Dataset Module navigates to the next row and assigns its column values to the tags configured in the Mapping. The table used for this navigation is stored in the AsyncContents property and is updated only when a Select is executed. It is possible to modify the CursorIndex property to jump forward or back to a specific position in the row navigation.
For synchronous commands, the method returns the DataTable, and the result is also stored in the LocalContents property, which is overwritten by each executed command.
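A minimal sketch of this navigation pattern follows. It assumes a Dataset Table named Table1 and the platform's C# script syntax for reaching the runtime properties; the access path and the way the trigger properties are written are assumptions, not documented syntax.

Code Block
// Sketch only: Table1 and the @Dataset access path are assumptions; Select, Next,
// CursorIndex, and AsyncContents are the runtime properties described above.
@Dataset.Table.Table1.Select = 1;        // asynchronous select; row 0 is applied to the Mapping on return
@Dataset.Table.Table1.Next = 1;          // moves to the next row and re-applies the Mapping
@Dataset.Table.Table1.CursorIndex = 10;  // jumps directly to row 10 of the cached result
System.Data.DataTable cached = @Dataset.Table.Table1.AsyncContents;  // table cached by the last Select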
Store&Forward
Only databases used to store Historian and Alarm data can have the Store&Forward functionality enabled, because this feature requires control columns that exist in the Historian and Alarm schemas but not in generic databases. Please refer to the Store&Forward section of the Historian and Alarm modules for more details.
Data Traffic Volume Between TServer, Dataset Module, and Other Modules
When a select is executed in the database, the WhereCondition property of the Dataset>Table defines the return condition, just like the SQLStatement property of the Dataset>Query. Therefore, the volume of data returned can vary from no rows to the entire table (when there is no filter or WhereCondition). There is no software-side protection concerning the size of the returned DataTable; the larger the table, the more memory is allocated and the longer this table takes to travel between TServer, the Dataset Module, and its final destination. Care should be taken with the scalability of these table sizes, as well as with having suitable hardware to support the expected data volume.
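As a simple illustration of limiting the returned volume, a time-bounded filter such as the hypothetical one below can be assigned to the WhereCondition property; the column name and date range are placeholders.

Code Block
// Hypothetical WhereCondition content: restricts the select to a single day of rows
// instead of returning the entire table (column name is a placeholder).
string whereCondition = "Timestamp >= '2024-05-01 00:00:00' AND Timestamp < '2024-05-02 00:00:00'";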
Database-Specific Syntax in the SQLStatement Property
The SQLStatement property of Dataset>Queries accepts the full SQL statement to be executed, so it is possible to use SQL syntax specific to each database. For example, SQLite supports the LIMIT clause, while in SQL Server the equivalent is TOP. Care must also be taken with the separators for table and column names: some databases use [ ] while others use " ". As much as possible, the software normalizes and uses the correct separators when applying the WhereCondition and filters on objects such as the alarm object, but in the SQLStatement it is up to the user to use the correct syntax.
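For example, the same "first 10 rows" query is written differently for SQLite and SQL Server. The table and column names below are placeholders; each string would be assigned to the SQLStatement of a Query targeting that database.

Code Block
// Dialect-specific statements for the same result set (placeholder table/column names).
string sqliteStatement    = "SELECT * FROM \"Events\" ORDER BY \"Timestamp\" DESC LIMIT 10";   // SQLite: LIMIT, "" separators
string sqlServerStatement = "SELECT TOP 10 * FROM [Events] ORDER BY [Timestamp] DESC";         // SQL Server: TOP, [] separators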
Protection Against SQL Injection 
When calling Stored Procedures in the database, SQL injection is a major concern: if parameters are passed as plain text inside the SQL statement, injection becomes possible. To prevent this, the .NET API is used, where parameters are added to a parameter list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:

Code Block
execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE

By using this syntax, you effectively parameterize the Stored Procedure calls and guard against SQL injection attacks. Each parameter value is interpreted strictly as a value and never as part of the SQL command. This practice is fundamental to the security of your application, preventing unauthorized manipulation of the database.


Network Gateway Access And Time Zone Handling

In some scenarios, specific databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine that has the necessary permissions, provided the platform is also installed on that machine. In this scenario, only TWebServer needs to be running there; it handles the database interaction, receiving the commands and establishing the connections. This method offers a workaround for situations where database access is restricted to specific machines.

Handling Time Zones

The platform treats all Date/Time tags as UTC. Consequently, the dates written to and read from the standard database tables, such as historical data and alarms, are also in UTC. If you need to read or write dates in the local timezone, you must account for the time difference when preparing the SQL statement or when using the WhereCondition property. External data tables, from third parties or from the client, can use any timezone. To assist in handling this data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading and writing to tags.
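For example, the sketch below converts a local-time day range to UTC before it is used in a SQL statement or WhereCondition, so the values match the UTC timestamps stored by the platform. The column name is a placeholder.

Code Block
// Convert local day boundaries to UTC before filtering UTC-stored data.
DateTime localStart = new DateTime(2024, 5, 1, 0, 0, 0, DateTimeKind.Local);
DateTime utcStart = localStart.ToUniversalTime();
DateTime utcEnd = localStart.AddDays(1).ToUniversalTime();
string whereCondition = string.Format(
    "Timestamp >= '{0:yyyy-MM-dd HH:mm:ss}' AND Timestamp < '{1:yyyy-MM-dd HH:mm:ss}'",
    utcStart, utcEnd);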


Dataset Module Performance

The Dataset Module's performance depends on many factors, including database performance, network latency, and the complexity of the SQL queries being executed. The platform minimizes overhead and executes queries as efficiently as possible, but ultimately the performance of the Dataset Module is tied to these external factors. It is essential to design your database schema and queries with performance in mind, and to invest in suitable hardware and network infrastructure to ensure optimal performance.


Error Handling

Error handling in the Dataset Module is straightforward. If an error occurs during the execution of a command, the module's Error property (LastStatus) is updated with the error message. You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the method returns an empty DataTable and updates the Error property. Alternatively, you can call methods such as SelectCommandWithStatus, where the status is returned as an output parameter.
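A minimal sketch of this pattern is shown below. It assumes a query named Query1, the platform's C# script syntax for reaching the module, and that the status is returned through an output parameter; the exact access path and method signature may differ in your project.

Code Block
// Sketch only: Query1, the @Dataset access path, and the out-parameter signature are assumptions.
string status = string.Empty;
System.Data.DataTable result = @Dataset.Query.Query1.SelectCommandWithStatus(out status);
if (!string.IsNullOrEmpty(status))
{
    // An error occurred: the same message is also exposed through the LastStatus property.
    // Handle it here (log it, retry, or notify the operator).
}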

These are some of the key points to keep in mind while using the Dataset Module. It is a powerful tool that provides a lot of flexibility, but it also requires careful use and an understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions, and do not hesitate to seek expert advice if you are unsure about anything.

Execution Process

The Dataset Module runs in an independent process, "TRunModule.exe (Dataset)", which reads all the project configurations related to this module (the executions configured in Queries, Tables, and Files) but does not directly interface with the databases. The database access services are provided by TServer: the connections and calls to the database occur inside TServer, and the Dataset Module simply consumes this access service.

Connections with the Database

For each configured database (DB), only a single access connection is created, meaning there is only one execution thread per DB. If parallel execution of commands against the same database is needed, you can create more than one DB configuration pointing to the same database.

Synchronous and Asynchronous Methods in Tables and Queries


Properties in Tables and Queries, such as the "Select" property, can trigger execution; triggering a property makes the execution asynchronous. After the trigger, execution continues normally. In parallel, the property propagates from its trigger source (display or script) to the server and synchronizes with the Dataset Module, which then makes the necessary calls to TServer's database access service. When TServer returns the Select result to the Dataset Module, the result is updated in the module's properties and attributes, processed for tag mapping, and finally made available to the entire project.

In contrast, the available methods execute synchronously. For instance, calling the SelectCommand method blocks execution at that call while the Dataset Module invokes the TServer service, which interfaces with the database and returns the result to the Dataset Module. The Dataset Module then processes the tag mapping and provides the result as the method's return value, allowing execution to continue.

Special caution is required with synchronous calls inside display code-behind, which is single-threaded for both WPF and HTML5 displays. Mechanisms such as Script Tasks or the Action Server Request can provide asynchronous execution without relying on the attributes, when the result still needs to be handled after an asynchronous execution.
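The sketch below contrasts the two approaches. It assumes a query named Query1 and the platform's C# script syntax for accessing the module; the access path and the way the trigger property is written are assumptions, not documented syntax.

Code Block
// Asynchronous: writing to the Select trigger property returns immediately; the result
// later appears in AsyncContents and in the mapped tags (sketch only).
@Dataset.Query.Query1.Select = 1;

// Synchronous: the script blocks on this call until TServer returns the result.
System.Data.DataTable table = @Dataset.Query.Query1.SelectCommand();

Because the synchronous form blocks, prefer running it from a Script Task rather than from display code-behind, as discussed above.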

Usage of Client-Type Tags in the Dataset Module's Tag Mapping


Client-type tags can be used in the Dataset Module's tag mapping. When a synchronous or asynchronous command is executed, the assignment to the tags is performed by the Dataset Module (not by TServer) in the original domain of the call. Hence, if the call was executed on a client (display or client script), the mapping between the received values and the tags is executed after the result returns from TServer to the Dataset Module, and client domain tags can be used in that mapping. This allows specific control over the scope of the value: either for the entire project (using server domain tags) or only for the client's scope (using client domain tags).

Attributes and Server Domain Properties (Concurrency Management)

It is important to remember that the module's attributes are all in the server domain. Therefore, if multiple Tasks or clients use the same attributes, there can be contention and unexpected results. For instance, a client sets the SQLStatement of a query while, at the same time, a Task sets the same SQLStatement to a different value. When the command executes (for example, a Select, triggered through the Select property, which is also in the server domain), whichever value was written last to the SQLStatement is the one used in the operation.


Use of Tags of type DataTable


The return of Select methods is a .Net DataTable object. This can be manipulated directly in the script or assigned to a Tag of the DataTable type, allowing the result to propagate to other modules like clients or reports.

Dataset Module Properties

There are several server domain properties within the Dataset Module, including a .NET DataTable property called AsyncContents. When the Select property is triggered, initiating an asynchronous execution, the column values of the first row are assigned to the tags configured in the Mapping.


Backup of SQLite Databases

There are several ways to back up SQLite databases in a solution. One simple method is to make a copy of the database file itself, but this method may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your solution:

  • SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:

    Code Block
    sqlite3 source_database.sqlite ".backup backup_file.sqlite"

    This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

  • SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API offers several benefits over the backup command, such as the ability to perform incremental backups and monitor the progress of the backup process. Read more in the SQLite Backup API documentation; a minimal usage sketch follows this list.
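The sketch below illustrates the Online Backup API from a .NET application. It assumes the Microsoft.Data.Sqlite package and placeholder file names; it is a stand-alone example, not part of the platform's API.

Code Block
using Microsoft.Data.Sqlite;

class SqliteOnlineBackup
{
    static void Main()
    {
        // Sketch only: file names are placeholders; assumes the Microsoft.Data.Sqlite package
        // (other SQLite ADO.NET providers expose the Online Backup API in a similar way).
        using (var source = new SqliteConnection("Data Source=source_database.sqlite"))
        using (var destination = new SqliteConnection("Data Source=backup_file.sqlite"))
        {
            source.Open();
            destination.Open();

            // Copies the database page by page while other connections may still be using it.
            source.BackupDatabase(destination);
        }
    }
}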

Regardless of your chosen method, it's crucial to store your backup files in a secure location and test them regularly to ensure that you can use them to restore your database if needed.


In this section:


Page Tree