Overview

The Dataset Module comprises the Data Server Service, which receives requests from the various modules, accesses the data sources, and replies to the requests.

To fully leverage the Dataset Module, it is essential to understand various advanced aspects, such as:

  • Managing parallel requisitions.
  • Doing synchronous or asynchronous requests.
  • Mapping data to server-side or client-side tags.
  • Application performance, and other items.

If you are a new user or working on simple projects, it is not necessary to review all these concepts before using the Dataset Module. However, if you are deploying production projects that require data exchange with SQL databases, we recommend reviewing these topics and how they may apply to your specific solution architecture.


Execution Process and Database Connections

The execution of the Dataset module, i.e., the execution of the configurations made in Queries, Tables, and Files, runs in an independent process, "TRunModule.exe (Dataset)", which reads all the project configurations related to this module but does not directly interface with the databases. The services that access the databases reside within TServer, so all connections and calls to the database occur in TServer; the Dataset module merely consumes this database access service from TServer.

Database Connections

For each configured DB, a single access connection is created, meaning that there is only one execution thread per DB. If parallel execution of commands against the same database is needed, you can create more than one DB pointing to the same database. The Dataset module is a vital component of the system, functioning independently via the "TRunModule.exe (Dataset)" process. It does not directly interact with the databases; instead, it reads the project configurations related to the module and utilizes services within TServer to access the databases. As a result, all database connections and calls occur in TServer.


Database Interactions: Synchronous, Asynchronous and Distributed Access

The Dataset module provides two critical methods for executing data requests:

  • Synchronous
  • Asynchronous

Understanding the differences between these methods, their default behaviors, and when to use each one is crucial for optimizing performance and avoiding potential bottlenecks in your project.

Asynchronous Execution

This execution method is generally more performant and can help avoid potential locks. When trigger properties in both Tables and Queries, such as the "Select" property, are set, the execution becomes asynchronous. After the trigger, execution continues normally while, in parallel, the property is propagated from the trigger point (be it a screen or a script) to the server. It then synchronizes with the Dataset module, which calls upon the necessary services from TServer to interface with the database. When TServer returns the result of the Select to the Dataset module, the result is updated in the module's properties and attributes, undergoes the mapping treatment with tags, and is subsequently made available to the entire project.
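A minimal code-behind sketch of this flow follows. The query name (Query1), the @Dataset access path, and the toggle idiom used to fire the trigger are illustrative assumptions; the actual names come from your project configuration.

    // Illustrative sketch: firing an asynchronous Select from C# code-behind.
    // "Query1" and the trigger idiom below are assumptions, not a fixed API.
    public void RequestDataAsync()
    {
        // Writing to the trigger property starts the request; execution does
        // not block here. The statement is propagated to the server, executed
        // via TServer, and the result is later written to the module's
        // properties and to any tags configured in the Mapping.
        @Dataset.Query.Query1.Select = !@Dataset.Query.Query1.Select;
    }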

Synchronous Execution

Certain methods, by contrast, have a synchronous execution. For example, when calling the SelectCommand method, the execution is paused at that point until the Dataset module carries out the service call to TServer, which interfaces with the database and returns the result to the Dataset module. The Dataset module then performs the tag mapping and returns the result, allowing the execution to resume.
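For contrast, a hedged sketch of the synchronous path is shown below; the query name and access path are again assumptions, while SelectCommand and the returned .NET DataTable are as described above.

    using System.Data;

    // Illustrative sketch: a blocking call. Execution pauses on SelectCommand
    // until TServer completes the database round-trip and mapping is done.
    public DataTable RequestDataSync()
    {
        DataTable result = @Dataset.Query.Query1.SelectCommand();
        return result;  // also stored in the module's LocalContents property
    }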

It's essential to exercise caution when making synchronous calls within mono-threaded code-behind screens, whether they are WPF or HTML5. While synchronous execution can be appropriate in some instances, overuse can lead to performance issues. Mechanisms like the Script Task or Action Server Request can facilitate asynchronous execution without the need to use attributes, which is particularly useful when the result needs to be handled even after an asynchronous execution has been completed.

Client vs. Server Tags in Mapping Requests and Results

In the Dataset Module, it's possible to utilize tags from the client domain during tag mapping. This capability provides additional flexibility when managing data in your project. When executing a synchronous or asynchronous command, the Dataset module, not the TServer, carries out the tag assignment within the original domain of the call. This means that if a call is initiated from a client (e.g., a screen or client script), the mapping operation between the received values and the tags occurs after TServer returns the result to the Dataset module, and client domain tags can be used in the mapping process. The choice of tag domain is essential for controlling the scope of data availability. If you wish the data to be available project-wide, server domain tags are the appropriate choice. However, if the data's scope should be confined to the client that initiated the call, client domain tags would be the best fit.

This detailed understanding of the functionality of client and server domain tags in the Dataset Module's tag mapping process can enhance your project's data management efficiency and effectiveness. Selecting the correct tag domain based on your project's specific requirements is crucial, as it significantly impacts data accessibility and overall project performance.


Managing Data and Concurrency within the Dataset Module

Utilizing DataTable Type Tags

The Dataset Module's Select methods return a .Net DataTable object. This object can be directly manipulated in the script or assigned to a tag of the DataTable type, enabling its use across different modules like clients or reports.

In scenarios where mapping is not employed to assign specific row column values to Tags, the returned DataTable can be directly manipulated within the script. Alternatively, by assigning the DataTable object to a DataTable type tag, the results can be propagated to other modules. This allows the data to be conveniently presented in data grids, pie charts, or even XY chart presentations via the CallBack GetSamples method.

Furthermore, Scripts provide methods (TK namespace) that facilitate the conversion between an Array of Template Tags and DataTables, and vice versa, or even to .NET objects. Certain graphical elements, such as DataGrids, handle the data source similarly, whether it's a DataTable type tag, the result of a Dataset.SQL.Query command, or an array of Template Tags.
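The sketch below illustrates the basic pattern of distributing a query result through a DataTable-type tag, as described above. The query name (Query1) and the tag name (Tag.ResultsTable) are placeholders for whatever exists in your project.

    using System.Data;

    // Illustrative sketch: propagating a query result to other modules.
    public void PublishResults()
    {
        // The synchronous call returns a .NET DataTable.
        DataTable dt = @Dataset.Query.Query1.SelectCommand();

        // Assigning it to a DataTable-type tag propagates the result to
        // other modules, e.g., data grids, pie charts, or XY charts.
        @Tag.ResultsTable = dt;
    }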

Understanding and efficiently utilizing this aspect of the Dataset Module is pivotal for effective query result management and distribution across your project.

Server Domain Attributes in Dataset Module

It's crucial to understand that all the attributes within the Dataset Module are server domain. Consequently, when multiple tasks or clients interact with the same attributes, there can be competition, which may cause confusion during execution.

Consider a scenario where a client and a Task are setting the SQLStatement of a query simultaneously, each providing a different value. Given that these attributes are in the server domain, the last value set in the SQLStatement will be the one used during command execution. This rule applies even when triggering a property that is also in the server domain, such as the Select property.
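The sketch below restates that race in code form; the query name and SQL statements are placeholders.

    // Client A sets the statement:
    @Dataset.Query.Query1.SQLStatement = "SELECT * FROM Orders WHERE Region = 'East'";
    // Task B, at nearly the same time, sets it too:
    @Dataset.Query.Query1.SQLStatement = "SELECT * FROM Orders WHERE Region = 'West'";
    // SQLStatement is a single server-domain attribute shared by every client
    // and task, so whichever assignment lands last is the one used when the
    // Select property is subsequently triggered, by either caller.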

Understanding this aspect of the Dataset Module is essential for managing concurrent operations effectively, avoiding potential conflicts and ensuring the correct value is used in operations.

Understanding Dataset Module Properties

The Dataset module consists of numerous server domain properties, including AsyncContents, a .Net DataTable type property. Upon triggering the Select property, which initiates an asynchronous execution, the values from each column of the first row are assigned to the tags configured in the Mapping.

When the Next property is triggered, the Dataset module navigates to the succeeding row, assigning the column values to the respective tags as per the Mapping configuration. The DataTable used for this navigation process is stored in the AsyncContents property, which is updated each time a Select command is executed.

Furthermore, the CursorIndex property can be manipulated to jump to or reverse to a specific position in the row navigation, providing enhanced control over data traversal.

For synchronous commands, the DataTable is returned by the method and simultaneously stored in the LocalContents property. Each execution of a synchronous command overwrites the LocalContents property with its result. It's important to note this behavior to ensure data integrity during synchronous operations.
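A hedged row-navigation sketch follows; the query name and the toggle idiom used to fire the Select and Next triggers are assumptions, while AsyncContents and CursorIndex behave as described above.

    // Illustrative sketch: row navigation over AsyncContents.
    @Dataset.Query.Query1.Select = !@Dataset.Query.Query1.Select; // async select; result stored in AsyncContents
    // ...after the result arrives, mapped tags hold the first row's values.
    @Dataset.Query.Query1.Next = !@Dataset.Query.Query1.Next;     // advance one row; mapped tags update
    @Dataset.Query.Query1.CursorIndex = 0;                        // or jump directly to a specific row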

Understanding these properties and their functions is crucial for effective use of the Dataset module in managing data operations in your project.


Data Management: Store & Forward, Traffic Volume, and SQL Syntax

Store & Forward Considerations

Store & Forward built-in functionality is strictly applicable to databases used for storing Historian and Alarm data. This feature relies on specific control columns found in the schemas of Historian and Alarm, which are not present in generic databases. For comprehensive details on this functionality, please refer to the Store&Forward section in the Historian and Alarm modules' documentation.

Managing Data Traffic Volume

The Dataset>Table's WhereCondition property, much like the Dataset>Query's SQLStatement property, defines the return condition when executing a select command in the database. Consequently, the volume of returned data can vary dramatically - from no rows to the entire table (in the absence of a filter or a WhereCondition). There are no software-imposed limitations on the size of the returned DataTable, but larger tables require more allocated memory and prolong the traffic time between the TServer, Dataset module, and the final destination. Be mindful of the scalability of these tables and ensure your hardware can support the anticipated data volume.
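As a sketch of keeping the returned volume bounded (the table and column names are assumed):

    using System.Data;

    // A narrow WhereCondition keeps the returned DataTable small; an empty
    // condition returns the entire table and must fit in allocated memory.
    @Dataset.Table.Table1.WhereCondition =
        "Timestamp >= '2024-01-01' AND Timestamp < '2024-02-01'";
    DataTable page = @Dataset.Table.Table1.SelectCommand();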

Adhering to Database-Specific SQL Syntax

The Dataset>Queries' SQLStatement property enables the execution of a complete SQL statement, which means SQL syntax specific to each database can be used. For example, SQLite supports the LIMIT clause, while SQLServer uses TOP. Similarly, different separators are used for table and column names across databases. While the software attempts to normalize and use the correct separators when using WhereCondition and filters on objects like alarm objects, in SQLStatement, it's the user's responsibility to use the correct syntax.
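The sketch below expresses the same intent in two dialects; the query name is a placeholder, and the statements are passed through to the database unchanged.

    // SQLite: LIMIT clause, no identifier brackets required.
    @Dataset.Query.Query1.SQLStatement =
        "SELECT * FROM Events ORDER BY Id DESC LIMIT 10";
    // SQL Server: TOP clause and bracketed identifiers.
    @Dataset.Query.Query1.SQLStatement =
        "SELECT TOP 10 * FROM [Events] ORDER BY [Id] DESC";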


Security Measures: Preventing SQL Injection

Preventing SQL Injection Attacks

When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially if parameters are passed as plain text in SQL statements. To mitigate this, we utilize the .Net API where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:
"execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE"
This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
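Applied to a concrete (hypothetical) procedure, the documented syntax looks like the sketch below; the procedure name, parameter names, and query name are examples only.

    // Each {Tag.x} value is bound as a .NET parameter rather than concatenated
    // into the SQL text, so it is interpreted strictly as a value.
    @Dataset.Query.Query1.SQLStatement =
        "execute MyProcedure @orderId={Tag.Tag1}, @label=\"{Tag.TagStr}\", " +
        "@result={Tag.OutValue} OUTPUT, @RETURN_VALUE={Tag.RetValue} RETURNVALUE";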


Network Gateway Access and Time Zone Handling

Using ServerIP for Network Gateway Database Access

In some scenarios, certain databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided that our software is installed on that machine. In such cases, only TWebServer would be active, handling database interaction, receiving commands, establishing connections, and so forth. This method offers a workaround for situations where database access is restricted to specific machines.

Handling Time Zones

All Date/Time type tags in our software are treated as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If it's necessary to read/write dates in the local time zone, account for the time difference when preparing the SQL statement or using the WhereCondition property. Other external data tables, from third parties or the client, can use any time zone. To assist in handling these data, you can configure the "DateTimeMode" column with the type of time zone used in the database (UTC or LocalTime), and the system will automatically convert it when reading from and writing to tags.
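A small sketch of the local-to-UTC adjustment when composing a condition (the table and column names are assumed):

    using System;

    // Date/Time tags and historian/alarm timestamps are UTC, so convert any
    // local-time boundary before it goes into the statement.
    DateTime localStart = new DateTime(2024, 5, 1, 8, 0, 0, DateTimeKind.Local);
    DateTime utcStart = localStart.ToUniversalTime();
    @Dataset.Table.Table1.WhereCondition =
        $"Timestamp >= '{utcStart:yyyy-MM-dd HH:mm:ss}'";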


Dataset Module Performance

The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of the SQL queries being executed. The software itself is designed to minimize overhead and execute queries as efficiently as possible, but ultimately the performance of the Dataset module is tied to these external factors. It's important to design your database schema and queries with performance in mind, and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.


Error Handling

Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the module's Error property (LastStatus) will be updated with the error message. This property can be monitored to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the method returns an empty DataTable and the Error property is updated; alternatively, you can call methods like SelectCommandWithStatus, where the status is provided as an output parameter of the method.
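Both paths are sketched below; the query name is a placeholder, and the exact SelectCommandWithStatus signature should be checked against your version's reference.

    using System.Data;

    DataTable dt = @Dataset.Query.Query1.SelectCommand();
    if (dt == null || dt.Rows.Count == 0)
    {
        // On failure the module's error property carries the message.
        string lastError = @Dataset.Query.Query1.LastStatus;
    }
    // Alternative: a variant reporting status through an output parameter
    // (signature assumed):
    // DataTable dt2 = @Dataset.Query.Query1.SelectCommandWithStatus(out string status);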

These are some of the key points you should keep in mind while using the Dataset module. It's a powerful tool that provides a lot of flexibility, but it also requires careful use and understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions, and don't hesitate to seek expert advice if you're unsure about anything.


Backup of Project SQLite Databases

There are several ways to back up SQLite databases in a project. One simple way is to make a copy of the database file itself, but this method may not be suitable if the database is actively being written to or is very large.


Here are a few suggestions for backing up SQLite databases in your project:

SQLite backup command: SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:


"sqlite3 source_database.sqlite .backup backup_file.sqlite"

This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

SQLite Online Backup API: SQLite also provides an Online Backup API that can be used to create a backup of a database file while it is still being used by your application. This API provides several benefits over the backup command, such as the ability to perform incremental backups and the ability to monitor the progress of the backup process.

Here is a link to the SQLite Online Backup API documentation: SQLite Backup API
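From .NET code, one way to use the Online Backup API is through the Microsoft.Data.Sqlite package, as sketched below; the file paths are examples.

    using Microsoft.Data.Sqlite;

    // Copies a live database into a backup file while it remains in use,
    // via SQLite's Online Backup API.
    using (var source = new SqliteConnection("Data Source=source_database.sqlite"))
    using (var destination = new SqliteConnection("Data Source=backup_file.sqlite"))
    {
        source.Open();
        destination.Open();
        source.BackupDatabase(destination);
    }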

Regardless of the method you choose, it's important to ensure that your backup files are stored in a secure location and that they are tested regularly to ensure that they can be used to restore your database if needed.

