Overview

The Datasets Module comprises the Datasets Data Server Service, which receives requests from the various modules, accesses the data sources, and replies to those requests.

To fully leverage the Datasets Module, it is essential to understand various advanced aspects, such as:

  • Managing parallel requests.
  • Making synchronous or asynchronous requests.
  • Mapping data to server-side or client-side tags, the application, performance, and other items.

If you are a new user or working on simple projects, reviewing all these concepts before using the Dataset Module is unnecessary. However, if you are deploying production projects that require data exchange with SQL databases, we recommend reviewing these topics and how they may apply to your specific solution architecture. On this page, you will find advanced topics to better understand these aspects of the Datasets Module.



Datasets Module Execution

Execution Process and Database Connections

The execution of the Dataset Module, i.e., the execution of configurations made in Queries, Tables, and Files, runs in an independent process, "TRunModule.exe (Dataset)". This process reads all the project configurations related to the module but does not directly interface with the databases. Database access services are provided by TServer: connections and calls to the database occur within TServer, and the Dataset Module merely consumes these services.

Database Connections

For each configured DB, only one access connection is created for that database, meaning there is only one execution thread per DB. If parallel execution of commands against the same database is needed, you can create more than one DB pointing to the same database. Because the Dataset Module does not interact with databases directly, all database connections and calls occur in TServer.

Database Interactions: Synchronous, Asynchronous and Distributed Access

The Dataset module provides two critical methods for executing data requests: 

  • Synchronous
  • Asynchronous

Understanding the differences between these methods, their default behaviors, and when to use each is crucial for optimizing performance and avoiding potential bottlenecks in your project.

Asynchronous Execution

This method is generally more performant and can help avoid potential locks. When properties in both Tables and Queries, such as the Select property, are triggered, the execution becomes asynchronous. After the trigger, execution continues while the property is propagated in parallel from the trigger point (a screen or a script) to the server. It then synchronizes with the Dataset Module, which calls the necessary TServer services to interface with the database. When TServer returns the result of the Select to the Dataset Module, the result is updated in the module's properties and attributes, undergoes the mapping treatment with tags, and is made available to the entire project.

Synchronous Execution

Certain methods have a Synchronous Execution. For example, when calling the SelectCommand method, the execution is paused at this point until the Dataset module carries out the service call to TServer, which interfaces with the database and returns the result to the Dataset module. The Dataset module then performs tag mapping and returns the result, allowing the execution to resume.

It's essential to exercise caution when making synchronous calls within single-threaded code-behind screens, whether WPF or HTML5. While synchronous execution can be appropriate in some instances, overuse can lead to performance issues. Mechanisms like the Script task or Action Server Request can facilitate asynchronous execution without the need to use attributes, which is particularly useful when the result needs to be handled even after an asynchronous execution has completed.
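The difference between the two modes can be illustrated with a generic sketch. This is plain Python, not the platform's scripting API; `run_query` is a hypothetical stand-in for the database round trip that TServer performs:

```python
import threading
import time

def run_query(sql):
    # Hypothetical stand-in for a database round trip handled by TServer.
    time.sleep(0.1)
    return [("row1",), ("row2",)]

# Synchronous: the caller blocks until the result is back.
rows = run_query("SELECT * FROM items")
print(len(rows))  # the calling thread waited for the full round trip

# Asynchronous: the call is triggered, the caller keeps running,
# and a callback handles the result whenever it arrives.
def on_done(result):
    print("received", len(result), "rows")

worker = threading.Thread(
    target=lambda: on_done(run_query("SELECT * FROM items")))
worker.start()
# ... the caller continues doing other work here ...
worker.join()
```

In a single-threaded screen, the blocking call above would freeze the UI for the whole round trip, which is why the asynchronous pattern is usually preferred there.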

Client vs Server Tags in Mapping Results

In the Dataset Module, it's possible to utilize tags from the client domain during tag mapping. This capability provides additional flexibility when managing data in your project. When executing a synchronous or asynchronous command, the Dataset module, not the TServer, carries out the tag assignment within the original domain of the call. Suppose a call is initiated from a client (e.g., a screen or client script). In that case, the mapping operation between the received values and the tags occurs after TServer returns the result to the Dataset module. In this scenario, client domain tags can be used in mapping.

The choice of tag domain is essential for controlling the scope of data availability. If you wish the data to be available project-wide, server domain tags are the appropriate choice. However, if the data's scope should be confined to the client that initiated the call, client domain tags would be the best fit.

This detailed understanding of the functionality of client and server domain tags in the Dataset Module's tag mapping process can enhance your project's data management efficiency and effectiveness. Selecting the correct tag domain based on your project's specific requirements is crucial, as it significantly impacts data accessibility and overall project performance.

Managing Data and Concurrency within the Dataset Module

Utilizing DataTable Type Tags

The Dataset Module's Select methods return a .Net DataTable object. This object can be directly manipulated in the script or assigned to a tag of the DataTable type, enabling its use across different modules like clients or reports.

In scenarios where mapping is not employed to assign specific row column values to Tags, the returned DataTable can be directly manipulated within the script. Alternatively, by assigning the DataTable object to a DataTable type tag, the results can be propagated to other modules. This allows the data to be conveniently presented in data grids, pie charts, or even XY chart presentations via the CallBack GetSamples method.

Furthermore, Scripts provide methods (TK namespace) that facilitate the conversion between an Array of Template Tags and DataTables, and vice versa, or even to .NET objects. Certain graphical elements, such as DataGrids, handle the data source similarly, whether it's a DataTable type tag, the result of a Dataset.SQL.Query command, or an array of Template Tags.

Understanding and efficiently utilizing this aspect of the Dataset Module is pivotal for effective query result management and distribution across your project.

Server Domain Attributes in Dataset Module

It's crucial to understand that all the attributes within the Dataset Module are server domain. Consequently, when multiple tasks or clients interact with the same attributes, there can be competition, which may cause confusion during execution.

Consider a scenario where a client and a Task are setting the SQLStatement of a query simultaneously, each providing a different value. Given that these attributes are in the server domain, the last value set in the SQLStatement will be the one used during command execution. This rule applies even when triggering a property that is also in the server domain, such as the Select property.
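The race described above amounts to last-writer-wins on a single shared value. A generic sketch (plain Python, not the platform API; `server_attributes` and `set_sql` are hypothetical names for illustration):

```python
# One shared, server-side copy of the attribute exists for everyone.
server_attributes = {}

def set_sql(statement):
    # Last writer wins: every caller updates the same server-domain value.
    server_attributes["SQLStatement"] = statement

set_sql("SELECT * FROM recipes")   # written by a client display
set_sql("SELECT * FROM alarms")    # written by a server task moments later

# Whichever value was set last is the one the command executes with.
print(server_attributes["SQLStatement"])
```

To avoid this, coordinate writers or use separate query objects per independent caller.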

Understanding this aspect of the Dataset Module is essential for managing concurrent operations effectively, avoiding potential conflicts and ensuring the correct value is used in operations.

Understanding Dataset Module Properties

The Dataset module consists of numerous server domain properties, including AsyncContents, a .Net DataTable type property. Upon triggering the Select property, which initiates an asynchronous execution, the values from each column of the first row are assigned to the tags configured in the Mapping.

When the Next property is triggered, the Dataset module navigates to the succeeding row, assigning the column values to the respective tags as per the Mapping configuration. The DataTable used for this navigation process is stored in the AsyncContents property, which is updated each time a Select command is executed.

Furthermore, the CursorIndex property can be manipulated to jump to or reverse to a specific position in the row navigation, providing enhanced control over data traversal.
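These navigation semantics can be sketched generically. This is plain Python, not the platform's API; in the real module, each column of the current row is assigned to its mapped tag:

```python
# Minimal sketch of Select / Next / CursorIndex semantics (hypothetical names).
rows = [
    {"id": 1, "value": 10.0},
    {"id": 2, "value": 20.0},
    {"id": 3, "value": 30.0},
]

cursor_index = 0
tags = dict(rows[cursor_index])   # Select: map the first row's columns to tags

cursor_index += 1                 # Next: advance to the following row
tags = dict(rows[cursor_index])

cursor_index = 2                  # CursorIndex: jump directly to a given row
tags = dict(rows[cursor_index])
print(tags["value"])  # 30.0
```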

For synchronous commands, the DataTable is returned by the method and simultaneously stored in the LocalContents property. Each execution of a synchronous command overwrites the LocalContents property with its result. It's important to note this behavior to ensure data integrity during synchronous operations.

Understanding these properties and their functions is crucial for effective use of the Dataset module in managing data operations in your project.

Data Management: Store & Forward, Traffic Volume, and SQL Syntax

Store & Forward Considerations

Store & Forward built-in functionality is strictly applicable to databases used for storing Historian and Alarm data. This feature relies on specific control columns found in the schemas of Historian and Alarm, which are not present in generic databases. For comprehensive details on this functionality, please refer to the Store&Forward section in the Historian and Alarm modules' documentation.

Managing Data Traffic Volume

The Dataset>Table's WhereCondition property, much like the Dataset>Query's SQLStatement property, defines the return condition when executing a select command in the database. Consequently, the volume of returned data can vary dramatically - from no rows to the entire table (in the absence of a filter or a WhereCondition). There are no software-imposed limitations on the size of the returned DataTable, but larger tables require more allocated memory and prolong the traffic time between the TServer, Dataset module, and the final destination. Be mindful of the scalability of these tables and ensure your hardware can support the anticipated data volume.

Adhering to Database-Specific SQL Syntax

The Dataset>Queries' SQLStatement property enables the execution of a complete SQL statement, which means SQL syntax specific to each database can be used. For example, SQLite supports the LIMIT clause, while SQLServer uses TOP. Similarly, different separators are used for table and column names across databases. While the software attempts to normalize and use the correct separators when using WhereCondition and filters on objects like alarm objects, in SQLStatement, it's the user's responsibility to use the correct syntax.
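For instance, row-limiting syntax differs across engines. The sketch below uses SQLite's LIMIT clause via Python's built-in `sqlite3` module; on SQL Server the equivalent statement would be `SELECT TOP 10 id, value FROM readings`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO readings (value) VALUES (?)",
                 [(i * 1.5,) for i in range(100)])

# SQLite uses LIMIT; SQL Server would use SELECT TOP 10 ... instead.
rows = conn.execute("SELECT id, value FROM readings LIMIT 10").fetchall()
print(len(rows))  # 10
```

Limiting the result set like this also keeps the returned DataTable small, which helps with the traffic-volume concerns described above.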

Datasets Module Execution

The Dataset Module facilitates efficient database interactions by utilizing TServer services. While it doesn't directly interact with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading more on the Datasets Module Execution page.


Data Management

The Dataset Module offers versatile methods for managing data and concurrency within solutions, including DataTable tags and AsyncContents. Understand how to avoid conflicts and optimize performance by reading more on the Data Management page.


Datasets Runtime Attributes

The Datasets Namespace exposes properties and methods from the .NET objects used by the Datasets Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms. The Dataset Runtime Attributes page lists all available options.


Preventing SQL Injection

When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially when parameters are passed as plain text in SQL statements. To mitigate this, the platform uses the .NET API, where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:

Code Block
execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE

This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
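The same principle, in generic terms: pass values as bound parameters instead of concatenating them into the SQL text. The sketch below uses Python's built-in `sqlite3` module purely for illustration; the platform itself relies on the .NET parameter API as described above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a typical injection attempt

# Unsafe (don't do this): string concatenation lets the input
# rewrite the SQL statement itself:
#   "SELECT role FROM users WHERE name = '" + user_input + "'"

# Safe: the driver treats the bound parameter strictly as a value.
rows = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()
print(rows)  # [] -- the injection attempt matches no user
```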


Network Gateway Access and Time Zone Handling

Using ServerIP for Network Gateway Database Access

In some scenarios, certain databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided you install our platform on that machine. Only TWebServer would be active on that machine, handling database interaction, receiving commands, establishing connections, and so forth. This method offers a workaround for situations where database access is restricted to specific machines.

Handling Time Zones

The platform treats all Date and Time type tags as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If it is necessary to read and write dates in the local timezone, account for the time difference when preparing the SQL statement or when using the WhereCondition property. Other external data tables, from third parties or the client, can use any kind of timezone. To assist in handling this data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading and writing to tags.
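The required conversion is simple UTC-offset arithmetic. For example, in plain Python (independent of the platform), assuming a hypothetical external table that stores local time at UTC-3:

```python
from datetime import datetime, timezone, timedelta

# Tag values are UTC; suppose the external table stores local time (UTC-3).
local_tz = timezone(timedelta(hours=-3))

utc_value = datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc)

# Converting a UTC tag value to the database's local time before writing:
local_value = utc_value.astimezone(local_tz)
print(local_value.hour)  # 9

# Converting a local-time value read from the database back to UTC:
back_to_utc = local_value.astimezone(timezone.utc)
print(back_to_utc == utc_value)  # True
```

Setting the DateTimeMode column to the database's timezone type lets the system perform this conversion automatically.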


Dataset Module Performance

The Dataset Module's performance depends on many factors, including database performance, network latency, and the complexity of the SQL queries being executed. The platform minimizes overhead and executes queries as efficiently as possible, but ultimately the performance of the Dataset Module is tied to these external factors. It's essential to design your database schema and queries with performance in mind, and to consider investing in high-quality hardware and network infrastructure to ensure optimal performance.


Error Handling

Error handling in the Dataset Module is straightforward. If an error occurs during the execution of a command, the module's Error property (LastStatus) is updated with the error message. You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the method returns an empty DataTable and updates the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status is an output parameter of the method.

These are some of the key points to keep in mind while using the Dataset Module. It is a powerful tool that provides a lot of flexibility, but it also requires careful use and an understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions, and seek expert advice if you need clarification on anything.


Backup of Solution SQLite Databases

There are several ways to back up SQLite databases in a solution. One simple method is to copy the database file itself, but this may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your solution:

  • SQLite backup command: SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:

    Code Block
    sqlite3 source_database.sqlite ".backup backup_file.sqlite"

    This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

  • SQLite Online Backup API: SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API offers several benefits over the backup command, such as the ability to perform incremental backups and to monitor the progress of the backup process. Here is a link: SQLite Backup API.
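As an illustration of the Online Backup API, most SQLite bindings expose it directly; for example, Python's built-in `sqlite3` module wraps it as `Connection.backup`, which can run while the source database is in use:

```python
import sqlite3

# Source database (in-memory here for illustration; normally a file path).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (x INTEGER)")
src.execute("INSERT INTO t VALUES (1), (2), (3)")
src.commit()

# Destination backup database; in practice: sqlite3.connect("backup_file.sqlite")
dst = sqlite3.connect(":memory:")

# Copy the whole database; passing pages=N instead copies incrementally.
src.backup(dst)

rows = dst.execute("SELECT COUNT(*) FROM t").fetchone()
print(rows[0])  # 3
```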

Regardless of the chosen method, it's crucial to store your backup files in a secure location and to test them regularly, ensuring they can be used to restore your database if needed.

