Overview

The Dataset Module comprises the Datasets Data Server Service, which receives requests from the various modules, accesses the data sources, and replies to those requests.

To fully leverage the Dataset Module, it is essential to understand various advanced aspects, such as:

  • Managing parallel requests.
  • Making synchronous or asynchronous requests.
  • Mapping data to server- or client-side Tags, and how that affects the application, performance, and other items.

If you are a new user or working on simple projects, reviewing all these concepts before using the Dataset Module is unnecessary. However, if you are deploying production projects that require data exchange with SQL databases, we recommend that you review these topics and how they may apply to your specific solution architecture. On this page, you will find advanced topics to explore to better understand these aspects of the Datasets module.


Databases Connections And Interactions

The Dataset module facilitates efficient database interactions by utilizing TServer services. While it doesn't directly interact with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading the Databases Connections And Interactions page.
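The distinction can be sketched in plain Python (an illustrative model, not the platform API): a synchronous call blocks the caller until the rows arrive, while an asynchronous call runs on a worker and delivers its result only when requested.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def run_query(sql):
    # Each call opens its own throwaway in-memory database so the
    # example is self-contained; a real service would reuse connections.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])
    rows = conn.execute(sql).fetchall()
    conn.close()
    return rows

# Synchronous: the caller waits here until the rows are returned.
rows_sync = run_query("SELECT x FROM t ORDER BY x")

# Asynchronous: the query runs on a worker thread; the caller can keep
# working and only blocks when it actually needs the result.
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(run_query, "SELECT x FROM t ORDER BY x")
    rows_async = future.result()
```

Both paths produce the same rows; what changes is whether the caller is blocked while the database works.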


Managing Data And Concurrency Within The Dataset Module

Utilizing DataTable Type Tags

Data Management

The Dataset Module's Select methods return a .Net Data Table object. You can directly manipulate this object in the script or assign it to a tag of the Data Table type, enabling its use across different modules, such as clients or reports. When mapping is not employed to assign specific row column values to Tags, the returned Data Table can be manipulated directly within the script. Alternatively, by assigning the Data Table object to a Data Table type tag, you can propagate the results to other modules, allowing you to present data conveniently in data grids, pie charts, or even XY chart presentations via the GetSamples callback method.

Furthermore, scripts provide methods (in the TK namespace) that facilitate converting between arrays of template tags, data tables, and even .NET objects. Certain graphical elements, such as Data Grids, handle the data source similarly, whether it's a Data Table type tag, the result of a Dataset.SQL.Query command, or an array of Template Tags.
Understanding and efficiently utilizing this aspect of the Dataset Module is pivotal for effective query result management and distribution across your project.

Server Domain Attributes In Dataset Module

It's crucial to understand that all the attributes within the Dataset Module are in the server domain. Consequently, when multiple tasks or clients interact with the same attributes, concurrent access can cause conflicts during execution. Consider a scenario where a client and a Task set the SQLStatement of a query simultaneously, each providing a different value. Given that these attributes are in the server domain, the last value set in the SQLStatement will be used during command execution. This rule applies even when triggering a property in the server domain, such as the Select property.

Understanding this aspect of the Dataset Module is essential for managing concurrent operations effectively, avoiding potential conflicts, and ensuring that commands use the correct values.
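The last-writer-wins behavior can be illustrated with a minimal Python sketch (class and attribute names are hypothetical, not the platform API): a shared server-side attribute holds whatever value was written last, and execution uses that value.

```python
# Illustrative model of a server-domain attribute shared by all writers.
class QueryObject:
    """Stands in for a server-side query whose SQLStatement is shared."""
    def __init__(self):
        self.sql_statement = None

    def execute(self):
        # The command runs with whatever value the attribute holds *now*.
        return f"executing: {self.sql_statement}"

query = QueryObject()

# A client display and a server Task both set the same shared attribute.
query.sql_statement = "SELECT * FROM alarms"      # written by a client
query.sql_statement = "SELECT * FROM historian"   # written later by a Task

# Execution sees only the last write; the client's value is lost.
result = query.execute()
```

This is why coordinating who sets server-domain attributes, and when, matters in concurrent projects.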

Understanding Dataset Module Properties

The Dataset module comprises several server domain properties, including the Async Contents, which is a .Net Data Table type property. When the user triggers the Select property, it initiates an asynchronous execution, and the tags configured in the Mapping are assigned the values from each column of the first row. Upon triggering the Next property, the Dataset module navigates to the succeeding row and assigns the column values to the respective tags according to the Mapping configuration. The Async Contents property stores the Data Table used for this navigation process, which is updated each time a Select command is executed.

Additionally, the user can manipulate the CursorIndex property to jump forward or back to a specific position in the row navigation, providing enhanced control over data traversal. For synchronous commands, the method returns the Data Table and simultaneously stores it in the Local Contents property. Each execution of a synchronous command overwrites the Local Contents property with its result.
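The Select/Next/CursorIndex navigation described above can be modeled with a short Python sketch (a hypothetical stand-in, not the platform API): a stored result table, a cursor, and mapped tags that receive the current row's column values.

```python
# Illustrative model of asynchronous row navigation over a result table.
class AsyncNavigator:
    def __init__(self, rows, mapped_tags):
        self.rows = rows            # stands in for the Async Contents table
        self.tags = mapped_tags     # tag-name -> value, per the Mapping
        self.cursor_index = 0
        self._apply_row()           # Select maps the first row immediately

    def _apply_row(self):
        # Copy each column of the current row into its mapped tag.
        for column, value in self.rows[self.cursor_index].items():
            self.tags[column] = value

    def next(self):
        # Mirrors triggering the Next property: advance one row.
        if self.cursor_index + 1 < len(self.rows):
            self.cursor_index += 1
            self._apply_row()

    def jump(self, index):
        # Mirrors setting CursorIndex directly, forward or back.
        if 0 <= index < len(self.rows):
            self.cursor_index = index
            self._apply_row()

rows = [{"temp": 20}, {"temp": 21}, {"temp": 22}]
tags = {}
nav = AsyncNavigator(rows, tags)   # "Select": first row mapped to tags
nav.next()                         # "Next": second row mapped
nav.jump(0)                        # "CursorIndex": back to the first row
```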

It's important to note this behavior to ensure data integrity during synchronous operations. To effectively manage data operations in your project using the Dataset module, it's crucial to understand these properties and their functions.

Data Management: Store & Forward, Traffic Volume, and SQL Syntax

Store & Forward Considerations

Store & Forward built-in functionality is strictly applicable to databases used for storing Historian and Alarm data. This feature relies on specific control columns found in the schemas of Historian and Alarm, which are not present in generic databases. For comprehensive details on this functionality, please refer to the Store&Forward section in the Historian and Alarm modules' documentation.

Managing Data Traffic Volume

The Dataset>Table's WhereCondition property, much like the Dataset>Query's SQLStatement property, defines the return condition when executing a select command in the database. Consequently, the volume of returned data can vary dramatically, from no rows to the entire table (in the absence of a filter or a WhereCondition). There are no software-imposed limitations on the size of the returned DataTable, but larger tables require more allocated memory and prolong the traffic time between the TServer, the Dataset module, and the final destination. Be mindful of the scalability of these tables and ensure your hardware can support the anticipated data volume.

Adhering to Database-Specific SQL Syntax

The Dataset>Queries' SQLStatement property enables the execution of a complete SQL statement, which means SQL syntax specific to each database can be used. For example, SQLite supports the LIMIT clause, while SQL Server uses TOP. Similarly, different separators are used for table and column names across databases. While the software attempts to normalize and use the correct separators when using WhereCondition and filters on objects such as alarms, in SQLStatement it is the user's responsibility to use the correct syntax.
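A small Python sketch makes the dialect differences concrete (the dialect names and function are illustrative, not an exhaustive list of supported databases): SQLite bounds rows with LIMIT and quotes identifiers with double quotes, while SQL Server uses TOP and square brackets.

```python
# Build a row-limited SELECT using each dialect's own syntax.
def limited_select(dialect, table, max_rows):
    if dialect == "sqlite":
        # SQLite: LIMIT clause, identifiers in double quotes.
        return f'SELECT * FROM "{table}" LIMIT {max_rows}'
    if dialect == "sqlserver":
        # SQL Server: TOP clause, identifiers in square brackets.
        return f"SELECT TOP {max_rows} * FROM [{table}]"
    raise ValueError(f"unknown dialect: {dialect}")

print(limited_select("sqlite", "alarms", 10))
# SELECT * FROM "alarms" LIMIT 10
print(limited_select("sqlserver", "alarms", 10))
# SELECT TOP 10 * FROM [alarms]
```

Bounding the row count this way is also a practical lever for the traffic-volume concern discussed above.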

The Dataset Module offers versatile methods for managing data and concurrency within projects, including Data Table tags and Async Contents. Understand how to avoid conflicts and optimize performance by reading the Data Management page.


Datasets Runtime Attributes

The Datasets Namespace exposes properties and methods from the .NET objects used by the Dataset Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms. The Dataset Runtime Attributes page lists all the available options.


Preventing SQL Injections

When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially when parameters are passed as plain text in SQL statements. To mitigate this, we utilize the .Net API, where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:

Code Block
execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE

This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
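The same principle can be demonstrated with Python's sqlite3 module (SQLite has no stored procedures, so a plain statement stands in; the table and input are invented for the demo): a bound parameter travels as a value, never as part of the SQL text.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Hostile input that breaks out of a string literal if the statement
# is assembled by concatenation.
user_input = "alice' OR '1'='1"

# Unsafe: the input is spliced into the SQL text itself, so the
# injected OR clause matches every row.
unsafe_sql = f"SELECT count(*) FROM users WHERE name = '{user_input}'"
unsafe_count = conn.execute(unsafe_sql).fetchone()[0]

# Safe: the input is passed as a bound parameter and is compared
# literally, so nothing matches.
safe_count = conn.execute(
    "SELECT count(*) FROM users WHERE name = ?", (user_input,)
).fetchone()[0]
```

Here `unsafe_count` is 1 (the injection succeeded) while `safe_count` is 0 (the whole hostile string was treated as an ordinary value).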


Network Gateway Access And Time Zone Handling

Using ServerIP for Network Gateway Database Access

In some scenarios, certain databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided you install our platform on that machine. Only TWebServer would be active in such cases, handling database interaction, receiving commands, establishing connections, and so forth. This method offers a workaround for situations where database access is restricted to specific machines.

Handling Time Zones

The platform treats all Date and Time type Tags as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If it's necessary to read and write dates in the local timezone, account for the time difference when preparing the SQL statement or using the WhereCondition property. Other external data tables, from third parties or the client, can use any kind of timezone. To assist in handling these data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading and writing to Tags.
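The DateTimeMode idea can be sketched in Python (the mode names and the fixed UTC-3 "local" zone are assumptions chosen to keep the example deterministic, not platform values): values stored as local time must be shifted when read into UTC-based tags, while values stored as UTC pass through unchanged.

```python
from datetime import datetime, timezone, timedelta

# Assumed local zone for the sketch; a real system would use the
# machine's configured time zone.
LOCAL = timezone(timedelta(hours=-3))

def read_as_utc(stored, datetime_mode):
    """Interpret a naive database value according to its DateTimeMode."""
    if datetime_mode == "UTC":
        # Value is already UTC; just attach the zone.
        return stored.replace(tzinfo=timezone.utc)
    if datetime_mode == "LocalTime":
        # Interpret the value in the local zone, then convert to UTC.
        return stored.replace(tzinfo=LOCAL).astimezone(timezone.utc)
    raise ValueError(f"unknown mode: {datetime_mode}")

stored = datetime(2024, 1, 1, 12, 0)            # naive value from the DB
utc_direct = read_as_utc(stored, "UTC")         # stays 12:00 UTC
utc_shifted = read_as_utc(stored, "LocalTime")  # 12:00 UTC-3 -> 15:00 UTC
```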


Dataset Module Performance

The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of the SQL queries being executed. The platform is designed to minimize overhead and execute queries as efficiently as possible, but ultimately the performance of the Dataset module is tied to these external factors. It is essential to design your database schema and queries with performance in mind, and to consider investing in high-quality hardware and network infrastructure to ensure optimal performance.


Error Handling

Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the module's Error property (LastStatus) is updated with the error message. You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the method returns an empty Data Table and updates the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status is an output parameter of the method.
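A minimal Python sketch of the two reporting styles described above (the class and method names are hypothetical stand-ins, not the platform API): a failed synchronous call yields an empty result and records the message in a last-status field, and a `...WithStatus`-style variant hands the status back to the caller directly.

```python
import sqlite3

class DatasetQuery:
    """Illustrative model of a query object with a LastStatus-like field."""
    def __init__(self, conn):
        self.conn = conn
        self.last_status = ""   # stands in for the Error/LastStatus property

    def select(self, sql):
        try:
            rows = self.conn.execute(sql).fetchall()
            self.last_status = ""
            return rows
        except sqlite3.Error as exc:
            # Record the message and return an empty result on failure.
            self.last_status = str(exc)
            return []

    def select_with_status(self, sql):
        # Mirrors a SelectCommandWithStatus-style call: the status is
        # returned to the caller instead of only being stored.
        rows = self.select(sql)
        return rows, self.last_status

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
query = DatasetQuery(conn)

# Querying a table that does not exist: empty rows plus an error message.
rows, status = query.select_with_status("SELECT x FROM missing_table")
```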

These are some of the key points you should keep in mind while using the Dataset module. It's a powerful tool that provides much flexibility, but it also requires careful use and an understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions, and don't hesitate to seek expert advice if you're unsure about anything.


Backup of Project SQLite Databases

There are several ways to back up SQLite databases in a project. One simple approach is to copy the database file itself, but this method may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your project:

  • SQLite backup command: SQLite provides a built-in .backup command that can be used to create a backup of a database file. The syntax for this command is as follows:

    Code Block
    sqlite3 source_database.sqlite ".backup backup_file.sqlite"

    This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

  • SQLite Online Backup API: SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API provides several benefits over the backup command, such as the ability to perform incremental backups and to monitor the progress of the backup process. Here is a link: SQLite Online Backup API.

Regardless of your chosen method, it's crucial to store your backup files in a secure location and test them regularly to ensure you can use them to restore your database if needed.
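For reference, Python exposes SQLite's Online Backup API through `sqlite3.Connection.backup`. The sketch below uses in-memory databases so it is self-contained; in a real project the source and target would be files such as `source_database.sqlite` and `backup_file.sqlite`.

```python
import sqlite3

# Source database with some sample data (invented for the demo).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE readings (value REAL)")
source.executemany("INSERT INTO readings VALUES (?)", [(1.5,), (2.5,)])
source.commit()

# Target of the backup; use sqlite3.connect("backup_file.sqlite")
# to back up onto disk instead.
target = sqlite3.connect(":memory:")

def show_progress(status, remaining, total):
    # Hook for progress reporting, e.g. logging remaining pages.
    pass

# pages=1 copies one page per step, so concurrent writers are only
# briefly blocked; the callback runs after each step.
source.backup(target, pages=1, progress=show_progress)

copied = target.execute("SELECT count(*) FROM readings").fetchone()[0]
```

Because the copy proceeds page by page, the source database stays usable while the backup runs, which is the main advantage over copying the file.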

