
HTML
<style>
.text-span-6 {
    background-image: linear-gradient(99deg, rgba(170, 163, 239, .5), rgba(125, 203, 207, .5));
    border-radius: 50px;
    padding-left: 15px;
    padding-right: 15px;
}

#title-text {
display: none;
}

.panelgradient {
    background-image: linear-gradient(180deg, #d5def0, whitesmoke);
    border-radius: 8px;
    flex-direction: column;
    justify-content: center;
    align-items: center;
    padding: 4rem;
    display: flex;
    position: relative;
}

</style>


<div class="panelgradient">

<h1 style="text-align: center;">Datasets <br> (Databases and SQL Queries)</h1>

</div>



Introduction to the Datasets Module


The Datasets Module is designed for data exchange with SQL databases and text files from a diverse set of sources. Essentially, the Datasets Module drives bi-directional real-time communication between all modules and the SQL databases.

This module offers compatibility with various database technologies, including ADO.NET, ODBC, OleDB, and native interfaces, providing straightforward configuration with prominent databases such as SQL Server, Oracle, SQLite, and PostgreSQL. Features include:

  • Multi-threaded concurrent connections with multiple databases for efficient data handling
  • SQL Query editor, SQLite admin tool, and a Visual Query Builder, streamlining the configuration experience
  • Customization of SQL statements in real-time with tags and system events


On this page:

Table of Contents
maxLevel3
minLevel2
stylenone


Key Concepts and Terms

DatasetDB

Connections settings created by the Dataset Module to communicate with an external database.

DatasetQuery

Logical name associated with the configuration for SQL query statements with a Database, and its properties and methods for running the queries.

DatasetTable

Logical name created to hold configuration settings to access specific tables in a connected database, mapping tags to table columns for operations.

DatasetFile

Logical name defining parameters for reading and writing files in ASCII, Unicode, or XML formats.


Understanding the Datasets Module

The Datasets Module enables users to interact with SQL databases seamlessly. The module supports real-time Tags within SQL statements, and manages files and recipes in ASCII, Unicode, or XML formats.

The data retrieved from databases can be utilized in various ways throughout your solution. For example:

  • In the Displays Module: Visualization tools like DataGrids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
  • In the Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
  • In the Devices Module: Sending data from field equipment to a SQL database, or applying settings from the database to the field equipment.

Pre-defined Database Connections

The Dataset Module also serves as a data storage configuration hub for other modules. The following Database connections are pre-defined by the Dataset Module.

  • AlarmHistorian: Events and records for long-term retention.
  • TagHistorian: Time-series storage for process variables.
  • RuntimeUsers: Dynamic users and credentials created when running the solution.
  • Retentive: Persistent records for tags and properties that need to be kept across multiple starts of the solution (typically configuration settings and setpoints).

Processing Data Requests

The Datasets Module has its implementation running as a service, which ensures high performance and real-time responses to multiple client requests.

This architecture also enhances protection and security for the database, as client displays and scripts won't access the databases directly, but through the Datasets Service.

Another benefit is the ability for Data Source Virtualization, meaning that when the solution is using Dataset.Query.Query1 in its displays or scripts, the database running that query, along with the query itself, can be maintained or replaced without affecting the overall solution configuration. This feature allows the solution to work with the data, regardless of the underlying data storage technology.

For a deeper understanding of the Datasets Services, see Dataset Advanced Topics.


Configuring the Datasets Module

Configuration Workflow

The typical configuration workflow for the Dataset module has the following sequence:

Datasets Module Configuration Workflow

Action

Where 

Comments

Define database connections

Datasets→DBs

Gather connection details for your application's databases and create DB objects as needed. Leverage the built-in SQLite admin tool for temporary development purposes.

Prepare Queries

Datasets→Queries

DataExplorer→SQL

VisualQueryBuilder

Craft queries using the built-in SQL Language Editor, the VisualQueryBuilder, or SQL statements provided from other sources.

Fine-tune queries by adding real-time parameters. E.g., transform "WHERE col1 = 5" into "WHERE col1 = {{tag.Test}}".

Map Database Tables

Datasets→Tables

Optionally, you can establish a direct mapping to tables within the Database. 

Map Recipes and Text files

Datasets→ Files

Optionally, your solution may need to save or load recipes, or other information, from ASCII, Unicode, or XML files. 
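The real-time parameter substitution mentioned in the workflow above ({{tag.Test}} placed inside a SQL statement) is performed by the platform at execution time. As a rough sketch of the idea, outside the platform, a plain string substitution in Python looks like this (the tag name and value are hypothetical):

```python
import re

# Hypothetical tag table standing in for the platform's real-time database.
tags = {"tag.Test": 5}

def resolve(sql: str) -> str:
    """Replace each {{tag.Name}} placeholder with the tag's current value."""
    return re.sub(r"\{\{(.+?)\}\}", lambda m: str(tags[m.group(1)]), sql)

print(resolve("SELECT * FROM data WHERE col1 = {{tag.Test}}"))
# SELECT * FROM data WHERE col1 = 5
```

The platform performs this internally and safely; this sketch only illustrates what "the value of the Tag is added at the proper position" means.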

Managing DB Connections

There are four database connections pre-defined in any new solution.

Datasets DB - Pre-defined database connections

DB

Database

Path Location

Usage

Retentive

SQLite

<ProjectNameAndPath>.dbRetentive

Stores values for the Tags with the Retentive property set.

RuntimeUsers

SQLite

 <ProjectNameAndPath>.dbRuntimeUsers

Stores dynamically created Solution SecurityUsers.

AlarmHistorian

SQLite

 <ProjectNameAndPath>.dbAlarmHistorian

Stores Alarm and AuditTrail records.

TagHistorian

SQLite

<ProjectNameAndPath>.dbTagHistorian

Stores Tag Historian and Annotations.


When using SQLite databases, the Dataset Module can automatically create the database locally if it doesn't already exist. For other database types, the database itself must already exist before you set your connection.

→ Read more about Datasets DBs.
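The automatic creation of SQLite databases described above is standard SQLite behavior: connecting to a file that does not exist creates it. A minimal sketch with Python's sqlite3 module (the file name merely imitates the <ProjectNameAndPath>.dbRetentive pattern; the table is hypothetical):

```python
import os
import sqlite3
import tempfile

# A SQLite database file is created on first connect if it does not exist.
path = os.path.join(tempfile.mkdtemp(), "MySolution.dbRetentive")
assert not os.path.exists(path)

conn = sqlite3.connect(path)  # the file is created here
conn.execute("CREATE TABLE IF NOT EXISTS Retentive (TagName TEXT, Value TEXT)")
conn.commit()
conn.close()

print(os.path.exists(path))  # True
```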

DatasetQueries Configuration

Use the DatasetQueries to define SQL statements, for queries and stored procedures, to execute against the created DatasetDB connections.

Read more about Datasets Queries.

DatasetTables Configuration

Use the DatasetTables to access or exchange data with database tables using a simplified query syntax. It also allows inserting new rows directly into database tables.

Read more about Datasets Tables.

DatasetFiles Configuration

The DatasetFiles are used to customize file interactions in the Dataset Module. With this feature, you can read or write real-time tags to ASCII, Unicode, and XML files.

Read more about Datasets Files.


Working with the Datasets Module

Runtime Execution

When executing the solution, there is an infrastructure of services that manages access to the database and transports that information to where it is requested. For instance, to display a Dataset Query result on an HTML5 page, that request first goes to the server, which then requests the database (which can be on another computer), and the information flows back to the requester.

As database operations can take some time to execute, it is very important to understand some aspects of the Datasets Module execution, including the concept of synchronous vs. asynchronous requests.

The page Datasets Module Execution details concepts that describe the module's internal operations.

Showing DataGrid Tables on Displays

One typical use of the Dataset Module is to display query results on displays and dashboards.

To do so, create a DatasetQuery or a DatasetTable, then use the DataGrid control on your displays.

Using Query Results on Scripts and Tags

It's possible to define the SQL statements with code (either using the Scripts Module or Display CodeBehind) and connect the results with tags.

The property Dataset.Query.Query1.SqlStatement holds the query that will be executed; just modify that property within your scripts.

The Tag type DATATABLE was created to be compatible with the results of Select() statements. Simply assign the query results to a tag and use tags to manage the information.

The TK (Toolkit extension for Scripts) has methods that allow for easy copying between DataTables (query results) and Template Tags, like TK.CopyTagToDataTable().

Monitoring Databases Connection Status

Monitoring Database Connections is an essential aspect of maintaining a reliable operation of the solution. 

This can be accomplished using the Dataset Namespace properties, which provide status for DatasetTables and DatasetQueries operations.

Read more about Datasets Runtime Attributes.

During the development phase, when the Designer tool is connected to a Runtime (the solution is in execution), the main status conditions can be seen in the monitoring page.

Read more about Datasets Monitor.


Datasets Advanced Topics

Datasets Module Execution

The Dataset module facilitates efficient database interactions by utilizing TServer services, managing synchronous and asynchronous executions for optimal performance.

Read more about Databases Connection And Interactions.

Data Management

The Dataset Module offers versatile methods for managing data and concurrency within solutions, including Data Table tags and Async Contents.

Read more about Data Management.

Datasets Runtime Attributes

The Datasets Namespace exposes properties and methods from the .NET objects used by the Datasets Module execution. You can use these properties and methods on your Displays, or to create Scripts and Alarms.

Read more about Datasets Runtime Attributes.

Preventing SQL Injections

See page Datasets Advanced Topics
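The general principle behind preventing SQL injection is to bind user input as parameters instead of concatenating it into the statement. This can be illustrated outside the platform with Python's sqlite3 module (table and values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "admin"), ("bob", "operator")])

# Unsafe: concatenating input lets it be interpreted as SQL.
malicious = "x' OR '1'='1"
unsafe = "SELECT * FROM users WHERE name = '" + malicious + "'"
rows_unsafe = conn.execute(unsafe).fetchall()   # returns ALL rows

# Safe: a bound parameter is treated strictly as a value.
rows_safe = conn.execute("SELECT * FROM users WHERE name = ?",
                         (malicious,)).fetchall()  # returns no rows

print(len(rows_unsafe), len(rows_safe))  # 2 0
```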

Network Gateway Access And Time Zone Handling

See page Datasets Advanced Topics

Backup Of Solutions SQLite Databases

See page Datasets Advanced Topics


Anchor
BestPractices
BestPractices
Best Practices and Troubleshooting

Common Issues and Solutions

Connection loss between project and database

Database Timeout Configuration: The database may have a timeout setting that automatically disconnects idle connections after a certain period. It's recommended to check the database's timeout setting and adjust it, if necessary, to ensure that the connection remains active overnight.

Power Settings: It's also suggested to check the computer's power settings to ensure that it doesn't enter sleep or hibernation mode during idle moments, which could cause a loss of connection to the database. Adjusting these settings to keep the computer active during these idle moments may resolve the issue.

Database Connection Problem

In the DB configuration, there is always a "Test" button to ensure that the connection is happening correctly. When there is a problem, the return of this button is an error message, usually returned by the database provider itself. The most common errors are: invalid user, invalid password, computer without access to the database, incorrect form of authentication.

Issue: Error accessing the Database Table

Once the connection is established, the Table configuration is specific to a table. In the "Table" combobox, the list of available tables automatically appears. It is possible, via script, to change which table will be accessed. However, care must be taken that the table exists and that the configuration is done using the correct name. The same care must be taken when Queries are used, as it is the user's responsibility to type the correct table name, as well as the syntax of the separators.

Error in the Syntax of the Query

It is the user's responsibility to enter a correct SQLStatement for a query, even when using the QueryBuilder. Table names, columns, and values can all generate errors if used incorrectly. For example, comparing different types may not return the expected result, and strings should generally be enclosed in single quotes. The available separators and clauses can vary between databases. For example:

SQLServer

Code Block
languagesql
titleQuery syntax
SELECT TOP 10 * FROM table WHERE column = value

SQLite

Code Block
languagesql
titleQuery syntax
SELECT * FROM table WHERE column = value LIMIT 10;

Oracle

Code Block
languagesql
titleQuery syntax
SELECT * FROM table WHERE column = value AND ROWNUM <= 10;

or, in newer Oracle versions:

Code Block
languagesql
titleQuery syntax
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

IBM DB2

Code Block
languagesql
titleQuery syntax
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;


ServerIP without TWebServer Running on the Remote Machine

In some cases, the computer may not have access to the database. In this case, it is possible to create a gateway, routing the commands to be executed on a computer that does have access to the database. The ServerIP field should be configured with the IP and port (<IP>:<port>) of that computer. That computer must have the software installed, with the TWebServer running; it will automatically perform the gateway service and forward the commands to the database.

DataTable Returned NULL

When a query returns null, an error has occurred. Common errors include: connection failure with the database, table not found, Dataset Module not running, or incorrect query syntax. Check the return of the method using WithStatus when using a synchronous method, or use the LastStatus and LastStatusMessage properties when using asynchronous mode.

DataTable Returned with 0 Rows

When this happens, in general there is a connection with the database and the table name is correct. In this case, the search condition is usually wrong, or the table is actually empty. Check that the column names are correct and that the separators and clauses are valid.

Dataset Module is Down

Although the TServer is responsible for forwarding requests to the database, the management of and communication with the TServer is done by the Dataset Module, as is the treatment of responses. Therefore, if you are having basic problems accessing and executing database operations, the first thing to check is whether the module is set up to run and is actually running.

Very High Response Time

Sometimes it may seem that the database is not being accessed, when in fact a query is returning a very large amount of data, the database is overloaded or poorly configured, or the network itself is overloaded and slow. All of these factors can impact response time. In these cases, it is important to execute the query directly in the database environment to confirm that the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to understand the related side effects.

Update of a table with the wrong schema (select before update)

The Dataset Module uses ADO.NET technology, and many operations are resolved at the level of that API. When performing an Update on a table, the table's schema and the controls in the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or a .NET DataTable object as a parameter, that object must respect the schema of the destination table in the database. Normally, a Select command must have been executed at some point to obtain the correct schema used by the database. After that, it is easy to add, remove, and modify values in the DataTable and update it back to the physical table in the database.
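The select-before-update pattern described above can be sketched outside the platform with Python's sqlite3 module (the table and column names are hypothetical): read the row first so the in-memory copy matches the table's schema, modify the values, then write them back.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipe (id INTEGER PRIMARY KEY, setpoint REAL)")
conn.execute("INSERT INTO recipe VALUES (1, 10.0)")

# Select first, so the in-memory copy carries the table's column layout...
cur = conn.execute("SELECT id, setpoint FROM recipe")
row = dict(zip([d[0] for d in cur.description], cur.fetchone()))

# ...then modify the values and write them back with an UPDATE.
row["setpoint"] = 12.5
conn.execute("UPDATE recipe SET setpoint = ? WHERE id = ?",
             (row["setpoint"], row["id"]))

new_value = conn.execute(
    "SELECT setpoint FROM recipe WHERE id = 1").fetchone()[0]
print(new_value)  # 12.5
```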

Where condition CaseSensitive

Case sensitivity in a WHERE clause depends on the database and the configuration of the database you are using. For example, in MySQL, queries are case-insensitive by default, which means 'abc' and 'ABC' would be considered equal. However, this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, queries are case-sensitive by default, so 'abc' and 'ABC' would be considered different. Therefore, it really depends on the specific database and the settings of that database. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert all values to upper or lower case before comparison. For example:

Code Block
languagesql
titleQuery syntax
SELECT * FROM table WHERE LOWER(column) = LOWER(value);

This query will return records where the column matches the value, regardless of capitalization.
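The LOWER() technique can be verified against SQLite, whose default comparison for ASCII text is case-sensitive. A small sketch with Python's sqlite3 module (table and values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (col TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("abc",), ("ABC",), ("xyz",)])

# Plain equality is case-sensitive for ASCII text in SQLite:
exact = conn.execute("SELECT * FROM t WHERE col = 'abc'").fetchall()

# Folding both sides with LOWER() makes the match case-insensitive:
folded = conn.execute(
    "SELECT * FROM t WHERE LOWER(col) = LOWER('ABC')").fetchall()

print(len(exact), len(folded))  # 1 2
```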

Performance

The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.

Best Practices and Recommendations

Error Handling

Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the error message will update the module's Error property (Last Status). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the process will return an empty DataTable and update the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status will be an output parameter in the method.


In this section:

Easy Heading Macro
headingIndent40
navigationTitleOn this page
selectorh2,h3
wrapNavigationTexttrue
navigationExpandOptiondisable-expand-collapse
This section presents information about Datasets and SQL.

What is the Dataset Module

The Dataset Module enables connecting to an existing external database.

Different providers can be used, such as SQL Server, Oracle, SQLite, PostgreSQL, and others, and they can be configured simply and quickly.


The Dataset Module has many features specifically created for real-time applications.

  • Allows concurrent connections, using multi-threading, with many data sources.
  • Supports various database technologies, including ADO.NET, ODBC, OleDB, and native interfaces with key databases in the market.
  • Built-in editor for SQLite. Easily create SQLite databases using macros like <ProjectName>-Test.db.
  • Visual Query Builder to create and edit queries.
  • Easily adds real-time tags embedded in the query strings.
  • Manages files and recipes in ASCII, Unicode, or XML formats.
  • Built-in Networking Gateway feature that allows safely crossing security zones.
  • Virtualizes the names for queries and tables, creating applications that are agnostic to the storage location.

Key Concepts

Dataset DBs

For each external database the Dataset Module will communicate with, a connection needs to be created with certain parameters. Each connection, created on Edit → Datasets → DBs, is called a Dataset DB in this Module.

Dataset Queries

In the context of this module, when we refer to a Dataset Query, we mean not only the SQL query string, but also the Project object that has a logical name, the SQL query related to that logical name, and other parameters, as defined on Edit → Datasets → Queries. There are many ways to automatically map the result of a query execution to Tags.

Dataset Tables

Similar to the queries, a Dataset Table refers to a logical name, created in the project, to set up access to a specific table in a connected database. The tables in use are listed on Edit → Datasets → Tables. Tags in the real-time database can easily be mapped to columns in the tables for insert, update, or read operations.

Dataset Files

A Dataset File is a logical name that defines parameters for reading and writing files in ASCII, Unicode, or XML formats.

Configuration Workflow 

The typical configuration workflow for the Dataset Module has the following sequence:

Dataset Module Configuration Workflow

Action

Where 

Comments

Create the required database connections (DBs)

Datasets → DBs

Collect the information to connect with the databases required by your Project. Use the built-in SQLite database as a temporary development tool if one of your connected databases is not available yet.

The virtualization model, with logical names for queries and tables, lets your Project work directly with the new connection to the production database without changing anything in the Project configuration other than that database connection.

Prepare the Queries the Project uses

Datasets → Queries

Either using the Visual Query Editor, or getting the query string from IT or a plant facilitator, collect and create the logical names (Dataset.Query) to identify those queries.

Modify the Queries to add real-time tags

Datasets → Queries

Easily modify the query with the parameters that need to be connected with real-time values. For instance, a query that has the text WHERE col1 = 5 can be modified to WHERE col1 = {{tag.Test}}. The value of the Tag will be inserted at the proper position when the query is executed.

Prepare the Tables the Project uses

Datasets → Tables

When you need to insert or modify data, you need to access the database table directly. In some cases, all the information you need is in one table, so there is no need to create a Query. You can easily connect the contents inserted in the table with Tags in the Project.

Configure the Stored Procedures

Datasets → Queries

The Dataset Module can execute Stored Procedures; just define them using the same interface as the queries.

Configure data exchange with Files

Datasets → Files

If it is necessary to exchange Tag values with plain text or XML files, set that configuration.

Use your Dataset logical objects

All Project

The logical object names created for Queries, Tables, and Files can be used in any part of the project. Examples: script calculations, display visualization, and others.

Creating DB Connections

When using SQLite databases, the Dataset Module can automatically create the database if necessary; for other database types, the database itself must already exist before you set up your connection.

Users in any Permission group can create new connections in the Project, but only the Administrator can configure database password logins. See Security and Users for information on Project permissions.

To create a new Database connection:

  • Go to Edit → Datasets → DBs.
  • Click Create New. The Create New Database Connection window displays.
  • Enter or select information, as needed.

  • Click OK. The database is added as a new row in the table.

  • Edit the row fields to modify the required settings.

Dataset DB Configuration Properties 

Column

Description

Name

Enter a name for the database configuration. The system lets you know if the name is not valid.

Provider

Identifies the Provider technology used in this connection.

Database

Identifies the type of database for this connection.

ConnectionString

Enter the information needed to connect with the database. You can use macros in the connection string.

Example: for the filename in a SQLite connection string, use <ProjectName>, which is replaced by the name of the project.

The available macros in the connection string are:  <<<<<. fill >>>>>>>>

LogonName

Enter a valid login name for the database.

LogonPassword

Enter the password that corresponds to the database login. (Only accessible by Administrators)

ServerIP

Optionally, an IP or DNS name for a computer to be used as a Secure Gateway. More information on that at <<< link >>>>>

Description

Enter a description for the database connection.

Customizing Pre-defined Databases

There are four database connection already created in any new Project:

Datasets DB - Pre-defined database connections

DB

Database

Path Location

Usage

Retentive

SQLite

<ProjectNameAndPath>.dbRetentive

Stores values for Retentive Tags.

RuntimeUsers

SQLite

 <ProjectNameAndPath>.dbRuntimeUsers

Stores dynamically created Users.

AlarmHistorian

SQLite

 <ProjectNameAndPath>.dbAlarmHistorian

Stores Alarm and AuditTrail records.

TagHistorian

SQLite

<ProjectNameAndPath>.dbTagHistorian

Stores Tag Historian and Annotations.

Any of them can be customized to use any type of database.

The selection of the best storage location depends on many factors, from internal company procedures to the volume of data and how the data will be used. Therefore, that decision belongs to each Project according to its requirements.

If you need to use another database for the pre-defined connections, execute the following steps:

  • Rename or delete the previous DB. This step is necessary, as the system does not allow two objects with the same name.
  • Create a new DB with the same name as the previous DB, with the required Database and connection strings.
  • That is all!

ConnectionString example for SQL Express 


  • Data Source: The server path and instance that will have the databases.
  • Initial Catalog: The name of the database that will be used.

Additional Settings for Tag Historian and AlarmHistorian 

  • Store and Forward: Enabling this option will cause the system to store the data locally if communication with the database is lost, and forward the data to synchronize once the connection is back again. For more configuration about Store and Forward, check the section Archiving Process at Tag Historian Module.

Project Test Databases

As explained in the Running Projects document, a Project configuration can be executed either in Test Mode or Startup Mode.

Test Mode is an execution environment specifically created to simplify developing and testing Projects.

When running the Project in Test Mode, there is a configuration, true by default, that overrides the connections of the pre-defined DBs, using test databases instead.

These are the database files used when running in Test Mode:

Database files used when running in Test Mode

DB

Database

Path Location

Usage

Retentive

SQLite

<ProjectNameAndPath>.dbRetentiveTest

Stores values for Retentive Tags.

RuntimeUsers

SQLite

 <ProjectNameAndPath>.dbRuntimeUsersTest

Stores dynamically created Users.

AlarmHistorian

SQLite

 <ProjectNameAndPath>.dbAlarmHistorianTest

Stores Alarm and AuditTrail records.

TagHistorian

SQLite

<ProjectNameAndPath>.dbTagHistorianTest

Stores Tag Historian and Annotations.

Use Case

Let us say you point the AlarmHistorian configuration to the company's production Microsoft SQL Server database.

When developing and testing the application, you do not want to publish Alarm events to that database yet.

With other platforms, you have to keep manually changing the connections from Test to Production, or embed workarounds to deal with it.

In our framework, that is an optional built-in feature: define the AlarmHistorian DB to point to the production database, even if it is not available yet or you do not want to use it yet, and run the Project in Test Mode, storing data in the local SQLite file <ProjectName>.dbAlarmHistorianTest.

Dataset Tables Configuration

To configure database tables:

  • Go to Edit → Datasets → Tables.
  • Enter the field values as needed.

Dataset Table Configuration Properties  

Field / Column

Description

Name

Enter a name for the table configuration. The system lets you know if the name is not valid.

DB

Select the database connection.

TableName

Select or type the table name in the Database you want to access.

WhereCondition

Specify the parameters that will filter the data using SQL syntax. E.g. "ColumnName = {tag.tagInt}"

Access

Select the access permissions for the table.

Mapping

Click "..." to select the tags that you want to populate with data from specific columns of the first row of the table.

MappingDateTime

Select the time reference (UTC or Local).

Description

Enter a description for the table configuration.

Dataset Queries Configuration

You can configure queries to perform more advanced functions with SQL statements to work with data from external databases.

To configure Dataset queries:

  • Go to Edit → Datasets → Queries.
  • Enter the field values as needed.

Dataset Query Configuration Properties  

Column

Description

Name

Enter a name for the query. The system lets you know if the name is not valid.

DB

Select the database configuration.

SqlStatement

Enter the query using SQL syntax.

Mapping

Click "..." to select the tags that you want to populate with data from specific columns returned by the query.

MappingDateTime

Select the time reference (UTC or Local).

Description

Enter a description for the query configuration.

Dataset Files Configuration

To configure dataset files:

  • Go to Edit → Datasets → Files.
  • Enter the field values as needed.

Dataset File Configuration Properties

Column

Description

Name

Enter a name for the file configuration. The system lets you know if the name is not valid.

FileName

Enter the full path to the file. The file path can have Tag values embedded using curly brackets syntax. E.g.: ExampleFile{{tag.Test}}.txt

When executing, the area in curly brackets is replaced by the value of the Tag.

FileType

Select the type of file.

Objects

Click "..." to select the tags that you want to populate with data from specific columns of the file.

Description

Enter a description for the file configuration.

XMLSchemaType

Represents the schema type of an XML file, which can be: TagList, an XML file that contains a tag list with tag names and tag values; or TagObject, an XML file that contains the entire tag tree and its children.

Using Dataset Objects

The created Dataset objects (Dataset Tables, Dataset Queries, and Dataset Files) can be used by the other Modules in any part of your Project.

Some examples:

DataGrid on Displays

When used in conjunction with graphical Displays, the execution of queries is automatic. For instance, when showing data on a DataGrid, the query executes automatically when the display opens.

More information about this at DataGrid component.

Saving the result of a query to a Tag or .NET variable 

Let us say your application has a Tag of type DataTable named Test and a Dataset Query named Query1; you can populate that Tag by executing:

Code Block
@Tag.Test = @Dataset.Query.Query1.SelectCommand()

Executing Stored Procedures 

Both Queries and Stored Procedures are defined in the Edit → Datasets → Queries table.

To execute a Stored Procedure, use the ExecuteCommand() method.

E.g.: Dataset.Queries.Query1.ExecuteCommand()

When passing parameters, you can use the syntax @null@ to pass a null as a parameter. See the example below:

Code Block
Exec zTestOutput @return_Value = {Tag.ReturnValue} RETURN_VALUE, @vcrPrefix = @null@, @intNextNumber = {Tag.NextNumber} OUTPUT, @vcrFullLicense = {Tag.NextLicense} OUTPUT 

The Dataset Namespace

The Dataset namespace exposes properties and methods of the .NET objects used by the Dataset Module execution.

For more information on namespaces and objects, go to Objects and Namespaces.

This section describes only some commonly used properties. For the full list of properties and methods, go to the Namespaces Reference.

Examples:

Dataset Module Properties examples

Property

Type

Description

Dataset.IsStarted

Boolean

Flag indicating whether the Dataset Module has started.

Dataset.OpenStatusMessage

String

OK or an error message when initiating the Module.

Dataset.Query.Query1.SelectCommand()

DataTable

Executes Query1, returning the values in a DataTable object.

Dataset Objects methods examples

Method

ReturnType

Description

Dataset.Query.Query1.SelectCommand()

DataTable

Executes Query1, returning the values in a DataTable object.

Dataset.Table.Table1.SelectCommand()

DataTable

Executes a Select command on the Dataset Table Table1.

Dataset.File.File1.SaveCommand()

Integer

Saves the file configured at Edit → Datasets → Files → File1.

In this section...

Page Tree
root@self
spacesV10

...