This page contains the Technical Article: Generations of Technology in Industrial Automation Software.
Generations of Technology in Industrial Automation Software
The 1980s brought the early adoption of the PC platform in automation systems and the first large projects using PCs to communicate with PLCs. Supervisory and control system technology has evolved over the years, creating several generations of software tools and automation products. A generation represents an evolutionary step and a new platform, encompassing a complete overhaul of the programming methods, user interfaces, and paradigms. This goes beyond mere incremental improvements made during the maintenance lifecycle of products; a new generation entails a renewal of the internal architecture.
While it is straightforward to identify these evolutionary steps through historical analysis, the picture is less clear when we are in the midst of such a transition, like the one occurring right now. Similar to the transition from the VAX/VMS platform to the PC platform in the 1980s, or the shift from DOS to Windows in the 1990s, we are currently moving towards a new generation of supervisory systems and industrial management software tools. Among the factors that led to the advent of the new generation of systems are:
- Increased speed of communication networks and greater processing capacity.
- Higher integration between the shop floor and the corporate environment.
- Distributed access and the consequent demand for increased security.
- New computing environments such as .NET Framework.
- New programming languages like C# and VB.NET.
- Cloud computing, with the service as the core product.
- Collaborative and remote engineering.
- New user interface paradigms and design concepts, influenced by Apple.
- New forms of interaction, such as tablets, smartphones, and 3D models.
- Use of open standards for storage, data exchange, and graphical files.
- Greater integration of control systems with supervisory systems.
- Smart-grid deployment in power management.
- More customized production, with shorter production cycles.
- Production chains distributed in different suppliers and locations.
Migrating through this transition is not achieved with minor improvements over existing platforms; it requires new platform architectures, new software kernels, new concepts to be embraced, and a new generation of technologies to implement them. In this article, we will explore these new concepts and technologies, the new user interfaces and communication, safety requirements, and changes in the corporate environment: factors that shaped the creation of a whole new generation of real-time distributed software for supervisory and control applications.
Intrinsically Safe Security
One feature that remains unchanged is operational stability as the primary requirement. The mechanisms related to increasing the guarantee of stability are among the main architectural changes made possible by new technologies.
In the field of instrumentation, security is not solely guaranteed by internal procedures or manufacturers' warranties, but also, and primarily, by the system architecture, using voltages and currents that are "intrinsically safe" in the environment where the instrumentation will operate.
The same concept applies to software. The previous generation of technology used C/C++, pointers, several modules sharing the same memory area, and direct access to hardware and operating system resources. These methods, while necessary at the time, are considered intrinsically unsafe.
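To make the contrast concrete, here is a minimal, hypothetical Java sketch (the method and names are illustrative, not from any specific product): in a managed runtime, an out-of-range access raises a catchable exception instead of silently corrupting adjacent memory, as an unchecked C/C++ pointer write could.

```java
public class SafeAccessDemo {
    // Returns the value at index, or a fallback when the index is invalid.
    static int readRegister(int[] registers, int index, int fallback) {
        try {
            return registers[index];          // bounds-checked by the runtime
        } catch (ArrayIndexOutOfBoundsException e) {
            return fallback;                  // the error is isolated; the process survives
        }
    }

    public static void main(String[] args) {
        int[] regs = {10, 20, 30};
        System.out.println(readRegister(regs, 1, -1));   // prints 20
        System.out.println(readRegister(regs, 99, -1));  // prints -1, no crash
    }
}
```

The same invalid access in unmanaged C code would be undefined behavior, which is the sense in which those older methods are "intrinsically unsafe."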
Improvements
The new generation of software uses computational environments, such as the .NET Framework or Java, where processes are natively isolated from each other and from the operating system, regardless of the programmer. This isolation allows for better utilization of computers with multiple processor cores and ensures greater operational stability, even in the face of driver and hardware errors or failures in individual system modules.
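As an illustration of this isolation, the following Java sketch (with hypothetical module names) runs each functional module as its own task on a thread pool; a failure in one module is caught and reported without stopping the others, and the pool naturally spreads work across the available cores.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ModuleIsolationDemo {
    // Runs every module concurrently; returns "ok" or a failure message per module.
    static List<String> runModules(List<Runnable> modules) {
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());
        List<Future<String>> futures = new ArrayList<>();
        for (Runnable module : modules) {
            futures.add(pool.submit(() -> {
                try {
                    module.run();
                    return "ok";
                } catch (RuntimeException e) {
                    // The exception stays confined to this module's task.
                    return "failed: " + e.getMessage();
                }
            }));
        }
        List<String> results = new ArrayList<>();
        for (Future<String> f : futures) {
            try {
                results.add(f.get());
            } catch (InterruptedException | ExecutionException e) {
                results.add("failed");
            }
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) {
        List<String> results = runModules(List.<Runnable>of(
                () -> { },                                               // healthy module
                () -> { throw new IllegalStateException("driver error"); } // faulty module
        ));
        System.out.println(results);
    }
}
```

The healthy module still completes even though its neighbor throws, which is the stability property the paragraph describes.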
Code Validation
Another change aimed at enhanced safety is the replacement of the scripting languages used in the software tools for project customization. The previous generation relied on proprietary scripts or interpreted languages, such as VBScript, VBA, or proprietary expression editors. The new generation uses more modern, compiled languages like C# and VB.NET, which offer object orientation and improved execution control.
With interpreted languages, users cannot perform complete code validation during the development stages; final verification occurs only when the code is executed. This means many issues are identified only during runtime, not during the engineering configuration. For example, errors such as uninitialized variables, type mismatches, and inconsistent parameters are only detected during execution.
Beyond increasing the efficiency of project development, the main reason this concept is so important is operational safety. A typical project may have hundreds to thousands of possible execution paths for the code, and testing scenarios cannot exhaustively cover all these paths. Therefore, the ability to detect potential errors during engineering, and to recover from and isolate errors during runtime, are crucial for safety and operational stability. These capabilities are only achievable by migrating from legacy interpreted scripts to modern compiled and managed languages.
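The runtime-only nature of these errors can be illustrated with a toy "interpreted" expression evaluator, a hypothetical Java sketch (the `evalSum` helper and tag names are invented for illustration). A reference to an undefined tag is detected only when the expression actually executes, which is exactly the class of error a compiled language reports at build time.

```java
import java.util.Map;

public class InterpretedTagDemo {
    // Evaluates an "a + b" style sum over a tag table; throws at runtime
    // if a referenced tag was never initialized.
    static double evalSum(String expression, Map<String, Double> tags) {
        double total = 0;
        for (String name : expression.split("\\+")) {
            Double value = tags.get(name.trim());
            if (value == null)
                throw new IllegalArgumentException("undefined tag: " + name.trim());
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        Map<String, Double> tags = Map.of("Level1", 4.0, "Level2", 6.0);
        System.out.println(evalSum("Level1 + Level2", tags)); // prints 10.0
        try {
            evalSum("Level1 + Level3", tags);   // Level3 was never defined
        } catch (IllegalArgumentException e) {
            // Only discovered when this execution path actually runs.
            System.out.println("runtime error: " + e.getMessage());
        }
    }
}
```

If the same reference were a typed variable in compiled C# or Java, the compiler would reject the project before it ever reached the plant floor.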
Complete Project Life Cycle
Another concept of this new generation of supervisory systems is the focus on the entire project cycle, not just on the software tool itself. This approach provides resources for all project phases, including initial engineering specifications, project configuration, testing, field installation, and maintenance.
Each project phase has its own requirements, and new software platforms must offer tools to support each of these phases effectively:
Technology selection: execution threads and module processes should be independent and isolated from each other; scripting should use a modern language with compile-time validation and a managed execution environment; web clients should be secure, with no need to install legacy ActiveX components, which are a network security flaw and require operating system privileges; and the platform should use new technologies and standards (e.g., WPF, WCF, XAML) alongside consolidated ones (e.g., SQL).
Project configuration: enabled for engineering collaboration (multi-user and multi-project) using local, remote, or cloud-hosted projects; tag definitions shared between control systems and supervisory systems in a common unified list; native change management and version tracking; enhanced validation during configuration.
Installation and pre-operation: native tools for testing, diagnostics, performance profiling, project verification, and publishing; concurrent and remote access to the project configuration; project configuration centralized in one database file, as opposed to previous-generation tools where the project was split into hundreds of separate files without ensuring integrity.
Operation: ability to run testing scenarios in parallel with production on the same server; logging, historian, and recipes in standard formats, such as SQL or XML, as opposed to closed systems; client-server architecture, native redundancy, an enhanced security system, and easy integration of video, geo-information, 3D models, and remote web or tablet users.
Maintenance and evolution: management of multiple product versions without requiring multiple installations; complete remote access; ability to hot-swap the project configuration without disconnecting operators or stopping the application; ability to run multiple concurrent projects on each server with multiple types of connected clients.
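As one concrete illustration of the "standard formats" point under Operation, a recipe could be exported to plain XML with the JDK's built-in StAX writer, keeping the data open and portable rather than locked in a proprietary file. This is a minimal sketch; the element and attribute names (`recipe`, `setpoint`, `tag`) are hypothetical.

```java
import java.io.StringWriter;
import java.util.Map;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamWriter;

public class RecipeXmlDemo {
    // Serializes a named recipe and its setpoints into a small XML document.
    static String toXml(String recipeName, Map<String, Double> setpoints) {
        try {
            StringWriter out = new StringWriter();
            XMLStreamWriter w = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
            w.writeStartDocument();
            w.writeStartElement("recipe");
            w.writeAttribute("name", recipeName);
            for (Map.Entry<String, Double> sp : setpoints.entrySet()) {
                w.writeStartElement("setpoint");
                w.writeAttribute("tag", sp.getKey());
                w.writeCharacters(sp.getValue().toString());
                w.writeEndElement();
            }
            w.writeEndElement();
            w.writeEndDocument();
            w.close();
            return out.toString();
        } catch (XMLStreamException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(toXml("Batch42", Map.of("Temperature", 85.5)));
    }
}
```

Because the output is plain XML, any third-party tool (or an SQL import routine) can consume it without the vendor's software.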
Technological Update
64-bit architecture, multi-touch, computers with multiple CPU cores, the .NET Framework, the C# and VB.NET languages, cloud computing, and graphical hardware acceleration: these are just some of the technologies that were not available when the internal architecture of previous generations of supervisory systems was developed.
Although some degree of improvement can be achieved through upgrades and conversions, fully leveraging these technologies typically requires a core design and architecture conceived from the start with a complete understanding of the available resources and functional requirements.
Being able to open two projects simultaneously, automatically track configuration changes, allow remote web access by multiple engineers to the same projects at the same time, and select displays by a preview of the image rather than by name are common features in current text editors, but are often missing in earlier generations of automation tools; some may even lack 64-bit support.
Many features are closely tied to the technology and system architecture; therefore, they are more effectively incorporated into a new design and a new generation of the product. Generally, adding advanced features on top of a core product built with outdated technology is costly, unreliable, only partially implemented, and sometimes not feasible at all. The following table lists some typical components of real-time and industrial automation systems and how they benefit from the adoption of the new technologies.
Typical Components of Real-Time and Industrial Automation Systems

Item | New Generation | Legacy Technologies
---|---|---
Internal Programming | C#/VB.NET/Java. Memory management is automatic and protected, with greater independence from hardware and the operating system. | C/C++. Extensive use of pointers, validation required for each device, direct access to hardware and the operating system.
Graphics Technology | WPF, XBAP, Silverlight, and XAML. Resolution-independent (vector-based) with hardware acceleration; higher performance and native capability for 3D and multi-touch. | GDI/GDI+. Pixel-oriented and resolution-dependent; distorted in conversion, limited use of graphics hardware, limitations on dynamic animations.
Web Client Technology | Native web browser, without elevation or extra components. | ActiveX component installations, manual upgrades, and security privileges required.
Vista Client Technology | WCF communication, standardized protocols, centralized installation, and hot swap on the server. | Communication via proprietary protocols, installation on each client machine, and no hot-swappable project versions.
Editing and Project Execution | Multi-user, with editing and execution of multiple concurrent projects. | Single-user and single-project.
Remote Engineering Access | Native, multi-project, and multi-user; supports VPN environments and cloud computing. | Only through VPN using external utilities; normally single-user.
Data Model and Tag Types | Data types reflect process models, such as an engine or valve and their properties. | Data types reflect the memory layout of field equipment, such as byte, word, signed, and unsigned.
Remote Access to Runtime | Smart-client technology with centralized installation on the server, web, or cloud, without installation of additional components; standardized and secure protocols, such as WCF. | Local installation required for clients and web clients; dedicated protocols, frequently requiring firewall ports to be opened.
Version Control and Configuration Traceability | Client-server architecture with centralized SQL databases and native traceability of project versions and settings. | Architecture spread across multiple files and proprietary configurations; traceability performed manually or through external programs.
Functional Modules and Scripts at Runtime | Multiple native processes and threads. Each module and script execution thread is natively protected from the others; the architecture is designed for effective use of multi-core processors; exception control and memory protection are performed by the operating environment. | A single multi-threaded process, or manually programmed logical and sequential execution in a unified environment. Isolation of modules, parallel execution of scripts, and exception protection, when they existed, were implemented through dedicated programming with a higher level of complexity.
Scripts | Compiled (VB.NET/C#). Execution of logic is 10 to 40 times faster than interpreted or proprietary scripts; more checks are performed during configuration; multi-threaded with exception handling, ensuring isolation of errors and increased performance; full access to all functions of the .NET Framework. | Interpreted (VBA/VBScript or proprietary logical and mathematical expressions). Because they are interpreted, many errors can be detected only when the system runs; most were single-threaded, so slower functions could compromise the system; sometimes limited access to Windows functions.
Native Platforms | 64-bit native, with support for 32-bit. Better hardware usage and more compatibility; the system was originally designed for 64-bit and uses components already present in the operating system. | 32-bit native, with support for 64-bit. 64-bit support is not possible or, where it exists, requires the installation of many additional components not native to the operating system.
Communication Drivers | Parallel execution with support for multiple connections to each node. Automatic statistics, diagnostics, redundancy, syntax validation of field addresses, integration of tag definitions with the PLC, multiport serial, multi-protocol support, and remote servers are regular functions. | Serial communication with network stations and only one TCP/IP connection to each node. Automatic statistics, diagnostics, redundancy, and the other features mentioned were only partially available on some systems and were not standard.
History | Archiving to SQL with search optimizations, compression, and management of daylight saving time and time zones. | Proprietary history formats, or archiving to SQL without optimizations; common problems with daylight saving time or access from different time zones.
Data Exchange | Web Services, SOAP, XML, and SQL queries. | DDE, text/CSV files, COM, and DCOM.
Alarms and Events | Distributed, with high flexibility. | Centralized, with standardized targeting.
Hot Swap of Projects | Enables online configuration. | Enables online configuration, but usually does not allow hot-swapping the version of a running project.
New User Interfaces, Design, and Cloud Computing
Two themes that deserve their own article, so we will only touch on them briefly here, are the new design concepts for user interfaces and cloud computing.
The concept of a tablet device existed many years ago, but it was only with the advent of the iPad that this technology was widely adopted; the major differentiator was the design. Beyond mere aesthetics, design encompasses usability and interaction with the system.
The new generation of automation software also brings advances in design, focusing not just on appearance but also on improving the usability of configuration tools and projects. Just as the transition from DOS to Windows transformed user interaction with programs, the current shift from Windows to the .NET Framework introduces new user interface paradigms. These advancements offer opportunities for more intuitive, productive, immersive, and aesthetically pleasing configuration and programming interfaces.
Regarding cloud computing, it is clear that it will not replace control systems in the field, but it introduces new features for both engineering and runtime. During execution, cloud computing enables safer and more easily programmable interfaces for distributing real-time data to clients outside the corporate firewall, such as web clients or smartphones. For project configuration and engineering, cloud computing facilitates collaboration, allowing distributed teams to work together more effectively.
Previously, engineers had to exchange project information via emails with pictures, FTP transfers of all the project files, or planned trips. With cloud computing resources for collaborative distributed engineering, teams in different locations can interact in real time, sharing the configuration, development, and verification of the same project, with secure access and traceable modifications.
New Software Tools
While it is standard procedure for corporate IT to perform regular updates of their software systems, many industrial systems still use the same software tools from previous decades. Among several factors, some automation software was too tied to other elements of the automation, so the cost-benefit of upgrading for marginal gains was not enough to justify the investment. This scenario has changed significantly with the transition to this new generation of industrial automation systems.
The new technologies enable much more effective connectivity to legacy systems. Thus, it is not necessary to replace the control level in order to evolve your operator interfaces and add more powerful management software. There are concrete and measurable gains, especially in security and flexibility, even when keeping the field control systems with the old components and evolving only the HMI or MES level. If your current system is still based on legacy technologies, the most appropriate time to start planning the adoption of new systems is now, while the previous systems are still operable, not when their limitations rise to the point of becoming your bottleneck in the reliability, flexibility, or evolution of the whole industrial process.
However, just upgrading to the latest version of the same software tool is not enough if that product was not created with the latest technologies. Current data migration techniques make it straightforward to move your project configuration and data from any legacy system to new tools built on more modern technologies.
Finally, another important reason for this transition to new-generation software tools is that the gains in reliability, security, flexibility, and functionality are not marginal percentages but multiplicative factors. The adoption of the new systems has a clear ROI, ensuring longevity and security for the facilities: the real-time graphical application managing a process is the front-end and visible link to the very large investment in the industrial assets being monitored. Leveraging the full advantages of new software enables getting more from the whole system, which easily justifies adopting the most current technologies for that front-end.