This page contains the technical article Generations of Technology in Industrial Automation Software.
Industrial Automation Software
Supervisory and control system technology has evolved over the years, creating several generations of software tools and automation products. A generation represents an evolutionary step and a new platform, encompassing a complete overhaul of programming methods, user interfaces, and paradigms. This goes beyond mere incremental improvements made during the maintenance lifecycle of products; a new generation entails a renewal of the internal architecture.
While it is straightforward to identify these evolutionary steps through historical analysis, it is less clear when we are in the midst of such a transition, like the one occurring right now. Similar to the transition from the VAX/VMS platform to the PC platform in the 1980s, or the shift from DOS to Windows in the 1990s, we are currently moving towards a new generation of supervisory systems and industrial management software tools.
Intrinsically Safe Security
One feature that remains unchanged is operational stability as the primary requirement. The mechanisms that increase the guarantee of stability are among the main architectural changes made possible by new technologies.
In the field of instrumentation, security is not solely guaranteed by internal procedures or manufacturers' warranties, but also, and primarily, by the system architecture, using voltages and currents that are "intrinsically safe" in the environment where the instrumentation will operate.
The same concept applies to software. The previous generation of technology used C/C++, pointers, several modules sharing the same memory area, and direct access to hardware and operating system resources. While these methods were necessary given the computers and languages available at the time, they are now considered intrinsically unsafe.
Improvements
The new generation of software uses computational environments, such as the .NET Framework or Java, where processes are natively isolated from each other and from the operating system, regardless of the programmer. This isolation allows for better utilization of computers with multiple processor cores and ensures greater operational stability, even in the face of driver and hardware errors or failures in individual modules of the system.
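As a rough illustration of this isolation model, here is a minimal sketch in C#; the module names (IModule, DriverModule, AlarmModule) are hypothetical and not any vendor's API. Each functional module runs on its own task inside the managed runtime, so an exception thrown by one module, such as a failing communication driver, is contained and reported without bringing down the others, while the scheduler spreads the work across the available processor cores.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Minimal sketch: each module runs in its own task on the managed runtime.
// The names (IModule, DriverModule, AlarmModule) are illustrative only.
public interface IModule
{
    string Name { get; }
    void Run();
}

public class DriverModule : IModule
{
    public string Name => "CommunicationDriver";
    public void Run() => throw new InvalidOperationException("Simulated driver failure");
}

public class AlarmModule : IModule
{
    public string Name => "AlarmServer";
    public void Run() => Console.WriteLine("Alarm server running normally.");
}

public static class ModuleHost
{
    public static async Task RunAllAsync(IEnumerable<IModule> modules)
    {
        var tasks = new List<Task>();
        foreach (var module in modules)
        {
            // Each module gets its own task; the scheduler spreads them
            // across the available processor cores.
            tasks.Add(Task.Run(() =>
            {
                try
                {
                    module.Run();
                }
                catch (Exception ex)
                {
                    // A failure in one module is contained and reported,
                    // instead of crashing the whole supervisory system.
                    Console.WriteLine($"Module '{module.Name}' failed: {ex.Message}");
                }
            }));
        }
        await Task.WhenAll(tasks);
    }

    public static void Main() =>
        RunAllAsync(new IModule[] { new DriverModule(), new AlarmModule() }).Wait();
}
```

Running this sketch prints the alarm module's normal output and a contained failure message for the driver module, rather than an unhandled crash of the whole host.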
Code Validation
Another change aimed at enhancing safety is the replacement of the scripting languages used in the software tools for project customization. The previous generation relied on proprietary scripts or interpreted languages, such as VBScript, VBA, or proprietary expression editors. The new generation uses modern, compiled languages such as C# and VB.NET, which offer object orientation and improved execution control.
With interpreted languages, users cannot perform complete code validation during the development stages; final verification occurs only when the code is executed. This means many issues are identified only during runtime, not during the engineering configuration.
For example, errors such as uninitialized variables, type mismatches, and inconsistent parameters are detected in interpreted languages only during execution. Beyond increasing the efficiency of project development, the main reason this concept is so important is to ensure operational safety.
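To make this concrete, the short C# sketch below, built only for illustration, compiles and runs as shown; the commented-out lines are examples of exactly the kinds of mistakes listed above, which the compiler would reject at build time, long before the project ever runs, whereas an interpreted script would only reveal them during execution.

```csharp
using System;

// Minimal sketch of compile-time validation in a compiled script language (C#).
// The commented-out lines would be rejected at build time.
public static class ValidationDemo
{
    public static void Main()
    {
        int setpoint;                    // declared but not yet assigned

        // Console.WriteLine(setpoint);  // error CS0165: use of unassigned local variable
        // string level = 42;            // error CS0029: cannot implicitly convert 'int' to 'string'
        // ApplySetpoint("high");        // error CS1503: argument type mismatch

        setpoint = 75;                   // assign before use: the compiler is now satisfied
        ApplySetpoint(setpoint);
    }

    private static void ApplySetpoint(int value) =>
        Console.WriteLine($"Setpoint applied: {value}");
}
```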
A typical project may have hundreds to thousands of possible execution paths for its code, and testing scenarios cannot exhaustively cover all of them. Therefore, the ability to detect potential errors during engineering, and the ability to recover from and isolate errors during runtime, are crucial for safety and operational stability. These capabilities are only achievable by migrating from legacy interpreted scripts to modern compiled and managed languages.
Complete Project Life Cycle
Another concept of this new generation of supervisory systems is the focus on the entire project cycle, not just on the software tool itself. This approach provides resources for all project phases, including initial engineering specifications, project configuration, testing, field installation, and maintenance.
Each project phase has its own requirements, and new software platforms must offer tools to support each of these phases effectively.
Technological Update
64-bit architecture, graphical hardware acceleration, multi-touch, computers with multiple CPU cores, the .NET Framework, the C# and VB.NET languages, and cloud computing: these are just some of the technologies that were not available when the internal architecture of previous generations of supervisory systems was developed.
Although some degree of improvement can be achieved through upgrades and conversions, fully leveraging these technologies typically requires a core design and architecture conceived from the start with a complete understanding of the available resources and functional requirements.
Being able to open two projects simultaneously, automatically track configuration changes, allow remote web access by multiple engineers to the same project at the same time, and select displays by an image preview rather than by name are common features in current editing tools, but they are often missing in earlier generations of automation software; some may even lack 64-bit support.
Many features are closely tied to the technology and system architecture; therefore, they are incorporated more effectively into a new design and a new generation of the product. In general, adding advanced features on top of a core product built with outdated technology is too costly, unreliable, only partially implemented, and sometimes not feasible at all.
Typical Components of Real-Time and Industrial Automation Systems
Item | NEW GENERATION | LEGACY TECHNOLOGIES |
---|---|---|
Internal Programming | C#/VB.NET/Java: memory management is automatic and protected, with greater hardware independence and operating system protection. | C++/C: extensive use of pointers, validation required for each device, and direct access to hardware and the operating system. |
Graphics Technology | WPF, XBAP, Silverlight, and XAML: resolution-independent (vector-based) and uses hardware acceleration. Higher performance, with native support for 3D and multi-touch. | GDI/GDI+: pixel-oriented, depends on the monitor resolution, distorts when converted, makes less use of graphics hardware, and limits dynamic animations. |
Web Client Technology | Native web browser client, without elevation or extra client-side components. | Requires client-side installations and upgrades, with additional security concerns. |
Vista Client Technology | WCF communication, standardized protocols, centralized installation, and hot swap on the server. | Communication via proprietary protocols, installation on each client machine, and no hot-swappable versions of a project. |
Editing and project execution | Multi-user, with editing and execution of multiple concurrent projects. | Single-user and single-project. |
Remote access engineering | Native, multi-project, and multi-user; supports VPN environments and cloud computing. | Available only over VPN through external utilities, and normally single-user. |
Data model and Tag types | Data types reflect process models, such as a motor or valve and their properties (see the sketch after this table). | Data types reflect the memory layout of field equipment, such as byte, word, signed, and unsigned. |
Remote access to Runtime | Smart-client technology with centralized installation on the server, the web, or the cloud, without installing additional components. Standardized and secure protocols, such as WCF. | Local installation is required for desktop and web clients. Dedicated protocols, frequently requiring firewall ports to be opened. |
Traceability of version control and configuration | Client-server architecture with centralized SQL databases and native traceability of project versions and settings. | Architecture based on multiple files and proprietary configurations. Traceability is performed manually or through external programs. |
Functional modules and scripts at runtime | Native multiple processes and threads. Each .NET module and script execution thread is natively protected from the others. Architecture designed for effective use of multi-core processors. Exception control and memory protection are performed by the operating environment. | Single multi-threaded process, or manually programmed logical and sequential execution in a unified environment. Isolation of modules, parallel execution of scripts, and exception protection, when they existed, were implemented through dedicated programming with a higher level of complexity. |
Scripts | Compiled (VB.NET/C#). Logic executes 10 to 40 times faster than interpreted or proprietary scripts. Performs more checks during configuration, is multi-threaded, and handles exceptions, ensuring isolation of errors and increased performance. Full access to all functions in the .NET Framework. | Interpreted (VBA/VBScript or proprietary logical and mathematical expressions). Because they are interpreted, many errors can only be detected when the system runs. Most were single-threaded, meaning slower functions or possible system compromise from errors. Sometimes with limited access to Windows functions. |
Native platforms | 64-bit native, with support for 32-bit. Better use of hardware and more compatibility. The system was originally designed for 64-bit and to use components already present in the operating system. | 32-bit native. 64-bit support is either not possible or, where it exists, requires the installation of many additional components not native to the operating system. |
Communication Drivers | Parallel execution with support for multiple connections per node. Automatic statistics, diagnostics, redundancy, field address syntax validation, integration of tag definitions with the PLC, multi-port serial and multi-protocol support, remote servers, and pickup are regular functions. | Serial communication of network stations and only one TCP/IP connection per node. Automatic statistics, diagnostics, redundancy, and the other features mentioned were only partially available on some systems and were not yet standard. |
History | Archiving to SQL with search optimizations, compression, and management of daylight saving time and time zones. | Proprietary history formats, or archiving to SQL without optimizations. Common problems with daylight saving time or access from different time zones. |
Data exchange | Web Services, SOAP, XML, and SQL queries. | DDE, text files/CSV, COM, and DCOM. |
Alarms and Events | Distributed, with high flexibility. | Centralized, standardized targeting. |
Hot swap projects | Enables online configuration. | Enables online configuration, but usually does not allow hot-swapping the version of a running project. |
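To make the "Data model and Tag types" row more concrete, here is a hedged sketch with hypothetical class and member names (Valve, Open, PositionPercent), not any specific product's tag model. It shows how a new-generation data model can mirror the process itself, a valve with its own properties, instead of exposing raw device registers such as words and bytes.

```csharp
using System;

// Hypothetical process-oriented tag model: the application works with a Valve
// object, not with raw register addresses such as "DB10.DBW4".
public class Valve
{
    public string Name { get; }
    public bool IsOpen { get; set; }
    public double PositionPercent { get; set; }   // 0..100 %
    public DateTime LastCommandTime { get; private set; }

    public Valve(string name) => Name = name;

    public void Open()
    {
        IsOpen = true;
        PositionPercent = 100.0;
        LastCommandTime = DateTime.Now;
    }

    public void Close()
    {
        IsOpen = false;
        PositionPercent = 0.0;
        LastCommandTime = DateTime.Now;
    }
}

public static class TagModelDemo
{
    public static void Main()
    {
        // Legacy style would look like: WriteWord(db: 10, offset: 4, value: 1);
        // The process-model style reads as plain engineering language instead.
        var inletValve = new Valve("FV-101");
        inletValve.Open();
        Console.WriteLine(
            $"{inletValve.Name} open={inletValve.IsOpen}, position={inletValve.PositionPercent}%");
    }
}
```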
New Design and Cloud Computing
Two themes that deserve their own article, so we will only touch on them briefly here, are the new design concepts for user interfaces and cloud computing.
The concept of a tablet device existed many years ago, but it was only with the advent of the iPad that the technology was widely adopted. The major differentiator was the design. Beyond mere aesthetics, design encompasses usability and the way users interact with the system.
The new generation of automation software also brings an evolution in design, focusing not just on appearance but also on improving the usability of the configuration tools and projects. Just as the transition from DOS to Windows transformed user interaction with programs, the current shift from Windows to the .NET Framework introduces new user interface paradigms. These advancements offer opportunities for configuration and programming interfaces that are more intuitive, productive, immersive, better validated, and more aesthetically pleasing.
As for cloud computing, it is clear that it will not replace control systems in the field, but it introduces new features for both runtime and engineering. During execution, cloud computing enables safer and more easily programmable interfaces for distributing real-time data to clients outside the corporate firewall, such as web clients or smartphones. For project configuration and engineering, cloud computing facilitates collaboration, allowing distributed teams to work together more effectively.
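As a rough sketch of the kind of standardized interface referred to above, the WCF contract below uses hypothetical names (IRealTimeDataService, ReadTagValue) rather than any specific product's API, and it assumes a plain HTTP binding for brevity; a production deployment would use secured bindings and proper hosting configuration. It simply exposes read-only real-time values over a well-defined service boundary that remote or web clients can consume.

```csharp
using System;
using System.ServiceModel;

// Hedged sketch of a read-only real-time data service exposed over WCF.
// Interface and member names are illustrative; binding and security
// configuration (e.g., HTTPS) belong in the host configuration.
[ServiceContract]
public interface IRealTimeDataService
{
    [OperationContract]
    double ReadTagValue(string tagName);
}

public class RealTimeDataService : IRealTimeDataService
{
    public double ReadTagValue(string tagName)
    {
        // A real system would query the runtime database;
        // here we just return a simulated value.
        return new Random(tagName.GetHashCode()).NextDouble() * 100.0;
    }
}

public static class ServiceHostDemo
{
    public static void Main()
    {
        // Self-host the service for demonstration purposes only.
        using (var host = new ServiceHost(typeof(RealTimeDataService),
                                          new Uri("http://localhost:8080/rtdata")))
        {
            host.AddServiceEndpoint(typeof(IRealTimeDataService),
                                    new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Real-time data service listening. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```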
Previously, for engineers to exchange project information, the options were to send emails with pictures, transfer all the project files over FTP, or plan a trip. With cloud computing resources for collaborative distributed engineering, teams in different locations can interact in real time, sharing the configuration, development, and verification of the same project, with secure access and traceable modifications.
New Software Tools
While it is standard procedure in many companies for corporate IT to perform regular updates of their software systems, many industrial systems are neglected and still use the same software tools from previous decades. Among several factors, some automation software was too tied to other automation elements, so the cost-benefit of upgrading for marginal gains was not enough to justify the investment. This scenario has also changed significantly with the transition to this new generation of industrial automation systems.
The new technologies enable much more effective connectivity to legacy systems, so it is not necessary to replace the control level in order to evolve your operator interfaces and add more powerful management software. There are concrete and measurable gains, especially in security and flexibility, even when keeping the field control systems with the old components and evolving only the HMI or MES level. If your current system is based on legacy technologies, the most appropriate time to start planning the adoption of new systems is now, while the previous systems are still operable, not when the limitations of their old technologies rise to the point of becoming the bottleneck for the reliability, flexibility, or evolution of the whole industrial process.
However, upgrading to the latest version of the same software tool is not enough if that product was not created with the latest technologies. Current data migration techniques make it straightforward to move your project configuration and your data from a legacy system to a new one built on more up-to-date technologies.
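As a simple illustration of one such migration step, the sketch below converts a legacy CSV tag export into a structured file that a newer tool could import. Both the CSV layout ("Name;Address;Type") and the XML target format are assumptions made for this example, not any specific vendor's formats.

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical migration helper: converts a legacy CSV tag list
// ("Name;Address;Type") into a simple XML import file.
// Both file layouts are assumptions made for this illustration.
public static class TagMigration
{
    public static void Main()
    {
        var legacyLines = new[]
        {
            "TT101;DB10.DBW0;word",
            "FV101;DB10.DBX2.0;bool"
        };
        // In practice: var legacyLines = File.ReadAllLines("legacy_tags.csv");

        var xmlEntries = legacyLines
            .Select(line => line.Split(';'))
            .Select(parts =>
                $"  <Tag name=\"{parts[0]}\" address=\"{parts[1]}\" type=\"{parts[2]}\" />");

        var output = "<Tags>\n" + string.Join("\n", xmlEntries) + "\n</Tags>";
        File.WriteAllText("migrated_tags.xml", output);
        Console.WriteLine(output);
    }
}
```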
Finally, another important reason for this transition to new-generation software tools is that the gains in reliability, security, flexibility, and functionality are not marginal percentages, but multiplicative factors. The adoption of the new systems has a clear ROI, ensuring longevity and security for the facilities: the real-time graphical application managing a process is the front end and the visible link to the large investment in the monitored industrial assets. Therefore, leveraging the full advantages of new software enables getting more from the whole system, which easily justifies the adoption of the most current technologies for that front end.