Multi-Core Virtualization Changes Process Controllers

“An automation system at the level of the process controller should have three things,” says Casey Weltzin, product manager for LabVIEW Real‑Time at automation and test vendor National Instruments Corp. (www.ni.com), in Austin, Texas.

“First, it should be able to interpret sensor data and direct actuator behavior in real time. Second, it should have an HMI (human‑machine interface) so that the operator can monitor and adjust the production process. And third, it should be able to network the local process control activity to a higher-domain system such as a DCS (Distributed Control System) or a SCADA (Supervisory Control And Data Acquisition) system.
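As a rough illustration of the first requirement, the C sketch below runs a fixed-period loop that reads a sensor, applies a simple control law and drives an actuator. It is a minimal sketch, not NI code: read_sensor() and write_actuator() are hypothetical stubs standing in for whatever I/O drivers a given controller provides, and the absolute-deadline sleep keeps the loop period from drifting.

/*
 * Minimal sketch of a periodic real-time control loop: read a sensor,
 * apply a control law, drive an actuator, once per millisecond.
 * read_sensor() and write_actuator() are hypothetical stubs standing in
 * for the controller's real I/O drivers.
 */
#define _POSIX_C_SOURCE 200112L
#include <time.h>

#define PERIOD_NS 1000000L  /* 1 ms control period */

static double read_sensor(void)        { return 0.0; }  /* stub */
static void   write_actuator(double u) { (void)u; }     /* stub */

static double control_law(double y)
{
    /* Placeholder proportional controller around a fixed setpoint. */
    const double setpoint = 1.0, kp = 0.5;
    return kp * (setpoint - y);
}

int main(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (;;) {
        write_actuator(control_law(read_sensor()));

        /* Sleep to the next absolute deadline so the period does not drift. */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
}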

“To physically realize a system for an application demanding high performance,” says Weltzin, “current design practice would probably call for three controllers: one for the sensor, one for the actuator and one to handle the HMI and network traffic. The first two controllers would run real‑time operating systems (RTOSs). The third controller, for the HMI and networking, would run a general purpose operating system (GPOS), usually Windows or Linux.

“But now, by taking advantage of virtualization on a multi‑core processor,” says Weltzin, “three controllers are no longer necessary. Both of the RTOSs, as well as the GPOS, can run on one controller. The operating systems can run on separate cores, which means lower hardware cost, smaller physical footprint and simpler manufacturing. The RTOSs and GPOS are effectively isolated, and all deliver the desired performance.” Furthermore, if properly implemented, the new multi‑core system with virtualization will be able to run proven legacy control software with few, if any, changes. This is an important consideration when upgrading certified processes.
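The virtualization itself is configured in the hypervisor rather than written by hand, but the core-dedication idea Weltzin describes can be illustrated one level up with ordinary Linux calls. The hypothetical sketch below pins a task to a single core and requests a real-time scheduling class; a bare-metal VMM performs the analogous partitioning underneath the operating systems, assigning whole cores to each guest.

/*
 * Illustration of the core-dedication idea, on plain Linux rather than a
 * hypervisor: pin the calling process to core 1 and request the
 * SCHED_FIFO real-time class. A bare-metal VMM does the analogous
 * partitioning one level lower, dedicating whole cores to each guest OS.
 */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(1, &set);  /* dedicate core 1 to this task */

    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }

    struct sched_param sp = { .sched_priority = 80 };
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
        perror("sched_setscheduler (needs root or CAP_SYS_NICE)");
        return 1;
    }

    puts("Pinned to core 1 with SCHED_FIFO priority 80");
    /* ... time-critical work would run here ... */
    return 0;
}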

All virtualization is abstraction that uses software to create an isolated, functioning duplicate of a computer system component. Practically any system component can be—and has been—virtualized: disk drives, servers, operating systems and networks. “The potential benefits of virtualization are well established,” says Weltzin. “But the real question for the engineer is, ‘How do we implement this thing?’ ”

“At the process-control level of an automation system, virtualization means running multiple operating systems on a single computer, with at least one of those operating systems being an RTOS,” continues Weltzin. “The single computer on which everything runs is the ‘host.’ The software that provides the virtualization functionality is a VMM (virtual machine monitor, or virtual machine manager). Each of the operating systems runs as part of a ‘virtual machine’ (VM) on the host, and each VM runs application software, just as it normally would on a separate computer.”

“The VMM serves as a translating layer between the host and the virtual machines,” Weltzin explains. “In broad terms, it can be implemented in two ways. One way is that a ‘hosted VMM’ sits on top of the host’s operating system. The VM accesses hardware through the VMM, and the VMM makes calls to the host operating system, which, in turn, accesses the hardware. In a second way, a ‘bare metal VMM’ sits on top of the processor, with no intervening operating system. The VM still accesses hardware through the VMM, but the VMM then proceeds directly to the hardware. To run an RTOS effectively, virtualization must be accomplished using a bare metal VMM.”
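Weltzin’s point that an RTOS needs a bare-metal VMM comes down to timing determinism. One crude way an engineer might check a given setup (a sketch, not an NI-prescribed procedure) is to run a periodic loop inside the guest and record the worst-case deviation from the intended wake-up time:

/*
 * Rough jitter check: wake every 1 ms for 10,000 iterations and report
 * the worst observed deviation from the intended wake-up time. Running
 * this inside a guest OS gives a crude indication of whether the
 * virtualization layer preserves timing determinism.
 */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

#define PERIOD_NS  1000000L
#define ITERATIONS 10000

static long diff_ns(struct timespec a, struct timespec b)
{
    return (a.tv_sec - b.tv_sec) * 1000000000L + (a.tv_nsec - b.tv_nsec);
}

int main(void)
{
    struct timespec next, now;
    long worst = 0;

    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int i = 0; i < ITERATIONS; i++) {
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

        clock_gettime(CLOCK_MONOTONIC, &now);
        long late = diff_ns(now, next);  /* how far past the deadline we woke */
        if (late > worst)
            worst = late;
    }
    printf("worst-case wake-up latency: %ld ns\n", worst);
    return 0;
}

On a well-configured bare-metal VMM with dedicated cores, the worst-case figure should stay close to what the RTOS achieves on native hardware; large or unbounded spikes suggest the GPOS workload is interfering with the real-time guests.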

Just starting

“Virtualization is not new,” says Weltzin. “It’s a developed technology in many applications. But the use of virtualization in process control is really just beginning. As multi‑core processors become standard—not only duals and quads, but processors with 16 and 32 cores—people will have to make good use of those cores to stay competitive. Virtualization is a great way to do that.”

Marty Weil, martyweil@charter.net, is an Automation World Contributing Writer.

National Instruments Corp.
www.ni.com

