A program can operate in real-time mode if the delays introduced by signal transmission and processing, together with the internal delays of the computer system, are shorter than the maximal time uncertainty allowed in the coordinated performance of probing, data acquisition, processing and control.
Operating systems on personal computers pay little attention to time uncertainties smaller than a few milliseconds, because their main concern is to grant all simultaneously running programs enough time on a single processor to do their tasks, so that different programs appear to run simultaneously. However, even short delays can cause problems when a program has to interact with an instrument at definite moments, especially when the schedule is not known in advance.
In the old days, when processors were slow, the only way to reconcile the demands of virtual instruments and of the operating system was to give each of them a processor under exclusive control. The most popular system of virtual instruments, LabVIEW from National Instruments, uses specialised, expensive hardware with dedicated processors for the implementation of real-time tasks. Implementing a real-time task on a dedicated processor simplifies the development of virtual instruments when the task does not place critical demands on processor throughput. However, this specialisation comes at a high cost.
The problem is not only the excessively high cost of interfaces with dedicated processors. There is also a fundamental problem, related to the efficiency of hardware management when implementing tasks that require fast processing.
Because of the growing demand for fast CPUs in personal computers, CPUs keep becoming faster and cheaper. The production of interface boards based on dedicated processors cannot compete with the CPUs in this race. About five years ago, when we designed real-time virtual instruments for dynamic impedance spectroscopies, personal computers were upgrading from Pentium II to Pentium III, while the fastest ADC/DAC boards for LabVIEW still used processors of the 486 series. The CPU remains the fastest processor in a personal computer, even though the dedicated processors in ADC/DAC boards also make progress. The high throughput of the CPU justifies using a single processor both for instrument control and for interaction with the user. Low-level synchronisation of data acquisition, processing and control enables immediate processing and real-time representation of the analysed data in a chemical experiment, using just the native fast processor of a personal computer.
Direct hardware control by the CPU gives a double benefit to a computer-controlled chemical experiment: the real-time instrument communicates with an object through cheap ADC/DAC converters instead of expensive dedicated real-time interfaces, and the CPU latencies become shorter. Direct hardware control may seem a return to the old single-task operating environment, but in fact the hard real-time virtual instrument takes priority over the operating system in CPU usage only during the time-critical stages. Other tasks can still run in parallel with the hard real-time routine, provided they do not attempt to control the schedule of CPU operation.
See our articles for illustrations of the benefits of this approach in electrochemical computer-controlled systems that provide extensive real-time data processing, with microsecond accuracy of data-acquisition, processing and control synchronisation, in a common multitasking environment.