Much of the time spent on analytical laboratory instrumentation these days goes into system maintenance. Digital transformation could give us back time for creativity and our actual laboratory work, provided we shape it the right way.
The first thing we should do is integrate our laboratory equipment in a simple yet secure way, without the usual conflicts between equipment manufacturers and in-house IT departments. Software development and maintenance have become unmanageable and costly for instrument manufacturers because of the extremely complex IT landscape and poorly designed development environments and operating systems. As a consequence, they now prefer to deliver their own computers along with their own software and recommend that customers refrain from installing patches and virus scanners. This, however, calls in-house IT and quality managers into action, who want to ensure compliance with both the vendor's and the customer's IT and quality management guidelines. These two worlds are typically very difficult to reconcile.
Software patches and security updates are important, but these days they regularly put a running system, or its last validated state, at risk. Instrument computers that are not integrated into a network, or are insufficiently protected, tempt operators to use USB sticks. User roles often cannot be implemented properly or are too complex for day-to-day operations. The seemingly perpetual spiral of operating system updates and the replacement of defective hardware components pose significant challenges and tend to be costly in every respect. Very often the result is independently configured workarounds under the radar, which helps no one.
Then there is the handling of data. There are too many proprietary formats and disparate data standards, which almost always require manual data pre-treatment steps such as format and character conversions as well as copying activities. These cost operators a lot of time and make it difficult to observe good scientific and laboratory practices.
Things were better in the old days?
Fig. 1 A staff member uses the Beckman Microlab 620 MX infrared spectrophotometer. Picture taken between 1978 and 1987 [1].
So what do we need instead? I fondly remember my PhD days, when I was working with laboratory instruments that were just evolving from purely manual operation via knobs and buttons to software control, and which displayed their parameters clearly both in the software and on the instrument itself (Fig. 1). All of them were also accessible via an RS-232 interface, which was not very fast but good enough. It was often possible to obtain the complete interface assignment from the manufacturer at no extra cost. My first Pascal program had everything that was needed: the available settings for method development and the means to set up a measurement series. Even for a complete novice like me, programming only took about two weeks, despite the mockery of my colleagues that I would still be at it ten years later. At the beginning of each measurement, the computer sent the method out to the instrument, a UV/VIS/NIR spectrometer, and received the data back during the measurement along with a parting “handshake”. The program handled the measurement of samples, blank values and reference spectra. That was all I needed, and it allowed me to focus on my experimental work.
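Translated into today's tools, that pattern would still fit on a page. The minimal sketch below, in Python with the pyserial library, mirrors the old Pascal program: method out, data back, handshake at the end. The command strings, reply format, port name and baud rate are purely illustrative stand-ins for a vendor-documented RS-232 command set.

```python
# Minimal sketch of the old pattern: send the method, read the data back, say goodbye.
# All instrument commands and replies are hypothetical placeholders.
import serial

def measure_series(port="COM1", wavelengths=(400, 500, 600)):
    with serial.Serial(port, baudrate=9600, timeout=5) as link:
        # send the method parameters to the instrument
        for wl in wavelengths:
            link.write(f"SET WAVELENGTH {wl}\r\n".encode("ascii"))
            link.readline()                      # instrument acknowledges each setting
        link.write(b"START\r\n")
        # receive one reading per wavelength during the measurement
        readings = [float(link.readline().decode("ascii").strip())
                    for _ in wavelengths]
        link.write(b"END\r\n")                   # parting "handshake"
        return dict(zip(wavelengths, readings))
```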
I gained further digital experience with VARIAN NMR spectrometers. The choice of UNIX/Linux as the operating system had proved to be the right one: Firstly, it was far enough away from the in-house IT system. Secondly, it was secure. And thirdly, the computer hardware requirements were not difficult to fulfill. The script language “MAGICAL”, explained in its entirety on only 25 pages of the manual [2], was good enough to program very simple automation of measurement procedures, intermediate calculations, graphical displays and data handling with just a sequence of console and UNIX commands. These macros still run twenty years later.
One last and very good example, again involving an NIR spectrometer: the software was available in two versions, one with a graphical user interface (GUI) to develop methods and analyze data, the other as a runtime environment (Dynamic Link Library, DLL) with lower memory requirements and faster performance. The DLL could (and still can) be run on a virtual computer cluster, so there was no need for a laboratory computer. This made it possible to create greatly simplified user interfaces that demanded very little of the operator. These could be based on script languages, at that time usually Visual Basic or VB.net.
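The same division of labor can be sketched in a few lines: a thin operator-facing script that loads a validated method and returns a single result, with everything complex hidden inside the vendor runtime. The DLL name and the exported functions in the sketch below are hypothetical; a real runtime defines its own interface and data types.

```python
# Hypothetical sketch of a thin operator front end around a vendor runtime DLL.
# "nir_runtime.dll", load_method() and predict() are illustrative names only.
import ctypes

runtime = ctypes.WinDLL("nir_runtime.dll")        # runtime library, no GUI needed
runtime.load_method.argtypes = [ctypes.c_char_p]
runtime.load_method.restype = ctypes.c_int
runtime.predict.restype = ctypes.c_double

runtime.load_method(b"moisture.method")           # method developed in the full GUI
result = runtime.predict()                        # a single number for the operator
print(f"Moisture content: {result:.2f} %")
```

The operator never sees more than this one result; all of the complexity stays inside the runtime.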
All good things come in threes
Why is all this worth mentioning? Because these systems were intuitive: all instrument settings were available at all times, like the old buttons, and limited to the necessary level. And because they were modular: method out, data returned, data evaluation in a program of choice, open standards. And also because they were not tied to a specific setup: method development and method execution could run on any hardware and software, and can do so even virtually.
Keep it simple
Information technology churns out ever better and faster computer hardware and display technology, and with it more and more ways to guide the user. The once revolutionary menu navigation (“Windows”) nowadays tempts programmers to hide functions. I have seen device settings that can no longer be called up from the menu at all; they only appear on screen briefly when a new method is saved and can only be changed at that point. Even in a not particularly complex laboratory with only a few instruments, you can lose sight of the bigger picture, and with it, control.
Fig. 2 Node-RED displaying sensor-detected LED light intensity, tank levels and vibration. Node-RED is a programming tool that can be used to connect hardware devices, APIs and online services in new and interesting ways. Screenshot BAM: https://nodered.org
Digitization promises to improve human-machine interfaces. Well-programmed apps are based on intuitive and manageable graphical user interfaces, with no need to work through endless menus (see the visualization example with Node-RED, Fig. 2). You are offered only one or a few options at a time. This restriction of information alone makes them intuitive, even without any standardization, and it is the reason why so many of the apps we use in our private lives have been so successful. Standardization is a different option, useful in places where things simply have to remain complex.
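How little is needed to feed such an app can be shown with a short sketch: a Python script pushes one sensor reading to a Node-RED flow over HTTP. The host, port and endpoint path are assumptions that depend on how the http-in node of the flow is configured, and the payload fields are made up.

```python
# Sketch: a laboratory sensor script pushing one reading to a Node-RED flow.
# URL and payload fields are assumptions, not a fixed Node-RED interface.
import requests

reading = {"sensor": "tank_level_1", "value": 72.4, "unit": "%"}
response = requests.post("http://nodered.local:1880/lab/sensor",
                         json=reading, timeout=2)
response.raise_for_status()                      # fail loudly if the flow is down
```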
Plug it in – ready!
At this point, it is worth taking a look at the automation technology used in the process industry. As in the lab, the instrumentation has grown over time and is now quite similar in terms of the variety of field devices (sensors and actuators), interfaces and proprietary formats. In combination with simplified instrument communication (see below), the “Module Type Package” (MTP) standard is making headway [3]. It provides a vendor-neutral, functional description of a process module's automation, for example for integration into process control systems. The standard was developed to integrate complex, self-contained units (modules) into an automation environment, e.g. a process control system, by simple drag and drop. The MTP file contains not only the interface assignments, but also the parameterization and calibration of the module. Workflows or measurement methods can also be transmitted, together with the necessary user roles and hierarchies. During integration, all other necessary information can be transmitted automatically, such as circuit and operating diagrams, instructions for operation, maintenance and repair, as well as videos or certificates. A channel for data transfer is also opened. MTP encapsulates all settings that are too complex for the user, although this has made these functions more complex than they were for my UV/VIS/NIR spectrometer all those years ago.
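To make the idea tangible, the following sketch reads a deliberately simplified, made-up XML manifest for a module. The element names are illustrative only and do not follow the actual AutomationML-based MTP schema defined in VDI/VDE/NAMUR 2658; they merely convey what "integration by drag and drop" means in practice.

```python
# Illustrative only: a simplified, invented manifest, not the real MTP schema.
import xml.etree.ElementTree as ET

manifest = """
<module name="dosing-unit-01">
  <interface protocol="OPC UA" endpoint="opc.tcp://dosing01:4840"/>
  <parameter name="flow_rate" unit="mL/min" min="0.1" max="50.0"/>
  <document type="operating_manual" href="docs/manual.pdf"/>
</module>
"""

root = ET.fromstring(manifest)
print("Integrating module:", root.get("name"))
for p in root.findall("parameter"):
    print(f"  parameter {p.get('name')} [{p.get('unit')}], "
          f"range {p.get('min')} to {p.get('max')}")
for d in root.findall("document"):
    print(f"  attached document: {d.get('type')} -> {d.get('href')}")
```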
As one manifestation of Industry 4.0, digital transformation in the process industry is also yielding production concepts, in particular so-called modular production [4], from which the laboratory can benefit as well. Modularizing production plants increases their flexibility, availability and utilization, because modules for dosing, reaction, quality control or other purposes can be reconfigured and optimized for a different product within a short period of time. This allows different products to be brought to market much more quickly, fulfilling the ever-increasing customer demand for new specialty products. The modular concept reduces complexity by encapsulating the process engineering functions. Production complexity remains, or may even continue to increase, but it is confined to the inside of the module.
Can’t touch this
In business, virtualized software applications and operating systems have been in use for a long time, especially for servers. They are created by placing a virtualization layer between the hardware (e.g. office computers or computer clusters in the computing center) and its logical components, i.e. the operating system or an application, so that the logical components are no longer dependent on the hardware. The result is a virtual machine. The software no longer runs locally on a single computer but is distributed over computer networks, and even over the internet as so-called virtual appliances. It is no longer tied to a single physical machine that can crash or burn down [5].
The major advantages are that templates can be used and virtualized applications can be cloned quickly. This paves the way for new backup possibilities while adhering to compliance guidelines. Recovering guest operating systems becomes considerably easier and faster. After a software patch or firmware update, functional tests can be carried out first to verify that the system is still valid. Should it become necessary, a validated state of any virtual system can be recovered within minutes by restoring the last validated version. On today's lab instrument computers this would be impossible without completely reinstalling the system and bringing it back to the last validated state. Virtualization is therefore a very simple way to maintain compliance, even in a GMP environment. It is also possible to temporarily run a parallel test system for development, simulation, training or similar purposes, or to clone a validated system. Incidentally, old operating systems can continue to run well secured in the cloud, even the old Windows 3.1 of years gone by. This is investment protection at its best.
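As a rough illustration of snapshot-based rollback, the sketch below drives VirtualBox's VBoxManage command line from Python. The virtual machine name and snapshot label are placeholders, and other hypervisors offer equivalent snapshot mechanisms.

```python
# Sketch: snapshot-based rollback of a virtualized instrument PC using VirtualBox.
# VM name and snapshot labels are placeholders for illustration.
import subprocess

VM = "hplc-workstation"          # hypothetical virtualized instrument PC

def take_validated_snapshot(label):
    # freeze the current, freshly validated state
    subprocess.run(["VBoxManage", "snapshot", VM, "take", label], check=True)

def restore_validated_snapshot(label):
    # roll back to the last validated state within minutes
    subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=False)
    subprocess.run(["VBoxManage", "snapshot", VM, "restore", label], check=True)
    subprocess.run(["VBoxManage", "startvm", VM, "--type", "headless"], check=True)

# Typical use: take_validated_snapshot("validated-2020-04") after validation,
# then restore_validated_snapshot("validated-2020-04") if a patch breaks the system.
```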
Conclusion
Digital transformation in the laboratory can finally give us back the time for creative ideas. Creativity helps people find solutions for increasingly complex challenges. Simplifying the integration and interaction of laboratory equipment and infrastructure would be a good first step. Intuitive menu navigation makes things easier, even without any standards. In addition, a virtual software environment can make existing systems sustainable and safe to operate, and thus eliminate the need for local computer hardware. A change of mind is underway among instrument and software manufacturers, who increasingly support open standards again. Open standards do not lead to a loss of market share; they open up new markets and create space for further joint developments. Flagship projects with users can boost the trend towards a common and open sharing culture.
___________________________________________________________________________________________
Category: Laboratory Management | Smart Lab
Literature:
[1] Beckman Historical Collection, Box 82, Science History Institute, Philadelphia. https://digital.sciencehistory.org/works/df65v7966
[2] MAGICAL (MAGnetics Instrument Control and Analysis Language), VNMR User Programming, VNMR 6.1C Software, Varian, Palo Alto, CA, USA, 2000; successor software: https://openvnmrj.org/about/
[3] ZVEI - Zentralverband Elektrotechnik- und Elektronikindustrie e. V., white paper “Modulbasierte Produktion in den Prozessindustrien – Auswirkungen auf die Automation, Empfehlungen des AK Modulare Automation zur NE 148 der Namur”, February 2015, https://www.zvei.org/presse-medien/publikationen/white-paper-modulbasierte-produktion-in-der-prozessindustrie/
[4] VDI/VDE/NAMUR Guideline 2658, “Automatisierungstechnik von modularen Anlagen in der Prozessindustrie”, https://www.vdi.de/richtlinien?tx_vdiguidelines_guidelinelist%5Bfilter%5D%5BsearchTerm%5D=2658&cHash=210809745cf93c76cff6ec835851d5ed
[5] Maiwald, M., Voll integrierte und vernetzte Systeme und Prozesse - Perspektive: Smarte Sensorik, Aktorik und Kommunikation, ATP Magazin 60(10), 70–85, 2018, DOI: 10.17560/atp.v60i10.2376
Header image: iStock.com | SolStock
Date of publication:
01-Apr-2020