(413a) Open Source Controls, Cloud Computing and Paradigm Changes in Laboratory-Scale Reactor Control

Authors 

Hartman, R., New York University
Traditionally, controlling and collecting data from a chemical reactor has been a difficult task: accurately capturing non-steady-state behavior, very fast phenomena, and large numbers of data points from long runs is hard to do with conventional tools. The problem has only become more prevalent as microfluidic systems have gained popularity as research tools. Owing to their micro-to-milliliter internal volumes, microreactor systems allow different parameters to be tested far more quickly and efficiently than traditional reactors, while also offering benefits in mixing, heat transfer, and the speed at which changes to the process can be made. At the same time, innovations in computer science are opening new doors and creating new paradigms in process control for chemical reaction engineers at both the macro and micro scale.

The first major recent development has been the creation and validation of low-cost sensors. Until very recently, high-precision thermocouples, pressure transducers, level sensors, accelerometers, thermal cameras, and other common components had to be purchased from specialty suppliers at considerable cost. Now, due in large part to the Maker movement and the economies of scale of silicon manufacturing, numerous vendors offer low-cost, high-quality sensors and reader boards. It has become possible to monitor far more parameters on a system for the same initial investment in sensors.
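To make the cost point concrete, consider a minimal sketch of reading one such sensor: a K-type thermocouple on an Adafruit MAX31855 breakout board (a few tens of dollars) wired to a Raspberry Pi over SPI, read with Adafruit's CircuitPython driver. The chip-select pin and the one-second sampling rate below are assumptions for illustration, not part of this abstract:

```python
# Minimal sketch: read a K-type thermocouple through a low-cost
# MAX31855 breakout board over SPI on a Raspberry Pi.
# Assumes the adafruit-circuitpython-max31855 package is installed
# and the chip-select line is wired to GPIO D5 (an assumed wiring).
import time

import board
import busio
import digitalio
import adafruit_max31855

spi = busio.SPI(board.SCK, MISO=board.MISO)  # hardware SPI bus (read-only device)
cs = digitalio.DigitalInOut(board.D5)        # chip-select pin (assumed)
sensor = adafruit_max31855.MAX31855(spi, cs)

while True:
    # Print a timestamped temperature reading once per second.
    print(f"{time.time():.1f}, {sensor.temperature:.2f} C")
    time.sleep(1.0)
```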

The second major development leading to a paradigm shift in process control is System-on-a-Chip (SoC) technology. These chips have enabled a whole range of low-cost single-board computers that are specialized for different tasks and often perform them faster than dedicated hardware from a decade ago.

On the lower end of this trend is the Arduino board, a roughly $20 microcontroller that can read analog and digital sensor values, communicate over standard protocols such as SPI, I2C, RS-232, and Ethernet, and even interface with more specialized conventions like CAN bus. Even the simple Arduino offers a lot to reaction engineers: in the course of a few days, an engineer can set up a robust control and data-collection system that previously required thousands of dollars' worth of hardware and software, plus a software engineer to implement effectively.

In the middle of this range are devices like the Raspberry Pi, which runs Linux with roughly the processing power of a high-end desktop from the late 1990s. The Pi and similar devices let engineers perform more intensive computations and interface an experiment with a wider range of external devices and services. They support common programming languages such as Python, C++, and Java and offer numerous I/O options (USB, Ethernet, and serial, along with all the aforementioned protocols), making it simple to interface with external devices such as pumps, mass flow meters, and even equipment like GCs and spectrometers. This again presents unique opportunities to the reaction engineer: we can now build autonomous systems with a small footprint and add computational abilities to systems that would have been impractical to automate in the past.

On the very upper end of the range are devices like the Nvidia Jetson TX2. While still reasonably priced compared to traditional desktop and laptop computers, these devices offer processing capacity that would have been absurd to consider a decade ago: a single board can deliver on the order of a TFLOP/s (trillions of floating-point operations per second), making it practical to run full convolutional neural networks on the bench or in the field cheaply and easily. Overall, these developments in computing technology have made it much easier to implement robust control and data-collection algorithms on even the simplest of experiments.
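As a sketch of the kind of data-collection loop this enables, the following assumes an Arduino streaming one comma-separated line of sensor readings per sample over USB serial (the device path and message format are illustrative assumptions), logged by a Raspberry Pi using the widely available pyserial package:

```python
# Minimal data-collection sketch: a Raspberry Pi logs sensor values
# streamed by an Arduino over USB serial. Assumes the Arduino prints
# one comma-separated line per sample (e.g., "23.4,1.013") and that
# it appears as /dev/ttyACM0 -- both assumed for illustration.
import csv
import time

import serial  # pyserial

PORT = "/dev/ttyACM0"  # assumed device path
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=2) as link, \
        open("run_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        line = link.readline().decode("ascii", errors="replace").strip()
        if not line:
            continue  # read timed out; try again
        # Prefix each sample with a UNIX timestamp for later analysis.
        writer.writerow([f"{time.time():.3f}"] + line.split(","))
        f.flush()  # keep the log current in case the run is interrupted
```

A few dozen lines like these replace what once required a dedicated data-acquisition rack and custom vendor software.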

The final recent development opening new doors to reaction engineers is big data and cloud computing. Many of the trends we look for in everyday work are obscured by highly nonlinear relationships, and traditionally extracting them required weeks or months of tedious calculation to derive governing equations from first principles. Today, using neural networks and big data, it has become possible to derive models in significantly less time. At the heart of this revolution lies big data: the ability to store very large amounts of sensor readings and other parameters in the cloud. For example, a system with a thousand sensors, each logged once a second for a year to a remote cloud store such as Amazon S3, would cost only a few dollars a month to store. Thirty years ago, storing those same few hundred gigabytes would have required on the order of a hundred thousand floppy disks. Collecting this data enables scientists and engineers to write algorithms that search it for trends, and it becomes possible to use convolutional neural networks to fit models to the data, enhancing both process control and optimization. Overall, the collection and analysis of big data in chemical reaction engineering opens new doors for deriving and validating the models that describe our processes.
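The storage figure is easy to check with back-of-envelope arithmetic; the sketch below assumes 8-byte readings and a representative S3 Standard list price of $0.023 per GB-month, so the result is an order-of-magnitude estimate rather than a quote:

```python
# Back-of-envelope check of the storage claim: 1,000 sensors sampled
# once a second for a year, each reading stored as an 8-byte float.
SENSORS = 1_000
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 seconds
BYTES_PER_READING = 8               # one double-precision value (assumed)

total_bytes = SENSORS * SECONDS_PER_YEAR * BYTES_PER_READING
total_gb = total_bytes / 1e9        # ~252 GB per year

S3_PRICE_PER_GB_MONTH = 0.023       # assumed S3 Standard list price, USD
monthly_cost = total_gb * S3_PRICE_PER_GB_MONTH

print(f"{total_gb:.0f} GB -> ~${monthly_cost:.2f}/month")
# prints: 252 GB -> ~$5.80/month
```

Even before compression or downsampling, a year of dense multi-sensor data costs less to store than a single box of floppy disks once did.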